Practitioners and policymakers working in environmental arenas often make decisions that have large impacts on ecosystems and the benefits that people get from nature. Using existing evidence about the effectiveness of conservation interventions can maximize the success of environmental management and help avoid the detrimental outcomes of actions that have proven unsuccessful. Governments have frequently made scientifically‐unsound decisions regarding natural resource exploitation (Carroll et al., 2017; Westwood, Walsh, & Gibbs, 2017). The ensuing controversy means that some governments are now putting in place mechanisms to ensure that evidence can be used (Westwood et al., 2017). Evidence‐informed environmental management has received increasing attention in recent decades (Cook, Nichols, Webb, Fuller, & Richards, 2017; Walsh, Dicks, Raymond, & Sutherland, 2019), and several barriers to evidence use have been identified (Lemieux, Groulx, Bocking, & Beechey, 2018; Rose et al., 2018; Walsh et al., 2019). One key barrier is the time‐consuming process of locating and accessing primary literature and summarizing the knowledge effectively (Li & Zhao, 2015; Pullin, Knight, Stone, & Charman, 2004). Evidence syntheses—research papers that critically review, collate, and summarize available knowledge on a specific topic—are designed to lower this barrier.
A primary advantage of evidence syntheses is that they highlight and take into account the risk of bias, in contrast to individual studies or less structured discussions of the existing literature (Boyd, 2013). The critical appraisal of findings is an important stage in many forms of evidence synthesis (e.g., systematic reviews), and a wide range of available methods can be adapted to different contexts (Pullin et al., 2016), with each synthesis type having varying strengths and weaknesses (Cook et al., 2017). For example, systematic reviews aim to be rigorous and replicable by following a standardized methodology (Collaboration for Environmental Evidence, 2018; Johnson & Hennessy, 2019). If high confidence is needed to evaluate a specific hypothesis, such as the likely effectiveness of an intervention, then systematic reviews are often the most suitable synthesis method (Haddaway & Pullin, 2014). However, they are resource‐intensive, and environmental questions often suffer from a lack of studies that meet the criteria required for inclusion. Conversely, rapid reviews are less reliable but more useful under narrow time constraints (Cook et al., 2017; Dicks et al., 2018; Webb et al., 2017). If the aim is to generate hypotheses by representing the evidence for relationships in a system, conceptual models are an applicable form of evidence synthesis (Cook et al., 2017). Ideally the results are communicated in an accessible way, targeted to decision‐makers (Bilotta, Milner, & Boyd, 2014). Indeed, research shows that practitioners are willing to reconsider their management choices when provided with a summary of the primary literature on a topic, and that these altered choices can increase the effectiveness of management (Walsh, Dicks, & Sutherland, 2015).
While evidence syntheses are often used to inform decisions in medical fields (Lavis et al., 2005; Mays, Pope, & Popay, 2005; Thomson, 2013), the use of formal evidence syntheses has only recently become widespread in environmental spheres, and it is unclear how they are perceived, how well they are understood, and how frequently they are used by environmental practitioners and decision‐makers (Bennett, 2016). Here we present the perceptions and attitudes of senior Canadian experts with extensive experience at the science‐policy interface regarding the current level of acceptance and use of evidence syntheses in environmental decision‐making, and which forms of synthesis they find most reliable.
Canada is a country with a developed economy and is rich in natural resources (Cooke et al., 2016). The second‐largest country in the world by landmass, it comprises 10 provinces and three territories with highly decentralized environmental regulation and management (Cooke et al., 2016). Legislation is enacted at the federal, provincial/territorial, Indigenous government, and municipal levels. The federal entities most relevant to natural resource exploitation and environmental management in Canada include Fisheries and Oceans Canada (DFO), Natural Resources Canada (NRCan), and Environment and Climate Change Canada (ECCC), along with Parks Canada, a federal agency within the ECCC portfolio. Civil society non‐governmental organizations (NGOs), such as environmental groups and hunting and fishing associations, also work to inform and shape natural resource and environmental management regimes.
From June to September 2019 we conducted a study on the capacity of environmental research to inform policy and practice in Canada. We interviewed 84 environmental experts across different sectors. We recruited participants who were currently employed in, or recently retired from, high‐level positions in environmental bodies in Canadian federal, territorial, or provincial governments, or in NGOs with an interest in advising or influencing policy. We aimed for a diversity of perspectives, with good representation across gender (36 female, 48 male), sector (federal, territorial or provincial government, NGO), and federal entities (DFO, ECCC, Parks Canada, NRCan). Participants had a range of experience from 8 to 30+ years in the field (not including years in academic graduate programs). Further details about sampling methods can be found in Nyboer et al. (2021, in review). This research was approved by the Carleton University Research Ethics Board (file #12486).
Interviews were semi‐structured, following a set of scripted questions but allowing for digressions (Longhurst, 2010; guide in Supporting Information S1). They comprised a mix of closed‐ended and open‐ended questions, generating both quantitative and qualitative responses. The questions covered a broad range of issues related to how research informs policy. One section (Q6) asked interviewees to identify the primary forms of evidence they used to inform policy decisions, with options including peer‐reviewed literature, primary non‐peer‐reviewed literature, raw data, theory, policy briefs, Indigenous knowledge, and evidence syntheses. Another section (Q14) focused specifically on evidence syntheses. We asked interviewees to select, from the list in Table 1, the top three evidence synthesis types that they (a) have confidence in and (b) actually use, and to explain their choices. They were not asked to describe each synthesis type, but the definitions below (Table 1; adapted from Cook et al., 2017) were available to them. We also put in place measures to reduce bias (see Supporting Information S2).
TABLE 1. Descriptions of the different evidence synthesis types shown to interviewees
Synthesis type | What it does
Causal criteria analysis | Tests specific cause‐effect hypotheses |
Conceptual models | Depicts the current knowledge of relationships within a system |
Narrative/traditional review | Provides a qualitative review of the literature on a particular topic |
Rapid review | Provides rapid evaluation of evidence to test a hypothesis |
Stand‐alone meta‐analysis | Combines multiple, comparable studies to test a hypothesis |
Summaries and synopses | Summarizes the evidence‐base for a broad management area |
Systematic map | Describes the state of knowledge for a particular topic |
Systematic review | Provides a transparent, repeatable, and quantitative evaluation of the evidence for a hypothesis |
Vote counting | Summarizes the evidence for and against a hypothesis |
None | Not familiar with any of these |
We conducted both qualitative and quantitative analyses. Responses to closed‐ended questions were entered in a database along with the participants' affiliation (i.e., federal government entity, provincial or territorial government, or NGO). We examined the frequency of responses for all respondents together and separated by sector. In addition, all 84 interview transcripts were analyzed for references to evidence syntheses, using a description‐focused coding strategy. The codes emerged in an iterative process, as we cycled repeatedly between reading, focused coding, reflection, and rereading (Adu, 2019; Tie, Birks, & Francis, 2019). The codebook we developed is available in Supporting Information S3. All coding was conducted by a single author (LTW) in NVivo 12 Pro (QSR International).
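To make the quantitative step concrete, the sketch below shows one way the frequency of each evidence type could be tallied overall and by sector from the closed‐ended responses. It is illustrative only: the respondent IDs, sectors, and selections are hypothetical, and the authors' actual analysis workflow is not described at this level of detail.

```python
# Illustrative sketch (hypothetical data): tallying how often each evidence
# type was selected in Q6, overall and by sector.
from collections import Counter, defaultdict

# Each record: (respondent ID, sector, evidence types selected)
responses = [
    ("R01", "federal", ["peer-reviewed literature", "evidence syntheses"]),
    ("R02", "NGO", ["raw data", "policy briefs", "evidence syntheses"]),
    ("R03", "provincial/territorial", ["peer-reviewed literature", "Indigenous knowledge"]),
]

overall = Counter()
by_sector = defaultdict(Counter)
for _, sector, types in responses:
    overall.update(types)
    by_sector[sector].update(types)

n = len(responses)
for evidence_type, count in overall.most_common():
    print(f"{evidence_type}: {count}/{n} respondents ({100 * count / n:.0f}%)")
for sector, counts in by_sector.items():
    print(sector, dict(counts))
```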
Due to time constraints during some of the interviews, only about 50% of respondents (n = 39–40) were able to complete the closed‐ended questions; however, all sectors were still represented (Table 2). Further, the qualitative analysis identified references to evidence syntheses in 42 of the transcripts. The gender split among these respondents was 18 female and 24 male.
TABLE 2. Numbers of participants from federal government departments, provincial and territorial governments, and NGOs, based on the closed‐ended questions (quantitative analysis) and transcript coding (qualitative analysis)
Overall sample sizes | Quantitative analysis: confidence | Quantitative analysis: use | Qualitative analysis
Federal government entity | 20 | 22 | 25 |
Parks Canada | 5 | 6 | 8 |
Environment and Climate Change Canada | 6 | 6 | 6
Fisheries and Oceans Canada | 6 | 6 | 7
Natural Resources Canada | 3 | 4 | 4
Provincial/territorial governments | 7 | 7 | 6 |
Alberta | 1 | 1 | 1 |
British Columbia | 1 | 1 | 1 |
Nunavut | 2 | 2 | 1 |
Northwest Territories | 2 | 1 | 2 |
Ontario | 1 | 1 | 1 |
Yukon | ‐ | 1 | ‐ |
NGOs | 12 | 11 | 11 |
Council of Canadian Academies | 1 | 1 | 1
Canadian Parks and Wilderness Society | 2 | 2 | 1
Evidence for Democracy | 1 | 1 | 1
Great Lakes Fishery Commission | ‐ | ‐ | 1
Nature United | 1 | 1 | 1
Trout Unlimited | 1 | 1 | 1
Waterton Biosphere Reserve Association | 1 | 1 | 1
Wildlife Conservation Society Canada | 1 | 1 | 2
World Wildlife Fund Canada | 1 | 1 | ‐
Yellowstone to Yukon Conservation Initiative | 2 | 2 | 1
Yukon Conservation Society | 1 | ‐ | 1
Total | 39 | 40 | 42 |
Interviewees generally viewed evidence syntheses positively and indicated enthusiasm for their use in decision‐making processes. Thirty‐eight percent (n = 29 of the 76 respondents who answered Q6) selected syntheses as one of the primary forms of evidence they used in their work, third after the primary peer‐reviewed and non‐peer‐reviewed literature (Figure 1). It is worth noting that any classification system for evidence types is by necessity imprecise, and evidence syntheses can incorporate many of the other types listed. Although it may seem problematic that evidence syntheses ranked third, this question asked specifically which evidence types are commonly used, rather than which are most valued. The qualitative analysis suggests that syntheses are highly trusted, but that a lack of availability hampers their actual use.
The qualitative analysis revealed a common theme underlying the favorable attitudes toward evidence syntheses. Many respondents viewed scientific results as more trustworthy if multiple studies came to the same conclusion, which can be demonstrated in a synthesis. It is also easier and quicker to read an evidence synthesis than multiple primary articles, offering a convenient way to keep up to date with the wider research context.
Sure I can do the job of trying to read a thousand studies and summarizing them but I would much prefer to have someone who I can trust from a good school who writes clearly that's reflecting on a lifetime of experience.—R03

But I think good science is maybe the science that can be described with multiple studies and we can actually see that these phenomenon or trends are observed in more than one case and that builds confidence that, yes, this is something that is real. Not to say that if you only see something once it's not real ‐ it probably calls for some more work. But building the evidence is maybe one way to be actually more comfortable or to have more confidence in its accuracy.—R04 (NGO)
I personally tend still towards the academic peer review with relative confidence or these bigger syntheses that have been done through collaborative networks of experts with relative confidence.—R18

One of our realities in government is just about anything we do is subject to judicial review. So there is a formal process. Everything major or very significant. I mean we usually have a briefing note system, essentially where the primary findings are summarized in a briefing note that's then sent through for decision with a summary of the analysis and conclusions that are leading to that decision that then gets signed off. So that's kind of our main place where we document the basis for our decisions.—R17 (Parks Canada)
We presented participants with nine different synthesis types (Table 1). There was some discrepancy between the top three synthesis types interviewees reported having confidence in and the three they actually used (Figure 2). Most respondents trusted meta‐analyses (64%) and systematic reviews (79%), which are often perceived to be among the most rigorous forms of evidence synthesis (Bilotta et al., 2014; Cook et al., 2017; Walsh et al., 2019). However, the use of these synthesis types was much lower, at only 33% and 55%, respectively. The most frequently used type was summaries and synopses (62%).
FIGURE 2. Percentage of respondents who selected each evidence synthesis type as one of the top three they (a) had confidence in and (b) used, both overall (bars) and by sector (shapes)
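As an illustration of how percentages like those in Figure 2 relate to the underlying responses, the sketch below computes the share of respondents selecting each synthesis type among their top three for confidence and for use, and the gap between the two. The respondent IDs and selections are hypothetical and are not the study data.

```python
# Minimal sketch (hypothetical data): deriving confidence and use percentages
# from respondents' top-three selections, plus the confidence-use gap.
from collections import Counter

confidence_picks = {  # respondent -> top three synthesis types they trust
    "R22": ["systematic review", "stand-alone meta-analysis", "systematic map"],
    "R08": ["systematic review", "summaries and synopses", "rapid review"],
    "R10": ["stand-alone meta-analysis", "summaries and synopses", "conceptual models"],
}
use_picks = {         # respondent -> top three synthesis types they actually use
    "R22": ["summaries and synopses", "systematic review", "narrative/traditional review"],
    "R08": ["narrative/traditional review", "summaries and synopses", "rapid review"],
    "R10": ["summaries and synopses", "conceptual models", "narrative/traditional review"],
}

def pct(picks):
    # Percentage of respondents who listed each type among their top three
    counts = Counter(t for triple in picks.values() for t in triple)
    return {t: 100 * c / len(picks) for t, c in counts.items()}

conf, use = pct(confidence_picks), pct(use_picks)
for synthesis_type in sorted(set(conf) | set(use)):
    gap = conf.get(synthesis_type, 0) - use.get(synthesis_type, 0)
    print(f"{synthesis_type}: confidence {conf.get(synthesis_type, 0):.0f}%, "
          f"use {use.get(synthesis_type, 0):.0f}%, gap {gap:+.0f} points")
```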
Now which ones would I have greater faith in? Definitely, probably definitely the systematic review, would be one of the biggest I like. And because that gives you a really good idea of repeatability. Make sure we're not missing anything and you've got the evaluation. You can see where the evidence comes from.—R22

It depends what's available. I think it's based on availability not based on confidence. So given that, it's probably more often, narrative traditional reviews. Probably more.—R08 (Parks Canada)
We almost never have a review at hand that we can use.—R38
Our research suggests that many professionals working at the science‐policy interface in the environmental realm do not need to be convinced of the merits of rigorous evidence syntheses. Syntheses are viewed as useful tools for senior decision‐makers and their advisors, saving time and avoiding the inaccuracies of rushed or superficial searches of the primary literature. Indeed, our data show that if high‐quality and relevant evidence syntheses exist, and are appropriately communicated and shared, they will be used and embraced. Although our sample comprised only highly experienced environmental policy experts, and may therefore overlook individuals with less favorable views of evidence syntheses, this is an encouraging result. Rather than building further support for the approach, proponents of evidence syntheses in environmental management should focus on increasing the number of syntheses, improving their timeliness and relevance to environmental issues, and making them more readily available to decision‐makers. Below we outline concrete pathways to achieve these goals.
As the volume of available scientific information continues to expand and environmental decisions become increasingly complex, the need for evidence syntheses is growing, and the capacity to deliver them and ensure their accessibility should follow suit. This reinforces the usefulness of dedicated evidence synthesis organizations like the Collaboration for Environmental Evidence (CEE;
Canada has both espoused evidence‐based government decision‐making and been challenged to adhere to it (Magnuson‐Ford & Gibbs, 2014), and the capacity for conducting environmental evidence syntheses within governing bodies is currently limited (Carroll et al., 2017). Dedicated centers, such as the Canadian Centre for Evidence‐Based Conservation (
Better coordination among synthesis groups and potential end‐users could help prioritize future research questions and ensure that outputs are used. A useful approach could be to conduct a horizon scan similar to those conducted for global conservation issues (e.g., Sutherland et al., 2019), but focused instead on common policy questions that would benefit from a synthesis. This could also include topics where there are many individual studies but few syntheses, pointing out the proverbial “low‐hanging fruit” in the world of evidence synthesis. Additionally, end‐user groups could publish priorities for evidence synthesis based on recurring management issues where staff struggle to find clear evidence to guide decisions (e.g.,
Finally, an important avenue for future research is to explore the root of the perception that appropriate evidence syntheses are not available. It was unclear from our interviews whether relevant syntheses are truly nonexistent, whether they exist but are spatially or temporally inexact (e.g., from a different geographic location), whether decision‐makers are simply unaware of their existence, or whether they are aware but unable to access them. Instances in which evidence syntheses have been successfully used to inform policy and practice should be analyzed to understand the characteristics of syntheses that lead to impact, for example, in terms of spatial scales or specificity versus generality. Such studies would clarify how decision‐makers assess the applicability of syntheses and could illuminate how to effectively mobilize evidence syntheses to ensure their uptake by practitioners and managers. Again, the work of dedicated evidence synthesis centers is crucial here. As well as supporting the production of syntheses and compiling them in accessible databases, these centers could provide guidance on dissemination. Syntheses are intended to be used by diverse audiences, so concise and effective communication is crucial. As one respondent summarized:
At the end of the day it's actually people in my position at that science policy interface, where I have to go in and I have eight slides and maybe five minutes to communicate someone's 10‐year academic findings. And so, synthesis maps, synthesis figures, synthesis images, I use those more than anything else.—R10
Funding was provided by NSERC via an NSE Grant to S. J. C. Additional support was provided by Carleton University via the Multidisciplinary Research Catalyst Fund.
There are no conflicts of interest to disclose.
Elizabeth A. Nyboer, Steven J. Cooke, and Nathan Young conceived the study; Steven J. Cooke, Elizabeth A. Nyboer, Nathan Young, Trina Rytwinski, and Jessica J. Taylor designed the methodology; John F. Lane, Elizabeth A. Nyboer, and Nathan Harron collected the data; Laura Thomas‐Walters analyzed the data and led the writing of the manuscript. All authors contributed critically to designing the interview tool and to writing the drafts. All authors gave final approval for publication.
Our research participants are identifiable from the qualitative data, so those data cannot be made public, but the quantitative survey data can be found in anonymized form at 10.6084/m9.figshare.14294558.
Abstract
Practitioners and policymakers working in environmental arenas make decisions that can have large impacts on ecosystems. Basing such decisions on high‐quality evidence about the effectiveness of different interventions can often maximize the success of policy and management. Accordingly, it is vital to understand how environmental professionals working at the science‐policy interface view and use different types of evidence, including evidence syntheses that collate and summarize available knowledge on a specific topic to save time for decision‐makers. We interviewed 84 senior environmental professionals in Canada working at the science‐policy interface to explore their confidence in, and use of, evidence syntheses within their organizations. Interviewees value evidence syntheses because they increase confidence in decision‐making, particularly for high‐profile or risky decisions. Despite this enthusiasm, the apparent lack of available syntheses for many environmental issues means that use can be limited and tends to be opportunistic. Our research suggests that if relevant, high‐quality evidence syntheses exist, they are likely to be used and embraced in decision‐making spheres. Therefore, efforts to increase the capacity for conducting evidence syntheses within government agencies, or to fund such activities through external bodies, have the potential to enable evidence‐based decision‐making.
1 Canadian Centre for Evidence‐Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada; Biological and Environmental Sciences, University of Stirling, Stirling, Scotland, UK
2 Canadian Centre for Evidence‐Based Conservation, Department of Biology and Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
3 School of Sociological and Anthropological Studies, University of Ottawa, Ottawa, Ontario, Canada
4 Institute of Environmental and Interdisciplinary Science, Carleton University, Ottawa, Ontario, Canada
5 School of Public Policy and Administration, Carleton University, Ottawa, Ontario, Canada
6 Conservation Science, Canadian Wildlife Federation, Kanata, Ontario, Canada
7 Conservation Science, Yellowstone to Yukon Conservation Initiative, Canmore, Alberta, Canada
8 Conservation Programs Branch, Parks Canada, Gatineau, Quebec, Canada
9 Wildlife Research Division, Environment and Climate Change Canada, National Wildlife Research Centre, Ottawa, Ontario, Canada
10 Great Lakes Laboratory for Fisheries and Aquatic Sciences, Fisheries and Oceans Canada, Sault Ste. Marie, Ontario, Canada
11 Environment and Biodiversity Sciences Branch, Fisheries and Oceans Canada, Ottawa, Ontario, Canada; Environmental Change and Governance Group, University of Waterloo, Waterloo, Ontario, Canada