Ryder et al 2010 defining best available science

Ryder, D.S., M. Tomlinson, B. Gawne, and G.E. Likens. 2010. Defining and using 'best available science': a policy conundrum for the management of aquatic ecosystems. Marine and Freshwater Research 61: 821-828.

Notes

The authors make the following claims:

  • BAS is intended to engender credibility, but in the Australian policy documents reviewed, "none of these references or uses includes an explicit definition of the properties and standards of BAS, or guidance on its practical application in the decision making"
  • BAS hinges on three questions: what counts as science, how do we determine the quality of science, and what makes science available?
    • Science is self-correcting over time, but generates inaccurate and imprecise information in the interim, and thus may be mismatched to policy practice.
    • Doremus 2004 differentiates between 1) research science, 2) courtroom science, and 3) regulatory science.
    • Bisbal 2002 differentiates between 1) scientific information (emerging from observation and hypothesis testing), 2) suggestive information (empirically rich observations that do not test mechanisms), and 3) supplementary information, which is informed or expert opinion.
  • RISK, UNCERTAINTY, and COMMUNICATION are key challenges to successful integrative management, which depends on:
    • understanding of "discipline-based knowledge structures"
    • articulation of uncertainty and risk
    • engagement and trust among the generators and users of information
  • Using BAS requires the "skills and courage to apply the best science available and not wait for the best science possible."
  • Social processes in decision making can limit the use of scientific evidence, in part because policy makers and scientists are different groups.
  • This is why explicit rules to guide critical evaluation of scientific evidence are so important.
  • Brennan et al. 2003 suggest that information should be collected using established protocols, properly analyzed, and peer-reviewed before public release.
  • Availability is driven by information providers rather than policy makers.
  • UNCERTAINTY & RISK
    • "Direct relevance" is identified as a characteristic.
    • Alternative explanations are the norm.
    • "It is crucial that assumptions and inherent uncertainties are made explicit and are captured in any further reporting or interpretation so that the confidence levels and risks associated with knowledge gaps, generalisations and extrapolations are clearly identified." p.824
    • The authors do not strongly resolve or recommend how to structure uncertainty, but mention "Bayesian statistics" and "information-gap theory".
  • COMMUNICATION
    • "A recurrent theme in this literature is that the success of interdisciplinary science hinges on effective communication. Communication in interdisciplinary programsis not a simple or trivial objective."
    • "Investment in communication creates shared ownership and support for multiple program goals by all project participants"
  • PROPOSED PRINCIPLES (quoted from p.826)
    • Create and support a cooperative process that enables interdisciplinary teams to produce shared knowledge that meets the needs of all users
    • Articulate a clear management or policy question and translate it into research questions and supporting hypotheses
    • Define the knowledge needs in terms of its properties (scientific, supporting and indicative)
    • Create an a priori and case-specific hierarchy of ‘best’ information (well-established theories, peer-reviewed published and unpublished literature, expert opinion)
    • Develop study designs and analyses that are appropriate for the hypotheses being tested
    • Clearly state assumptions, define terms, and identify uncertainties and associated risks
    • Build in revision as uncertainties, limitations and inconsistencies are addressed over time
    • Ensure a record exists of the decision-making process
    • Communicate research methods, supporting rationale, results and management applications via the peer-reviewed literature and through reports or other formats as preferred by the management and policy audience
  • "We suggest the use of a ‘best evidence synthesis’ (sensu Slavin 1995), where an interdisciplinary group addresses a defined question using the principles of BAS outlined above. This synthesis should outline a ‘bigger picture’, facilitating an understanding of discipline-based knowledge structures (sensu Benda et al. 2002) that articulates uncertainty and risk about the information and its potential for unexpected or inequitable effects, ensures an objective and transparent process, and promotes engagement and trust among the generators and users of information, and the general community that must live in landscapes affected by these decisions."