Agricultural Education and Communication

Ethics of Expertise

A project by Dr. Sadie Hundemer, funded through the Archer Early Career Seed Grant

Assessing and developing an ethics of expertise

Scientists are expected to attend to two fundamental responsibilities: objectivity and public impact. They are expected to be objective so that their work can be broadly trusted. However, to be relevant, they are also expected to conduct research that serves the public interest[1]. While objectivity suggests a need for scientists to be value-neutral, public impact requires them to directly engage values. Thus, there can be tension between these responsibilities, such that attending to one can compromise the other[2].

When surveyed, scientists report that they prioritize both objectivity and public impact, but such assessments typically examine each responsibility independently[3]. They do not consider the tradeoffs between objectivity and public impact that scientific work requires. As a result, we do not know how scientists navigate these tradeoffs; in all likelihood, scientists themselves do not know. Given the scarcity of training on the topic[4], scientists are likely making major value decisions without realizing they are doing so.

At a time when trust in science is threatened and science is increasingly central to public decision making, it is imperative that scientists make conscious, defensible decisions about the application of values in their work. Failure to do so risks scientists’ professional credibility and the impartiality of the guidance they provide[5]. Our research group is identifying and analyzing how scientists perceive and act upon their responsibilities to objectivity and public impact – their ethics of expertise. This research will provide scientists with tools to examine their ethical perspectives and provide institutions with the information needed to prepare scientists for ethical decision making.

The tension between the ideals of impartiality and public impact

Science is often described as objective, or at least as more objective than other ways of knowing, and that perception of objectivity grants scientific disciplines a privileged (if diminishing) position in society. Yet science is not and cannot be value-neutral because it is an outcome of human endeavor, situated in the lives of individuals and the cultures of institutions. To be truly objective, science would have to be “viewed from nowhere,” that is, from no particular perspective[6]. But scientific observations are always oriented in some manner. Scientists working within different paradigms or drawing on different theories observe the same things in different ways[7]. As a result, seemingly benign acts such as the design of an experiment or the choice of a statistical method can affect scientific findings.

A more broadly recognized way in which science is not value-neutral, and a basis on which the legitimacy of science is often challenged, is the intrusion of scientists’ personal values, and the values of the institutions that support them, into scientific work[8]. For instance, the selection of a research project is a value choice. It suggests that the scientist or institution believes the topic has value and that the actions that could stem from the research are desirable. Values emerge in the selection of stakeholder partners and collaborators. Values are conveyed in the advice scientists give and to whom they give it[9]. Values are also apparent in the words scientists use – what they describe as degradation or improvement reveals the outcomes they favor[10].

Although values can compromise objectivity, society does not want scientists to be purely neutral investigators. Governmental funding agencies require scientists to directly consider public outcomes – hence the requirement for “broader impacts.” It is no longer viewed as sufficient for science to produce “reliable” knowledge; it is also seen as the responsibility of science to produce “socially robust” knowledge[11]. Science must attend to public concerns for the results to have societal usefulness. But what public concerns deserve attention? How should those topics be approached? And how should the results be interpreted to inform public decision making? Fulfilling scientists’ social responsibility requires the application of subjective values. Thus, scientists are called upon to attend to both objectivity and public impact – two ideals that are often in tension[12].

OUR RESEARCH

Our research group is laying the empirical groundwork to build professional discourse on the ethics of expertise. As a long-term result of this work, scientists will be better prepared to create public impacts that align with their ethical philosophies.

Impacts for advancing science:

  • Document scientists’ ethics of expertise and generate tools that support more nuanced research.
  • Support scientists in identifying and fulfilling their responsibilities to objectivity and public impact.
  • Protect trust in science and professional credibility.
  • Extend value framing research from “What can we do?” to “What should we do?” and from the domain of social scientists to the domain of all scientists.
  • Generate an empirical basis for the development of programs that help scientists engage the normative challenges of their work.


[1] Gibbons, “Science’s New Social Contract with Society”; Ladd et al., “The ‘How’ and ‘Whys’ of Research”; Schnittker, “The Double Helix and Double-Edged Sword.”

[2] Donner, “Finding Your Place on the Science – Advocacy Continuum”; Ladd et al., “The ‘How’ and ‘Whys’ of Research”; Nisbet, “The Ethics of Framing Science.”

[3] Ladd et al., “The ‘How’ and ‘Whys’ of Research”; Wyndham et al., “The Social Responsibilities of Scientists and Engineers: A View from Within.”

[4] Bielefeldt et al., “Ethics Education of Undergraduate and Graduate Students in Environmental Engineering and Related Disciplines”; Pennock and O’Rourke, “Developing a Scientific Virtue-Based Approach to Science Ethics Training.”

[5] Campbell and Kay, “Solution Aversion: On the Relation between Ideology and Motivated Disbelief”; Lewandowsky and Oberauer, “Motivated Rejection of Science”; Rosenbaum, Environmental Politics and Policy.

[6] Haraway, “Situated Knowledges.”

[7] Kuhn, The Structure of Scientific Revolutions.

[8] Lewenstein, “Science Controversies: Can the Science of Science Communication Provide Management Guidance or Only Analysis?”; Rosenbaum, Environmental Politics and Policy.

[9] Nisbet, “The Ethics of Framing Science.”

[10] Lackey, “Science, Scientists, and Policy Advocacy.”

[11] Gibbons, “Science’s New Social Contract with Society”; Ladd et al., “The ‘How’ and ‘Whys’ of Research”; Schnittker, “The Double Helix and Double-Edged Sword.”

[12] Donner, “Finding Your Place on the Science – Advocacy Continuum”; Ladd et al., “The ‘How’ and ‘Whys’ of Research”; Nisbet, “The Ethics of Framing Science.”
