Making science more moral

A bit over a month ago, Raj Pandya, who directs AGU’s Thriving Earth Exchange, shared a blog post with me that he’d published on the website of AGU’s Ethics and Equity Center (AGU’s new platform for advancing a positive work climate in science, with tools and resources for researchers at all career stages). What a great set of thoughts! Thanks, Raj, for giving permission to reprint it here. And thanks as well for your decades of leadership by example – doing your bit every day to make science (and its applications) more moral.

Photo courtesy of Nathan Fried-Lipski photography

By Raj Pandya, Director, Thriving Earth Exchange

There is a neighborhood near my grandfather’s house in Mumbai that is one of the largest unplanned urban neighborhoods in the world—what is more commonly, but not nicely, called a slum. It’s a densely packed region of narrow dirt lanes where whole families live in tiny rooms, sometimes in houses constructed out of salvaged materials, without trash collection, running water, or sewers. The people who live there have no property rights—no assurance that everything they have won’t be taken from them when a decision is made to redevelop or ‘beautify’ the area.

Running through that neighborhood is a pipe that delivers clean water to my grandfather’s affluent neighborhood. There are no public taps in the pipe, so people in the slum can’t access the water. The pipe runs alongside a former stream, now a concrete-walled flood channel, that is choked with debris and polluted with human waste.

It might be comforting to think this kind of thing doesn’t happen in the US.

It does.

In communities across the US, the Thriving Earth Exchange has worked with people tired of being on the receiving end of noise, waste, and hazards: the roadway that fractures their neighborhood, the factory that spews noxious fumes, or the river that carries contaminants and floods into their homes. Many are communities of color and poor communities, who are disproportionately likely to suffer polluted air, water, and soil. Many of these communities feel like science is being used against them—large corporations, enthusiastic developers, and inattentive local leaders are using science to downplay the impact on neighborhoods—even though people in the neighborhood can smell the fumes, see the emissions, and watch the floods get worse. Often, these neighborhoods have been historically and systematically denied access to the resources they need to protect themselves from natural hazards. Sometimes, they can’t even prove they own the land because there was no sanctioned system to register deeds.

This disproportionate environmental burden imposed on poor communities, seen the world over, raises questions about the morality of science and technology. Why is it okay to run a closed pipe through a poor neighborhood to provide water for rich neighborhoods? Why do we accept the destruction of a natural ecosystem and rely on technological solutions? Why is it okay to magnify the vulnerability of poor neighborhoods by turning a floodplain into a channel or building a factory nearby?

For too long, I think, we scientists have been able to dodge moral responsibility for impacts like this. We dodge by appealing to the intrinsic morality of advancing knowledge, pretending that science is objective, separating what we discover from how it is used, partitioning science from other ways of knowing (and often holding it above other ways of knowing), and gatekeeping who can be part of science and who gets to ask scientific questions [1]. In a book of sermons titled Strength to Love, Dr. Martin Luther King Jr. wrote that, as we try to determine and act on what it means to live a moral life, “our scientific power has outrun our spiritual power.” How can we bring the two back into balance?

  • We need to pay attention to the applications and impacts of our work, especially negative ones. Many of the horrifying examples of immoral science are associated with a myopic focus on discovery and a failure to consider impact, especially on the vulnerable. The Willowbrook experiment, in which some institutionalized children were deliberately infected with hepatitis to study the disease’s transmission, offers one example. Even unintentional impacts of science and technology—the ozone hole, climate change, and antibiotic resistance—represent a failure to fully consider and invite broad deliberation about the uses of new discoveries. Often, this failure falls hardest on the vulnerable and the excluded.
  • We need to accept, perhaps even embrace, the imperfect nature of science. When we work in our labs, write our codes, or study our subjects, the tentative nature of understanding is a foundational principle. We need to apply that same idea to the practice of science. You can’t get better if you think you’re already perfect.
  • We need to welcome diverse people, voices, perspectives, and even epistemologies. We have to reach for perspectives and ideas from beyond science—from ethics, spirituality, humanities, religion, and art. We should invite experience and viewpoints from beyond our scientific ways of thinking. Science, alone, isn’t enough to address the moral challenges it creates. Our science has to be a participant in a larger civic discourse about where we want to go, and that discourse needs to guide what science we do, where we do it, and how we do it.
  • We need to acknowledge our values and act on shared values. Too often, we imagine scientists are purely rational actors who make decisions based on evidence. But evidence isn’t the only guiding force for scientists, nor should it be. Equity, for example, is worth pursuing because it is the right thing to do. We protect vulnerable people because of shared values, not because of a scientific argument. Instead of pursuing an ideal of objectivity, we need to recognize that values will help guide science and think carefully about what we want those values to be.[2]

Community science—when scientists and communities, especially historically marginalized communities, work together to design and do science that advances community priorities—is a way to accomplish these things. In community science, scientists and communities work together to decide what science to do, how to do it, and how that science will be used. Working together creates a space to draw on multiple kinds of experience and knowledge. In designing for real-world impact, community science necessarily grapples with ethical and moral questions about who uses science and to what ends. By respectfully partnering with diverse communities, we access a broader range of values, ask questions that haven’t been asked before, and avoid the inadvertent use of science to enhance inequity. Most importantly, by designing science with communities, community science allows us to grapple with moral issues, consider values, and weigh options in a much larger and more inclusive context.

[1] Peer review has many advantages, but if all the peers are white, middle class, and privileged, it is hard to see how the concerns and issues of minority and disadvantaged populations influence a research agenda. That is gatekeeping.

[2] We also need to respect differing values – some problems with science arise from assumptions scientists make about the superiority of scientific values.
