This just in…Data (from a recent study) tend to confirm scientists are human too…

Hopefully blog readers will try a little experiment with me today. As you know, I read The Economist, and I’m always quite a bit behind…I’m going to summarize an article in the January 21st issue for you. As you read what follows here, please keep track of your emotions. How is reading this making you feel?

Here goes…

Entitled Ignorance is bliss, the article cites the following paper: Dror, I. E. & Hampikian, G. (2011). Subjectivity and bias in forensic DNA mixture interpretation. Science & Justice, 51 (4), 204-208. The article is worth reading in its entirety, but here’s the gist as I understand it. Turns out that DNA evidence isn’t always as conclusive as we might hope, or may have been led to believe. [Sometimes, a DNA sample will include an admixture of DNA from two or more human beings. Other things can go wrong.] In practice, therefore, DNA matches require a bit of interpretation. In their paper, Dror and Hampikian studied results from a criminal case in Georgia. They found that the two original forensic analysts on the case, who were familiar with its particulars, concluded that one of several suspects could not be excluded from involvement based on the DNA results. Dror and Hampikian then gave the same DNA evidence to seventeen analysts who knew nothing about the crime and the context. Only one of the seventeen thought the suspect couldn’t be excluded from involvement. Four abstained. The other twelve excluded him. For Dror and Hampikian, the takeaway was that cognitive bias can creep in; forensics experts can get too close to the cases they’re working on. The Economist suggested that considering the stakes (imprisonment or freedom for criminal suspects; justice or lack of closure for crime victims) such analyses should perhaps be “blind,” as is the practice in drug clinical trials.

How’s your emotional state at this point? Are you unmoved? Perhaps even wondering why you’ve bothered to read this far? Then you’ve probably got lots of company.

But suppose we bring this story a little closer to home. Suppose, and these are hypotheticals here, someone had studied the analysis of water quality samples from the New York Bight, and found that ocean chemists who knew the origin of the samples found statistically significant higher concentrations of pollutants than those who had been told the samples were from further out to sea. Suppose someone sampled the “accuracy” of weather forecasts (using any yardstick), as a function of distance from a forecast office, and found a “bull’s eye effect,” namely that accuracy decreased with increasing distance from the office putting out the forecast. Suppose someone found that climate scientists…(I don’t even have to finish the sentence, do I?).

As these anecdotes hit closer to home, they start to move the emotional needle, don’t they?

We all know that every field of science has generally progressed over time, through a series of successes and advances, but each field has a parallel history of instances where our science encounters our humanity in this way. And each occurrence can be hugely emotional for those involved.

It’s true for our friends in forensics. The Economist article states that “forensic scientists are beginning to accept that cognitive bias exists, but there is a lot of resistance to the idea, because examiners take the criticism personally and feel they are being accused of doing bad science.”

In another context, that would be us, wouldn’t it? Or policymakers, who feel they’re being accused of bad policy formulation, or emergency managers, or city planners, or any other professional group…

You and I are human. We can resist our tendency to bias, we can try to override it, but the evidence is that at some level we often…maybe always…fail…maybe not by a lot, but by a little bit. And the evidence is that we get emotional when someone calls us on it.

Most of the time, it seems our human tendency is to hope the subject won’t come up…or at least that we won’t be the target.

But there’s another policy we might adopt…that is…to be a little slower to accuse when we see these failings in someone else, or in some other group…or at least a little more muted in our criticism. And by the same token, we can be the first to confess our human failings, so that anyone else calling attention to them is only piling on, confirming what we know already to be true. We can also see ourselves as part of a larger enterprise, working to improve the entire human condition, not just our self-interest.

If we look around, we see people behaving this way all around us, making the world a more functional place, and making life more pleasant while they’re at it.

Shall we join them?
