Wednesday, August 19, 2009

Chaddie Sheperd Meets the Ethics

Today was Day 1 of "Professional Ethics Training", which even sounds eerily suspicious. I'm not sure how you can be trained to be ethical. No two ethical dilemmas that inevitably come up in any job in any field are the same, and reactions to these dilemmas are commonly unpredictable, even to the very people involved in them. I'm also not convinced that one can be trained to recognize these situations, or perhaps more correctly, that one *needs* to be trained to recognize these situations. Most people involved in ethical dilemmas immediately and instinctively recognize them as such, even if they would not label them that way.

People have a remarkable capacity to act against their better judgment in situations that seem overwhelmingly out of control. In yesterday's football rant, I suggested that most people tend to become armchair coaches after a big loss, as if they somehow would have done a better job. I further suggested that these same people have no idea of the pressures and situational complications that arise in big-game situations, making their criticisms arrogant and naive.

In the training session today, we were introduced to a psychological study demonstrating a pattern of behavior that we know is prevalent in humans, whether we'd like to admit it or not. It is the so-called Milgram Study, in which each subject, playing the role of the "teacher", was told by an experimenter to deliver increasingly high-voltage shocks to a "learner" whenever the learner answered a question incorrectly. Of course, the "learner" was part of the study, and the real person being tested was the teacher. Nearly two-thirds of the teachers continued to shock the learner at higher and higher voltages despite the increasingly audible (though completely fake) yelps of pain. When a teacher complained about the moral issues of the study, the experimenter would simply repeat that the study depended on the teacher's continued participation and that the experimenter would take full responsibility. So the teachers continued.

This classic and very telling study illustrates how humans tend to act under authority. But in video recordings of the Milgram study (one of which was shown during the training session today), the subjects nearly always vocally objected at some point during the study to the troubling indications that the "learner" was in pain. In other words, despite the subject's failure to act ethically in an ethically questionable situation, the subject nearly always recognized the situation as ethically questionable.

Guilt is a powerful ally in situations in which someone is tempted to do the "unethical" thing. For example, if I were to accidentally run into another person's vehicle in a parking lot with no witnesses around, I would have a difficult time leaving the scene without leaving my insurance information for the other driver. Not because of any selflessness that I possess, but because the sense of guilt that would envelop me in the following days would be too much for me to take. This, by the way, is commonly overlooked by religious believers who can't understand the "moral compass" that many nonreligious people possess. Selfishness and the imperative for the social good commonly align to produce ethical behavior, despite other selfish alternatives and the lack of supernatural "punishment".

I happen to believe that people *always* act selfishly, no matter the circumstance. This may be an overgeneralization, and it is certainly biased by my own perspective, but listening to heroic story after heroic story, there is nearly always a personal element suggesting that the hero performs the heroic, seemingly unselfish action for a very selfish emotional reason. People often choose to do the unselfish thing because they have to live with whatever choice they make.

This brings me back to the "training" aspect of ethics. As a scientist, I am obligated to advance the field, and this very powerful "belief system" outweighs any ethical dilemma that may come up. For example, if it turns out that something I do and report in meteorological journals is false or "no longer correct", it is my personal responsibility (indeed, obligation) to report it as such as soon as I possibly can. No matter how bad I may look as a result of the correction, the overwhelming guilt of "corrupting" or "impeding the forward progress" of the science outweighs any damage to how I am perceived. As a wise adviser recently told me: "My loyalties are to myself, and to the science." Words to live by, since the two are inseparable in my "belief system".

Perhaps the more important thing to recognize is how ethics is omnipresent in virtually everything you do. It comes up in meteorology in many ways: authorship issues, what to include and not include in publications, how financial and time constraints on funding affect the scientific bottom line, and so on. Maybe most importantly, it is there in the personal relationships you have with your peers. Is it harder or easier to review a friend's scientific work, even under the cover of anonymity? (And is anonymity the best way to peer review? I think this is a very debatable issue, at the very least!) Can people raise uncomfortable concerns in a work environment they are otherwise happy with? How does an ethical dilemma affect those you work with, and how do they see it?

All of these topics were brought up in the training session today, but these are things that, I believe, good scientists (and good employees in general) already think about constantly. Not doing so is what is truly unethical.