Working draft:

John Mueller

"Problematic" past research: What can we learn? Anything? Nothing?

The "problem" cases that are widely cited are all subject to the same question: Would what we now do have prevented this? If not, and I think that is the conclusion in almost all cases, then let's stop using them to create a moral panic for over-controlling social sciences research. These historical "problem" cases are being used as opportunistic subterfuges to introduce irrelevant regulations into the review process, especially in the social sciences.

In justifying the extensive screening of social sciences research proposals by research ethics boards (REBs; IRBs in the States), certain past research projects are repeatedly cited. When these horror stories are invoked, the implication is that they would have been prevented by the sort of screening now imposed on campus. However, the argument is badly strained, and in many cases outright flawed. I will note a few of these infamous cases here, and then briefly comment on why each case does not justify present ethics reviews in social science research.

In the past, I did not worry much about these cases, in part because I have never said that social sciences research should be completely exempt from review, just that such research proposals should be accorded the proportionate and expedited review that most of them warrant. It is clear though that many people are trying to justify excessive reviews and superfluous regulations because of these "worst cases." Therefore, some challenge seems in order: maybe we could learn from history here, but that's most likely if we do it rationally rather than emotionally. Are we really trying to prevent these problems from recurring, in which case we are failing badly, or is there another agenda?


BACKGROUND. The centerpiece of these bogus claims is the implication that the atrocities prosecuted at the Nuremberg trials (1945-1949) would have been prevented if campus research ethics boards had existed. This is just nonsense, but it is slipped in regularly and all too often just passively accepted as a justification. Sorry, but neither mad scientists nor criminals are going to bother submitting research proposals to the campus committee, so the modern rigmarole cannot be justified on the grounds that it prevents such things.

I realize in saying this that a cry will no doubt go out that I approve of these things, which is of course merely another symptom of the logically challenged world we find ourselves in these days. Read carefully: it isn't a question of approval, just of whether our current rituals would have prevented them. To avoid the appearance of endorsing these war crimes, we too often let ourselves be silently guilted into accepting the Nuremberg trials as a justification for all that we do today. Just the result the moral panic strategy seeks: self-censorship.


Some, actually most, of the horror stories cited are MEDICAL research, and I simply don't feel inclined to take any ownership by defending these. I think they speak volumes about the hubris of the medical community, but their relevance in justifying overkill in social science reviews is just nil. Here are a few problem cases. These are widely used by universities in workshops "educating" faculty about research ethics. The workshops present these cases as the reasons we have IRBs, but they say absolutely nothing to demonstrate that the historical problems would have been prevented by the present procedures, even in medical research.

A study by Beecher (New England J. Medicine, 1966, 274, 1354-1360) documented some 22 such ethically questionable studies published in high-quality journals, by people at prestigious institutions. How many more didn't make it through peer review? The Declaration of Helsinki (1964) was directed at such hubris in medical research. However, note that it did not stop the Tuskegee project.
<soapbox>
These unfortunate cases are notable for social sciences research for one reason only: They are good examples of why research ethics for the entire campus should NOT be modeled on the ethical problems of the Medical School: Medical research has problems of "arrogance" and profits that are at least an order of magnitude worse than elsewhere (IMHO). It isn't exactly that non-medical researchers are more principled, but that "life saving" is noble and profits are heady temptations -- and both are powerful alibis after the fact. Stir in the contemporary pressure on faculty by university administrations to get grants and contracts, and it becomes a volatile mix indeed. As psychology goes, this isn't rocket science, and a great deal of the problem can be solved without complex government regulation.

Furthermore, because much medical research can be lethal, it is hard to argue against substantial screening of such research. One can thus justify the Helsinki Declaration and the Belmont Report (1979) for medical research, but their medical focus is at best irrelevant, and most often counter-productive, for social science research.

Finally, note that some of these projects were conducted under government auspices (e.g., Tuskegee), as were the Nazi atrocities. And CIA projects (and military research) are not going to be affected by research ethics reviews, so the existence of IRBs can't be justified by citing those abuses either. Mad scientists and criminals are not going to submit the forms, so the process can't prevent the abuse. Deal with it. Isn't there something that makes you uneasy about trusting government to fix this? Why feel safe now? Who is watching the new regulators? (After all, these are the folks who assured us that the income tax was just a temporary war tax, and ditto the withholding tax.)
</soapbox>


One past project is interesting in that it seems to speak to issues in interdisciplinary research, such as differing values and differing conventional wisdom, as much as or more than matters of safety. It is also interesting because it was never published, and therefore much of what is "known" about it is hearsay. Actually, this is not uncommon: it is often a caricature of the problem case, an urban legend, that is used to create the moral panic for greater control.


Another past project that has captured attention is described as "exposing children to lead hazards." Except that's not exactly the "whole truth," and in the end this case seems to say more about the legal profession than about research ethics.


A recent project in asthma research resulted in a death, apparently due to the treatment.


Some high-profile incidents involved psychological research; the usual suspects are listed below. Such research doesn't involve "lethal" treatments, but one does not need to die for there to be serious questions, even in medical research. However, what we need, and what we have lost, is some perspective on "everyday psychological discomfort." In other words, there must be a psychological counterpart to what medical treatments describe as side effects: "May cause nausea, upset stomach, dizziness, headaches, itchiness, etc." Aren't we as robust psychologically?

There is no such thing as "zero risk" and no need to pursue it, physically or psychologically. Some people have assumed any psychological reaction is harmful and cause for vetoing a project. We need to realize that two aspirin and a good night's sleep do wonders for most psychological problems, just as for medical side effects. And that other medical standby, "In case these reactions are severe suspend taking the medication," surely applies psychologically as well. Unless, of course, one has a vested interest in creating and maintaining "victims"?

  1. Milgram's obedience research (1961-1962)
  2. Zimbardo's prison study (1971)
  3. Rosenhan's "On being sane in insane places" (1973)
  4. Schachter and Singer, Suproxin (1962)
  5. Murray and the Unabomber (1959-1962)
  6. Zillmann and Bryant, pornography (1980s)


The Alliance for Human Research Protection (AHRP) tracks the news for reports of alleged problems in research. I'm glad someone is doing this, but a lot of the commentary at AHRP seems overdone and alarmist.

Who should be involved in these "problem cases"? The law instead of an IRB? Put the offenders in jail?

There is a difference between learning from the past and living in the past. Hindsight is typically pretty good, but you lose that advantage if you are just nursing some ideology, such as "control is good."

Finally, having denied the value of the Nuremberg trials as a bogeyman for modern IRB practices, I actually do think there are two things of significance in those trials, things the ethics industry overlooks:
  1. It was established that "just following orders" was not an acceptable defense;
  2. Somehow I don't think "protecting the institution from risk" would have been favorably received by the judges either.


Personal views, not those of my employer, etc.