CITI update, and broader thoughts on ethics training and behavioral research

Yesterday after I emailed the CITI program about their continuing misrepresentation of Milgram, I got an email back from them apologizing and promising a revision (deja vu?). This time, I have asked them to keep me updated on any changes.

I think this was an error of omission, not commission. But that doesn’t absolve them. CITI’s courses are required of all research personnel at my institution (and lots of others), and I assume they are being paid for their services. They need to get it right.

In my correspondence with CITI so far I have focused narrowly on Milgram. But there were other problems with the training program, some of which may be symptomatic of how the larger IRB-regulatory system views social and behavioral research.

Milgram, in more detail

The most obvious problem with CITI was that Milgram’s obedience studies were described as unethical. That’s just not defensible. In fact, Milgram’s research was replicated with IRB approval just a few years ago. Milgram may be relevant for a training course on ethics, but not as an exemplar of bad research.

One way that Milgram is relevant is because of what his research tells us about how subjects may respond to an experimenter’s authority. It would be reasonable to conclude, for example, that if a subject says they are considering quitting an experiment, researchers must be careful not to pressure the subject to stay enrolled. (Saying “the experiment requires that you continue” is pretty much out.)

Milgram also found that obedience varied as a function of a variety of factors, including the physical and psychological distance between the subject and the “experimenter.” For example, when instructions were delivered by telephone rather than in person, obedience rates were much lower. In a modern context, we might reasonably infer that coercion is less of a risk in Internet studies than in face-to-face lab experiments. Quitting an experiment is a lot easier if it’s just a matter of closing a browser window, rather than telling a stern experimenter in a white lab coat that you want to go home.

I’d also say that Milgram’s research is relevant to interactions among research personnel. It’s a good reminder of the responsibility that PIs bear for the actions of their research assistants and other staff. This doesn’t mean that front-line personnel cannot or should not be held responsible for their actions. But it is wise to recognize that a PI is in a position of authority, and to consider whether that authority is being used wisely.

Milgram’s obedience research is also specifically relevant for IRBs and other ethics regulators, who are in a position to prevent research from being carried out. Milgram has profoundly affected the way we understand the behavior of ordinary people in Nazi Germany (and yes, for that reason CITI’s misrepresentation is especially perverse). It has been enormously influential across a wide range of basic and applied research in social psychology and other behavioral sciences: Google Scholar reports over 4000 citations of Milgram’s book Obedience to Authority and over 2000 citations of the original academic paper. How much would have been lost if a skittish IRB had read over the protocol and insisted on watering it down or rejecting it outright?

Coarsening comparisons

Beyond just Milgram, though, there are other problems with the course — problems that may be emblematic of larger flaws in how the regulatory community thinks and talks about behavioral research.

Even if you accept that some of the other studies cited by CITI were unethical, a major problem is the coarsening comparison to Nazi medical experiments and the Tuskegee syphilis study. Gratuitous Nazi comparisons are so common as to be a running joke on the Internet. But in more serious terms, organizations like the ADL rightly object when public figures make inappropriate Nazi comparisons. Such comparisons do double damage: they diminish the real suffering of the Holocaust, and they overinflate whatever is being compared to it.

Let’s take what one commenter suggested is the (relatively) worst of the behavioral studies CITI cited: the Tearoom Trade study. A researcher observed men having sex with other men in a public bathroom, and in some cases he followed them home and posed as a health service interviewer to gather demographic data. Does that belong in the same category as performing transplantations without anesthesia, burning people with mustard gas, and a long list of other atrocities? Or even with knowingly denying medical treatment to men with syphilis?

Such comparisons cheapen the Holocaust and other atrocities. They also suggest grossly overblown stakes for regulating behavioral research. This mindset is by no means limited to CITI – references to the Nuremberg trials are de rigueur in almost every discussion of research ethics. In relation to what goes on in most social and behavioral studies, that’s absurd. That’s not to say that behavioral research cannot do real and deep harm (especially if we are talking about vulnerable populations). But ethics training ought to help researchers and regulators alike see the big picture and sharpen our perspective, not flatten it.

In favor of balance

Ideally, I’d like to see ethics training take on a broader scope. I’d love to see some discussion of reasonable limits on IRB reach. The CITI History and Ethics module states: “Highly motivated people tend to focus on their goals and may unintentionally overlook other implications or aspects of their work. No one can be totally objective about their own work.” In context, they are talking about why research needs to be independently reviewed. But the same sentences could apply to regulators, who may be biased in favor of regulatory review, and who may underestimate the slowing and chilling effects of regulation on researchers.

A great deal of behavioral science research consists of speech: a researcher wants to ask consenting adults some questions, or show them some pictures or videos, and then record what they say or do. Legal scholar Dale Carpenter has suggested that all IRBs should include a first-amendment expert. That’s not likely to happen. But academic freedom is central to the mission of universities. The AAUP has raised concerns about the effects of IRB regulation on academic freedom. Wouldn’t it be a good idea to make sure that ethics training for university researchers and regulators includes some basic coverage of academic freedom?

Then again, training is itself a burden. Ethics training is necessary, but if you overstuff it, you’ll just lose people’s attention. (The same can be said of consent forms, by the way.) I’d just settle for some basics with perspective. Principles of informed consent; working with vulnerable populations; evaluating risk; protecting privacy. All of these are important matters that researchers should know about – not to keep us from morphing into Nazis, but out of everyday decency and respect.

CITI is still misrepresenting Milgram’s obedience research

Unbelievable.

Two years ago, after taking the Collaborative Institutional Training Initiative (CITI) online ethics training course required by my institution, I wrote to them to object to the way Milgram’s obedience research was characterized in their “History and Ethics” module. Short version: they compared Milgram’s research to Nazi medical experiments and the Tuskegee syphilis study.

In response to my email, I got a very nice sounding reply from a CITI staff member. Quoting from her email:

I agree with you.

The module was adapted from a module written for biomedical researchers. When it was adopted, in order to make it more relevant for researchers in the social and behavioral sciences, the writers simply added cases that seemed more relevant. The important distinctions you note were not made.

I would like to revise the module completely and the only obstacle right now is time.  I will see if I can get some minor changes approved, in the meantime, that will address your issues. One simple solution might be to change the introductory language to the case studies and remove the word scandals where it is not appropriate.

But yesterday when I took my required CITI refresher course, I discovered that the promise was an empty one. Not a word has been changed.

So I’m back to writing to them. Below is my latest email. Below it, I have quoted the objectionable text from the CITI course.

Dear CITI staff:

Two years ago, I wrote to you to object to the way that Stanley Milgram’s obedience research, and other behavioral science research, was portrayed in your “History and Ethics” module. That email is appended below.

The crux of my objection was that CITI mischaracterized Milgram’s research as unethical and drew parallels to Nazi medical experiments and the Tuskegee syphilis study. To the contrary, Milgram’s obedience research was conducted ethically (in fact, it was replicated with IRB approval just a few years ago). It is indeed relevant to contemporary research ethics — not as an exemplar of harmful research, but because of what it teaches us about how research subjects may respond to scientific and institutional authority.

In response, I received a message from Lorna Hicks (appended below) in which she stated that Paul Braunschweiger had forwarded her my email. She stated quite bluntly, “I agree with you.” She assured me that CITI would update its materials. At that time, I was pleased both with CITI’s prompt responsiveness to feedback as well as with the specific substance of the reply.

So perhaps you can imagine my surprise and dismay when I sat down to take the CITI refresher yesterday — two years later — and discovered that Milgram and several other behavioral studies are still being described as “similar events” to Nazi war crimes and the Tuskegee syphilis study. Despite your assurances made two years ago, the module has not been changed to remove the objectionable comparisons.

So once again, I am writing to you to strongly object to your characterization of Milgram’s obedience research. You are doing a disservice to the legacy of an important body of behavioral science research, and you have continued to do so for several years despite agreeing that it was wrong and promising to stop.

Sincerely,
Sanjay Srivastava

Here is the text of the CITI “History and Ethics” module that I objected to, as it appeared both in 2009 and 2011. I have quoted at length to provide the full context, so you can see the comparison for yourself. Boldface emphases have been added by me.

The development of the regulations to protect human subjects were driven by scandals in both biomedical and the social/behavioral research, and as such reflect social concerns regarding research involving human subjects including:

* The importance of meeting the requirements of basic ethical principles underlying the involvement of humans as research subjects

* The need for independent, objective review of research

* The need to preserve the public trust in research involving human subjects

1.0 Historical Development

The events that led up to the development of the currently regulatory system occurred in both biomedical and social/behavioral research.

1.1 Events in Biomedical Research

Attention to the ethics of human subjects research first received wide-spread attention after WWII with revelations of the Nazi “research” which led to the Nuremburg Code, a statement of ethical principles of human experimentation. In 1964, the World Medical Association developed a code of research ethics that came to be known as the Declaration of Helsinki. It was a reinterpretation of the Nuremberg Code, with an eye to medical research with therapeutic intent. In 1966, Dr. Henry K. Beecher, an anesthesiologist, wrote an article (Beecher HK. “Ethics and Clinical Research” NEJM June 16, 1966) describing 22 examples of research studies with controversial ethics that had been conducted by reputable researchers and published in major journals. Beecher’s article played an important role in heightening the awareness of researchers, the public, and the press to the problem of unethical human subjects research.

One of the seminal events in the development of the current regulatory environment was the Public Health Service (PHS) Syphilis Study (1932 – 1972), the so-called “Tuskeegee Syphilis Study”. Initiated and funded by the PHS, this study was designed to document the natural history of syphilis in African-American men. Hundreds of poor, African-American men with syphilis were enrolled into the study. The men were recruited without informed consent and were deliberately misinformed about the need for some of the procedures. This longitudinal study lasted over 40 years until newspaper reports forced the US government to terminate the study. For more information follow the link to the PHS Syphilis Study.

1.2 Events in Social & Behavioral Research

Events contributing to the development of the current regulatory system were not limited to biomedical research; during the same period there were several similar events in the social and behavioral sciences: The Wichita Jury Case (1953) where researchers tape recorded jurors’ deliberations in six cases to measure influence of attorney comments on decision making. The research was conducted with knowledge of the judge and attorneys, but not jurors. The Milgram “Obedience to Authority”(1963) studies which were conducted to determine how far subjects would go in administering seemingly severe electric “shock” as directed/instructed by an authority figure (to continue when the experimenter) to another subject (a confederate) even when the latter subject appeared to be in extreme pain but continued to fail test questions. Humphreys “Tearoom Trade” study (1970), which involved the observation of men engaged in sex acts in restrooms, secretly following them to their cars, transcribing license plate numbers, tracking them through DMV records to their homes and interviewing them about personal issues. The Zimbardo “Simulated Prison” (1973) research, which involved assigning roles to male student volunteers as “prisoners” and “guards”. The research became so intense as physical and psychological abuse of “prisoners” by “guards” escalated, that the researcher stopped the experiment/simulation after six days. See Dr. Zimbardo’s web site for more details on this study.

UPDATE (7/8/2011): I heard back from CITI.

Stanley Milgram was an evolutionary psychologist

And an interactionist. From Obedience to Authority:

A potential for obedience is the prerequisite of such social organization, and because organization has enormous survival value for any species, such a capacity was bred into the organism through the extended operation of evolutionary processes. I do not intend this as the end point of my argument, but only the beginning, for we will have gotten nowhere if all we can say is that men obey because they have an instinct for it.

Indeed, the idea of a simple instinct for obedience is not what is now proposed. Rather, we are born with a potential for obedience, which then interacts with the influence of society to produce the obedient man. In this sense, the capacity for obedience is like the capacity for language: certain highly specific mental structures must be present if the organism is to have potential for language, but exposure to a social milieu is needed to create a speaking man. In explaining the causes of obedience, we need to look both at the inborn structures and at the social influences impinging after birth. The proportion of influence exerted by each is a moot point. From the standpoint of evolutionary survival, all that matters is that we end up with organisms that can function in hierarchies.

I’d take Milgram’s endorsement of interactionism one step further and say that “the proportion of influence” isn’t just moot – it’s nonsensical. You don’t have a human mind unless you have inborn mechanisms and social influences working in concert with each other.

Milgram is not Tuskegee

My IRB requires me to take a course on human subjects research every couple of years. The course, offered by the Collaborative Institutional Training Initiative (CITI), mostly deals with details of federal research regulations covering human subjects research.

However the first module is titled “History and Ethics” and purports to give an overview and background of why such regulations exist. It contains several historical inaccuracies and distortions, including attempts to equate the Milgram obedience studies with Nazi medical experiments and the Tuskegee syphilis study. I just sent the following letter to the CITI co-founders in the hopes that they will correct their presentation:

* * *

Dear Dr. Braunschweiger and Ms. Hansen:

I just completed the CITI course, which is mandated by my IRB. I am writing to strongly object to the way the research of Stanley Milgram and others was presented in the “History and Ethics” module.

The module begins by stating that modern regulations “were driven by scandals in both biomedical and social/behavioral research.” It goes on to list events whose “aftermath” led to the formation of the modern IRB system. The subsection for biomedical research lists Nazi medical experiments and the PHS Tuskegee Syphilis study. The subsection for social/behavioral research lists what it calls “similar events,” including the Milgram obedience experiments, the Zimbardo/Stanford prison experiment, and several others.

The course makes no attempt to distinguish among the reasons why the various studies are relevant. They are all called “scandals,” described as “similar,” and presented in parallel. This is severely misleading.

Clearly, the Nazi experiments are morally abhorrent on their face. The Tuskegee study was also deeply unethical by modern standards and, most would argue, even by the standards of its day: it involved no informed consent, and after the discovery that penicillin was an effective treatment for syphilis, continuation of the experiment meant withholding a life-saving medical treatment.

But Milgram’s studies of obedience to authority are a much different case. His research predated the establishment of modern IRBs, but even by modern standards it was an ethical experiment, as the societal benefits from knowledge gained are a strong justification for the use of deception. Indeed, just this year a replication of Milgram’s study was published in the American Psychologist, the flagship journal of the American Psychological Association. The researcher, Jerry M. Burger of Santa Clara University, received permission from his IRB to conduct the replication. He made some adjustments to add further safeguards beyond what Milgram did — but these adjustments were only possible by knowing, in hindsight, the outcome of Milgram’s original experiments. (See: http://www.apa.org/journals/releases/amp641-1.pdf)

Thus, Tuskegee and Milgram are both relevant to modern thinking about research ethics, but for completely different reasons. Tuskegee is an example of a deeply flawed study that violated numerous ethical principles. By contrast, Milgram was an ethically sound study whose relevance to modern researchers is in the substance of its findings — to wit, that research subjects are more vulnerable than we might think to the influence of scientific and institutional authority. Yet in spite of these clear differences, the CITI course calls them all “scandals” and presents them in parallel, and alongside other ethically questionable studies, implying that they are all relevant in the same way.

(The parallelism implied with other studies on the list is problematic as well. Take for example the Stanford prison experiment. It would arguably not be approved by a modern IRB. But an important part of its modern relevance is that the researchers discontinued the study when they realized it was harming subjects — anticipating a central tenet of modern research ethics. This is in stark contrast to Tuskegee, where even after an effective treatment for syphilis was discovered, the researchers continued the study and never intervened on behalf of the subjects.)

In conclusion, I strongly urge you to revise your course. It appears that the module is trying to get across the point that biomedical research and social/behavioral research both require ethical standards and regulation — which is certainly true. But the histories, relevant issues, and ramifications are not the same. The attempt to create some sort of parallelism in the presentation (Tuskegee = Milgram? Nazis = Zimbardo?) is inaccurate and misguided, and does a disservice to the legacy of important social/behavioral research.

Sincerely,
Sanjay Srivastava

UPDATE: I got a response a day after I sent the letter. See this post: A very encouraging reply.

UPDATE 7/6/2011: Scratch that. Two years later, they haven’t changed a thing.