Posted confidentially on the author’s behalf by the ethics blog editors
In order to minimize risks to informants, students setting out to conduct dissertation research submit human subjects protocols to be reviewed by their Institutional Review Board. But what happens when you find yourself working or consulting in non-academic settings, where there is no IRB to answer to, and where the standards for work products and “intellectual property” differ from those surrounding research in academic settings? This is something most anthropologists never learn while their sights are set on a university professorship. As anthropologists increasingly explore non-academic roles, we need to know how to protect ourselves, protect our informants, and avoid misunderstandings—and even lawsuits—in a business world that operates according to a different set of norms and cultural rules. I learned this lesson the hard way.
While working as an Executive Director for a non-profit organization, my training as an anthropologist specializing in gender, health, and the Middle East put me in a position not only to design and implement programs, but also to evaluate their successes and challenges. Our organization had several projects underway across the Middle East. I proposed to undertake an evaluation of a center we had helped fund in the Occupied Palestinian Territories. The goal was to assess how well the services our center offered were meeting community needs.
With the customary assurances of confidentiality, I began to conduct audio-recorded interviews with people who used the center. At first I asked simple questions like: How far do you have to travel to reach the center? How could the center better meet your needs? But when women began telling me stories of their experiences of living under occupation—the impact of the wall on their lives and villages, the settlers who threatened their children and livestock at gunpoint, the soldiers’ harassment of their sons and daughters, their sickness due to Israeli settlement waste trickling down the hills into Palestinian fields and causing outbreaks of E. coli and other sewage-related diseases, the destruction of Palestinian orchards and homes, and the constant, unrelenting harassment on every level of life—I could not stick to simply asking how many times a week they used the center and which programs they found most useful. Clearly, there was no way to understand the center’s functioning apart from the entire context of life under occupation. As an anthropologist, and as a human being who could see these people’s suffering, I could not keep my focus on the center alone when people had stories of trauma they were eager to share.
But much of the information I recorded could potentially put people in danger if their identities, or even locations, were ever revealed. I obtained informed consent orally rather than in writing, as many informants were illiterate and, more importantly, I knew that having their names on paper could put them at risk if the information was seized by officials at the checkpoints or the airport. In fact, I didn’t even carry the tapes out of the country, but sent them by DHL instead, for the same reasons. I emailed my field notes to myself at a different email address and then deleted them from both my computer and my primary email account. In writing up my observations, I used only initials rather than the full names of anyone I spoke to. In short, I made a concerted effort to minimize risk to my informants. But I didn’t think about who actually owned the original materials. I just assumed I was being paid to provide a report, and that the recordings and notes were mine.
While in the field, I understood my materials to be two separate sets of information. One set was interview data directly related to the center, from which I would redact people’s identities before writing up the report for my employer. The other set, unrelated to the center and my employment, explored the experiences of Palestinians living under occupation. I planned to use this latter information to write separate articles, as every Palestinian I spoke to pleaded with me to share their stories with the outside world.
My employer was fine with this way of thinking at the time. But some months after my return, my position was phased out, and the organization ordered me to turn over all of the tapes and notes from the work I had conducted in Palestine. I refused, on the grounds that (1) I had collected the data, (2) it contained sensitive information, (3) people had placed their trust in me when they consented to be interviewed, and (4) the responsibility to protect the interviewees rested with me alone. I concluded that I could not in good conscience hand over any of it.
I then received a letter from an attorney informing me that the organization had initiated a lawsuit against me, maintaining that I had stolen trade secrets and that the material belonged to them since they had paid my salary and funded the research. My employer’s stance was that this was intellectual property belonging to the company. My stance was that this data was protected by an obligation to “do no harm” to human subjects, and that social scientists have an obligation to protect their consultants’ identities—particularly in this high-conflict, militarized zone where lives could be at risk if certain information got out. I also felt that my own basic rights were being threatened: some of these field notes were like private journal entries; they contained my personal observations and emotions.
I started researching laws surrounding protections of human subjects, privacy laws, and cases involving other social scientists who had faced similar unfortunate circumstances and how they handled their situations. I found I was not alone in this predicament.
Sheldon Zink spent 18 months conducting ethnographic fieldwork at the University of Pennsylvania with patients who had received artificial hearts. When one of these patients died, the family sued the hospital, and the hospital’s legal defense team subpoenaed Zink’s field notes (2003). Although she was prepared to spend time in jail to protect her informant’s confidentiality, at the eleventh hour she managed to strike a deal to redact the notes and turn over an analysis rather than the raw data. Sociologist Rik Scarce conducted his field research on eco-terrorism. In 1993, he spent five months in jail for contempt of court after refusing to divulge his sources, one of whom was under investigation for a 1972 murder. Zink and Scarce were employed in academic settings, yet they too did not have adequate protection for themselves or their informants. While it might seem logical that the protection of anthropological consultants be consistent with the privacy laws that protect the patients of medical professionals—or at least journalists’ sources—legally, this is not the case. But these cases were not exactly equivalent to the situation I was facing: who owns the data in a corporate or non-profit setting?
In most university settings, researchers are legally the ‘stewards’ of their data: while the researcher is the one responsible for ensuring that the research is conducted ethically, it is the university that technically owns the data. For anthropologists who do consulting work, or conduct anthropological research as employees of non-profit organizations or businesses, there isn’t usually an IRB to report to, and the employee or consultant is paid for the work they do on a particular project.
Anthropologists are increasingly bringing their skillset to other work terrains in corporate America and the non-profit world. Whether we step into the Intels, Googles, and Facebooks, or into Relief International and other foundations, we have to come to terms with the fact that the standards regarding research materials may be very different from how we’ve been trained to think about our data. We need to become adept at negotiating contracts that explicitly spell out who owns the data and what information is being paid for before we engage in fieldwork. Every social scientist who conducts qualitative, ethnographic work with human subjects should legally establish (1) who owns the recordings, notes, and other data they produce, (2) to what degree they may use these data in their own publications, and (3) that they are being paid to provide analyses and reports of their findings with all identifiers redacted—not the raw data—in keeping with our ethical obligation to protect human subjects. We must also consider proactively how far we would go to protect our informants: Destroy tapes? Spend time in jail?
If this case had gone to court and I had won, it would have set a precedent for social scientists’ right to protect their data. If I had lost, however, the outlook for the protection of our work, and of our human subjects, would have been far less rosy. Luckily for me, after a year of attorneys’ meetings and significant financial and emotional costs, my former employer dropped the case, we reconciled our differences, and I was able to stand by what I believe in. But I also learned the hard way that naiveté is a luxury we cannot afford when it comes to anthropological “data security.” While we are very fortunate that companies and organizations now recognize the great value of qualitative anthropological work—and ethnography is becoming the new buzzword in much of corporate America—we also have to adapt to an entirely different set of cultural circumstances. Other people’s lives could very well be on the line if we venture into these arenas blindly.