Crossing the line with ‘intimate truths’: Can we do ethical research on sensitive topics?

By Scott Maher

Researchers are often asked to investigate sensitive topics — things that are very private, personal, embarrassing, or even incriminating to research participants. This is an ethical minefield.

I’d like to share a story about something that happened at UX Australia’s Design Research 2017 conference this afternoon, as a way into talking about the ethical challenges of doing research on sensitive topics. (Side note: the conference was fantastic. You missed out if you weren’t there, but you can catch up when they post audio and slide decks of all the presentations soon!)

A funny (and not “ha-ha”) thing happened at the conference

In the second-last talk of the day, Mary Landrak* from ThinkPlace presented some examples and reflections from research in and outside of Australia. One of the questions she raised was about methods and truth (a question she also captured in a tweet).

To demonstrate this challenge, Mary then showed us two slides. The first contained the title “Ice (un)breaker,” and she gave us instructions for the activity: stand up, find someone next to you, ideally a stranger, and then follow the instructions on the next slide. The instructions were:

Discuss with the stranger next to you:

When was the last time you engaged in risky sexual behaviour.

Here’s how that went for me.

When we were asked to stand, one of a group of three young women sitting in the row in front of me volunteered to be my icebreaker partner. Very kind. The only other person in my row was four or five seats away.

When the icebreaker prompt was shown on screen, a lot of chatter broke out in the room. We looked at each other and both began to blush.

Me: Well, this is awkward…

Her: [silence, looks at the screen, then at me, then at the screen]

Me (and this is my first big mistake): Uh, [nervous laughter] you go first.

Her: [silence, looks toward the other women in her row, looks at feet, looks back at me]

Me: No, you don’t have to…

Her: Well, last…[quickly tells me an answer to the prompt, then looks away, then looks back at me with what appears to be a fair bit of anxiety on her face]

Me: Oh, wow… uh, um [thinks ‘oh crap, what do I do now?!?’]. Oh, time’s up. [nervous laughter, sits down, feels shame for several minutes]

It was a visceral experience of just how uncomfortable it can be to talk about something very intimate with a stranger.

The prompt is loaded with layers of meaning: it asks participants to not just discuss a sexual experience, but specifically to talk about “risky sexual behaviour.” That’s even more loaded with moral baggage. Framing sexual behaviour as “risky” already positions it as morally wrong and irresponsible. And risky sexual behaviour may or may not be consensual: being a victim of sexual assault could easily be construed as “engaging in risky sexual behaviour.” In terms of ‘intimate truths,’ this is heavy stuff.

Let me pause here to do what I should have done in the moment:

I apologize to the anonymous stranger who volunteered to join me in an icebreaker activity and who may feel embarrassed, intimidated, or worse as a result of the interaction. I wish I had been more aware in the moment of just how bad it was for me to say: “you go first.” I regretted it almost instantly, and am sorry.

In our society, an older male telling (not asking) a younger female stranger to “go first” in divulging private, intimate information is not cool. I don’t think I need to explain here how it reflects male privilege and patriarchy.

The ethics of the ice (un)breaker

I can see both sides of an argument about whether this activity was appropriate for a conference.**

On one hand, I appreciate the chance to experience first-hand the discomfort and stress that research participants might feel. I’m a big fan of experiential learning, and given that researchers are often more privileged in general than participants (and as I’ll discuss below are almost always so in the context of research), intellectually discussing the discomfort a participant might feel is not likely to be as effective. I hope all of those present will remember this experience and use it to have more empathy for the research participants we depend on for our livelihoods.

On the other hand, we were given no advance warning that we might face a difficult situation during the talk. None of us could have consented to the activity, because none of us had any prior information it was coming, and the context and manner in which it was presented gave little opportunity to opt out.

Further, what active teaching might have accompanied this moment was lost in the din of a crowd of researchers chattering awkwardly (and, for some, with mild outrage). This could have been executed much better than it was — even with prior information and a reminder that we could opt out, it could still have been an effective active learning experience.

I do appreciate that this experience gives me an opportunity to reflect on the ethical challenges raised in researching sensitive issues, and to write something that may be useful to other researchers.

Making this meaningful

Taking today’s experience and applying it to research on sensitive topics in general, I think there are three key things to cover: informed consent, privilege and power, and the risk of harm to participants.

Informed consent

As far as I’m aware at the moment, it’s virtually universal that research involving humans requires participants to give informed consent before the research begins. People must never be compelled to participate in research activities (so, research with prisoners and other marginalised people is particularly high risk in ethical terms).

They must have sufficient capacity and information to freely agree to participate in the research (so research with children requires more careful scrutiny, and deception about the purpose, process, or topic should be avoided unless it is absolutely necessary and the benefits outweigh the risks).

Researchers must also always recognise and respect the right of research participants to not answer a question, to not participate in part of the research, or to end the interaction altogether. No means no, and stop means stop. And we must not penalise participants for exercising this right.

Privilege and power

As researchers in businesses, governments, non-profits, or academic institutions, we often feel less-than-powerful. We’re typically small (underfunded, underappreciated) fish in a big pond.

In interactions with research participants, things are different. Where participants are promised incentives, we may appear to hold the power over whether they receive the incentive — thus making people feel compelled to continue through discomfort or distress, even if we don’t intend it.

As representatives of corporations, governments, universities, or other institutions, we may also carry the air of authority and power by association. In those instances, we are the institution, and the participant is often much smaller and less powerful — they are patients, customers, citizens, employees; Davids to our (implied) Goliath.

We must always remember — as I obviously forgot in today’s icebreaker — the role of gender, age, race, status, education, position, ethnicity, and so on in shaping the relationship between researcher and participant, even (especially?) a fleeting one. Are there social and/or cultural forces in play that disempower participants and empower researchers when it comes to the information produced in our interactions? Yes, always! And be careful you don’t exploit those.

We must remember that as researchers, we are not extracting confessions like the CIA or the Spanish Inquisition. We must behave in ways that realise the inherent human dignity of research participants. We have to hold ourselves to a higher standard in the context of research than we do even in our day to day lives.

Do no harm

The Do No Harm principle is huge and complex in itself. It’s the first of the seven Principles of Professional Responsibility of the American Anthropological Association (AAA) for good reason. It’s also an area of significant ambiguity, which allows for debate and should guide deep consideration for any human research project. I would argue it should also be a guiding principle of business and innovation, but that’s a bit out of scope for today.

For the AAA, researchers must carefully consider what harm might come of a project before starting any research, and they must continually assess whether the risk of harm changes during the course of a project. They also recommend that “anthropologists should not only avoid causing direct and immediate harm but also should weigh carefully the potential consequences and inadvertent impacts of their work.” This holds whether the research is a purely academic endeavour, or intended to bring about some sort of positive change in the world. Unintended consequences can be just as harmful as intended outcomes — if not more so.

Harm can be understood and measured in many different ways — physical harms, like inflicting temporary pain or doing longer-term damage, are the easy ones. Much more difficult are cases where social, emotional, or psychological harm may be involved. Are you discussing things a participant would be embarrassed to have other people find out about? Then you need to be extra careful with confidentiality, at the very least. Are you asking people to tell you about traumatic experiences from the recent or more distant past? Red flags here!

Even when research is intended to support outcomes that improve participants’ and other people’s well-being, researchers and our sponsors/clients must weigh the potential benefits of a research project against the risk of harm to participants. Academic research ethics boards often do this very conservatively, and in that setting research is most likely intended to produce social good. In business settings, we should be weighing the risk of potential harm to research participants and customers (heavily) against the potential benefits to society and the business.

So what do we do about ‘intimate truths’?

Getting back to the methodological and ethical question at hand — we know that there are many reasons to conduct research about topics or experiences that might be sensitive, uncomfortable, or inherently risky to discuss. So how do we go about it, ethically?

Can we get intimate truths in an exploratory interview? Maybe. Can we do it without imposing our privilege and exploiting participants, however subtle that may be? Hmmm.

We must respect the dignity of participants, and their right to informed consent and to withdraw from research at any time. We must also be prepared to mitigate harm done by the research we undertake — and not just in terms of liability to ourselves, our clients, or our institutions.

Where researchers need to uncover intimate truths about people’s behaviours, attitudes, and experiences, we must be prepared to invest time and energy in developing trust. Trust is what makes it possible for people to share accurate information with us about their intimate lives, their secrets, their insecurities. Trust — not power — helps us get at truth.

To develop trust, we almost always need to develop relationships and engage in reciprocity, with patience, sensitivity, humility, and most of all respect for the people whose lives we enter.

Fortunately, several of the talks at today’s conference highlighted these points. It’s a good start.

Updates:

*When I first published this essay, I also wrote to Mary directly to thank her for her presentation and tell her about my mixed feelings about the activity she used. I also wanted to send her a link so she could read what I had written.

Mary’s reply was thoughtful and professional, and she both clarified her intention and owned her misjudgement of the audience’s response. She acknowledged that the exercise had gone very differently for some of us than she had expected when planning it, and that she regrets the offence it caused.

She also explained that her intention was to highlight the complex challenges we face as researchers when we need to elicit and explore deeply personal experiences and behaviours. This activity was meant to be an engaging moment for the audience at the conference and a means to open up discussion.

With the benefit of hindsight, we can all say that this situation could have been constructed differently to achieve those goals.

**UX Australia, in their event wrap-up email, apologised to attendees for any discomfort caused by the activity. They noted in the email that they have processes in place to prevent this sort of thing, including asking “all speakers to discuss with us anything that might be sensitive or cause discomfort for attendees.”

They also wrote that “We have spoken to the speaker about it and she is not welcome to speak at one of our events again.”

This post originally appeared on Medium.com.

AAA Comments on Notice of Proposed Rule Making for IRBs

Post authored by Lise Dobrin (University of Virginia)

 

Below are some excerpts from the 18-page comment submitted by the AAA to the Office of Human Research Protections on January 6, 2016, in response to the proposed changes in the “Common Rule”, the federal regulations that motivate the system of research ethics review that is implemented by IRBs. The AAA comment was authored on the AAA’s behalf by Rena Lederman (Princeton University) and Lise Dobrin (University of Virginia). An overview of the Notice of Proposed Rulemaking (NPRM) and the full text of the AAA’s response can be found here.

 


On the NPRM’s proposal to expand the definition of “human subject” to include even non-identified biospecimens:

The American Anthropological Association is in general accord with the principle of “autonomy” (or “respect for persons”) underlying this NPRM proposal to change the definition of Human Subject. Anthropologists and their study participants have objected to the reduction of biospecimens to “data” (i.e., values detachable from their sources); they have pointed out that blood, tissue samples and the like can come to stand for persons and be invested with specific social, cultural, and ritual values.

 


On the problematic omission of sociocultural anthropology’s signature methods from both the Common Rule and the proposed rule change:

Our first and most important general comment is that several of the proposed changes will deepen, rather than alleviate, ambiguity. This is especially true with respect to sociocultural anthropologists’ most characteristic research activity – “participant observation” (also referred to as “ethnographic fieldwork”, “fieldwork”, and similar terms) – which finds no place within the existing Common Rule at all. Insofar as the proposed changes likewise make no mention of participant observation, anthropologists and others who employ this approach—along with their IRBs—are left entirely in the dark. This situation promises to keep ethnographic field projects that rely on participant observation in “expedited” or “full board” categories when according to the logic behind the NPRM they should be “exempt” or “excluded”.

[A]nthropologists preparing to undertake “participant observation” do not understand themselves as conducting “interviews”. Instead, they are trained to appreciate that interviewing and participant observation are distinct (indeed methodologically opposed) activities: the former is an investigator-controlled interaction (the researcher asks a more or less predetermined set of questions keyed to his/her relatively well-defined research agenda) while the latter is basically a participant-controlled interaction (the researcher is responsive to his/her hosts’ agendas of activities, topics, and the like). This means that fieldworkers’ honest answers in response to [questions intended to establish exemption] would not enable them to meet the regulatory criteria for the exemption. This runs counter to the spirit of the NPRM proposal as we read it. We therefore ask that if the proposal to exempt low-risk research through interactions is adopted as described in the NPRM, “participant observation” be specifically listed among the activities that count as “low risk” for purposes of the exemption.

 


On the notion of “generalizability” as establishing a need for ethical oversight:

Reference to “generalizability” points to the heart of the problems inhering in the Common Rule definition of “research”. “Generalization” raises distinctive ethical problems within biomedicine as a result of the slippage between a doctor’s commitment to provide individual patients with personal care and the doctor-researcher’s commitment to socially-beneficial generalizing research. But it does not usefully diagnose a need for regulation across the board.

 


On the likelihood and severity of risk not greater than what is “ordinarily encountered in daily life” as a regulatory criterion:

Because researchers engaged in participant observation and ethnographic fieldwork interact with participants in the course of daily life on those persons’ own terms as a matter of methodological principle and because researchers do so without deception as a matter of ethical principle – that is, because they do not remove participants from their daily lives in order to engage in research or otherwise engage in research manipulations – these standard anthropological methods are consistent with the spirit of “minimal risk”, that is, they involve a “probability and magnitude of harm or discomfort” that is not greater than what is “ordinarily encountered in daily life”. Indeed, participant observation and ethnographic fieldwork more nearly approximate daily life than do any of the other activities currently listed on the OHRP expedited list, such as “surveys”, “interviews”, and “focus groups”. For that reason, we favor “excluding” participant observation/ethnographic fieldwork from the Common Rule.

 


On the need to determine confidentiality requirements on a case-by-case basis:

[W]hile anthropologists appreciate the importance of keeping data confidential when appropriate, it is not the case that information (e.g., stories or recorded texts) shared with investigators by participants in anthropological research should always be kept confidential along the lines of protected health information. To the contrary, one important purpose of anthropological research is to document knowledge for future use, whether by members of the community being investigated or by future researchers. The insistence on adherence to any privacy safeguards irrespective of the situation of research (including the wishes of the participants) runs counter to the context-sensitivity required for the ethical conduct of anthropological research, and in fact contradicts the AAA Statement on Ethics, which calls upon anthropologists to balance the protection of research participants and their communities with the careful preservation and judicious dissemination of their research records.

 


On the NPRM proposal to exempt low-risk research from IRB oversight:

We believe that anthropology’s most distinctive method of research, participant observation – if it does not fall completely within categories of “excluded” activity – falls within [the category of activities deemed “exempt”]: it involves the collection of information through open-ended interactions with participants in ways that are, as a matter of methodological principle, not under the researcher’s control but responsive to constraints imposed by study participants in their own daily life contexts. Because this exemption would support anthropologists’ ability to apply their professionally- and experientially-honed ethical judgment in an active, responsive, and situation-specific way as their research unfolds, the AAA supports this proposal, which decreases the burden on researchers to seek administrative review from those who have less knowledge about the risks to participants than they themselves do, while simultaneously diminishing (or at least not increasing) the risks associated with the research.

 


In response to the NPRM proposals about obtaining and documenting informed consent:

Among the proposals made in this section is one that would allow “a waiver of the requirement for a signed consent form if the subjects are members of a distinct cultural group or community for whom signing documents is not the norm” (FR 53977, 54055). The AAA supports this provision enthusiastically. Consent needs to be tailored to the social and cultural context of the research community if it is to be meaningfully informed. Yet despite the diversity of cultural settings in which anthropological work takes place, anthropologists frequently find themselves called upon by their IRBs to document consent in ways that make no sense to their study participants. In such situations signing consent forms offers the participants little in terms of protections, and unnecessarily burdens researchers who are caught between the expectations of their IRBs and the perspectives of their study populations. Moreover, the standard IRB-driven requirement for documentation of consent can interfere with rapport (that is, following local norms of relationship-building), a methodological prerequisite for effective ethnographic fieldwork.

We encourage Common Rule revisions to go even further, and explicitly recognize the need for “emergent” consent in the case of participant observation, where understandings of the research questions, and hence the potential risks and benefits of the research, develop dialogically (in culturally appropriate encounters, usually by means of conversation) not only for participants but for researchers over the course of their interaction. For this reason, even when it is obtained orally, consent in anthropological fieldwork cannot be construed as an “event”, like listening to a script and agreeing to some or all of its terms.

 


On the awkwardness of continuing review for much anthropological research:

The NPRM proposes to eliminate the requirement of annual continuing reviews for “minimal risk” studies, i.e., those that qualify for expedited review. The NPRM also proposes to eliminate review of such minimal risk research once it has proceeded beyond data “collection” to data “analysis”, unless justification is given for why continuing review is called for. The AAA strongly supports these proposals, which help clarify the application of the Common Rule to anthropological research. It is standard practice for anthropologists to reflect on, analyze, and write about their field research experiences for years and even decades beyond the actual research encounters with study participants. This disciplinary norm makes it difficult or impossible for fieldworkers to identify a point at which their studies have ended.

 


On the category of “vulnerable populations”:

We applaud the NPRM’s proposal to clarify reference to “vulnerable populations” such that the consideration of relevance is specifically vulnerability to coercion. Categorizations of persons into a priori types cannot always be successfully applied across the board and apart from context; indeed, even reference to “children” must take into account local cultural conditions (e.g., the relevant distinction may be ritually initiated vs. uninitiated boys). The Common Rule specification of “vulnerable populations” may make sense in most situations in the U.S., but not in other settings where social categories may differ. Moreover, as is often emphasized by disability advocates, there is no reason to assume that being assigned to a given category of persons—those “physically disabled”, those “economically or educationally disadvantaged”, or any other group—necessarily puts individuals at risk of harm when those individuals engage in any area of life, research included. Indeed, the operative concerns—as the NPRM proposals attempt to acknowledge—are (1) the Belmont principle of Justice which calls for equitable opportunity to participate in research, and (2) the necessity for IRBs “to give consideration to [board] membership expertise” when they evaluate protocols involving study populations with which Americans are not generally familiar (CFR p.53989).

Abandoning informed consent?

Editor’s Note: Kirsten Bell was invited to contribute this post based on her recent American Anthropologist article, “Resisting Commensurability: Against Informed Consent as an Anthropological Virtue.”

Kirsten Bell

Department of Anthropology

University of British Columbia

 

There’s a courtroom scene in Rob Reiner’s film A Few Good Men (no, not “you can’t handle the truth”) where a physician has just provided damning testimony against the clients of Lieutenant Daniel Kaffee (Tom Cruise), Lieutenant Commander JoAnne Galloway (Demi Moore) and Lieutenant Sam Weinberg (Kevin Pollak).  Kaffee lodges his objection to the physician’s testimony and the judge overrules it.  Galloway then “strenuously objects”, which merely causes the judge to dig in his heels.  Afterwards, an annoyed Weinberg says to Galloway: “I strenuously object?  Is that how it works?  Hmm?  ‘Objection’.  ‘Overruled’.  ‘Oh no, no, no.  I strenuously object’.  ‘Oh, well, if you strenuously object then I should take some time to reconsider’”.  His point is that the formulation is both redundant and absurd.

In this post I want to consider another absurd redundancy, namely “fully informed consent”.  However, while “strenuously objecting” has never caught on as a legal strategy, “fully informed consent” makes an appearance in everything from the Stanford Encyclopedia of Philosophy to the AAA Code of Ethics (see “be open and honest regarding your work”).  Surely, consent is either informed or it isn’t, so why the qualifier?  In my view, its addition reveals the central problem with the concept itself, namely, that while it seems self-evident and straightforward, the more deeply you look, the murkier it becomes.  That vapid adverbs have been piled on in the hopes of clarifying its meaning (what’s next? “Really, truly informed consent”?) suggests that we are dealing with the fallacy of misplaced concreteness.

It’s worth bearing in mind that “informed consent” is a hypothetical construct.  Even the Belmont Report, the document responsible for enshrining the doctrine as part of a Holy Trinity of universal ethical principles, was hedging its bets on whether it was, in fact, possible.  As others before me have noted, it becomes especially meaningless when applied to anthropological research.  What on earth does it mean to state that, “Anthropologists have an obligation to ensure that research participants have freely granted consent”?  Freely granted consent to what?  To be studied?  To be written about?  (These are not the same thing.)  And how can one share “the expected outcomes [and] anticipated impacts of the research” when these are unknown at the outset?  Again, statements like these might seem self-evident, but upon closer inspection appear to be rather empty.

These are not new concerns.  What is relatively new is the AAA’s embrace of the informed consent doctrine 16 years ago and the lack of debate the concept engendered in the recent online discussions about the proposed revisions to the Code of Ethics.  All this would suggest that despite its problems, it has reached the status of an unassailable value.  After all, how can one be against informed consent?  That this has become our default response to critics of the doctrine suggests that it has become a totalizing frame, one that crowds out a more complex reading of the distinct issues it currently collapses, from community permissions to conduct fieldwork, to communication about one’s research on the ground, to the politics of writing and representation.  My suspicion is that untethering discussions about anthropological research ethics from the stranglehold of “informed consent” will allow us to talk in a more meaningful way about the complexities of communication about our research, which are far from the self-evident, straightforward, primarily technical transaction the doctrine implies.

In sum, despite its apparent unassailability, there are good reasons to take stock of our ethical equipment to ensure it does what we think it does and what we want it to, otherwise we are in danger of ending up like misguided Vizzini (Wallace Shawn) in The Princess Bride, another Rob Reiner film.  After he misuses the term “inconceivable” for the dozenth time, Inigo Montoya (Mandy Patinkin), his hired swordsman, remarks: “You keep using that word.  I do not think it means what you think it means”.

The Ethics of Research on Facebook

AAA Committee on Ethics

Since Facebook and similar sites are explicitly public forums, does the analysis and use of imagery and text posted on social media sites require an informed consent process? Recently the AAA Committee on Ethics discussed this question in response to a query from an AAA member as to whether there exists a formal policy on ethical praxis and research with social media sites. The short answer: there is no such AAA policy or statement. So we asked current and former Committee on Ethics members for their views on this matter.

One commenter highlighted the fact that there are important distinctions in the terms of engagement people assume for public online forums vs. social media sites with restricted memberships. If there are thousands of members and anyone can join the conversation (meaning, you don’t have to be invited), then confidentiality is not an issue because the forum is a public arena, open for anyone to read the conversations, connect them to multiple areas of inquiry, and possibly quote them in their research or in other contexts. On the other hand, if membership in the group is restricted in some way, there would be an ethical problem if the researcher were pretending to be part of the community and then using its discussions without permission in her research.

Facebook poses an interesting dilemma in this regard because of the multi-tiered “friend” structure and multiple possibilities for security settings. There is legitimately no expectation of privacy on Facebook, yet, in practice, many users forget that.

As another committee member suggested, a reasonable person would not expect to find their Facebook comments reproduced in other contexts with interpretative frameworks applied to them. Thus, proposed research relying on Facebook content might prompt some human subject oversight committees (IRBs) to require that the researcher post an announcement of research intent on her Facebook page. Ethical praxis means active avoidance of deception or betrayal in the research/subject relationship, as well as avoidance of the perception of deception or betrayal. If a researcher does not tell anyone that she is collecting data for analysis and dissemination, that may amount to covert research.

A proactive way to confront these issues is to figure out a way to convey the message that users are researchers as well as participants in the social media forum. For example, the researcher might identify herself as a researcher on her own Facebook page, state that she is using quotes from the lists she belongs to and name those lists, and include a link to a pdf of her research design.

Other ethical issues arise when publishing quotes or imagery from such research. Given global access to “Google” and other search engines and software, it is far too easy to identify the original author of any Facebook or other social media post. Pseudonyms are inadequate. Best practice here would be to ask research subjects for permission to use direct quotes and imagery, indicating the context in which the material would be used, and giving subjects the opportunity to opt-out of direct quotation (in which case the researcher can always summarize or paraphrase).

For additional resources on this issue, check out these blog posts that explore the lack of consensus among researchers about the parameters and expectations of privacy and the boundaries of the public:

Michael Zimmer: Is It Ethical to Harvest Twitter Accounts without Consent?

Online Papers on “Research in the Facebook Era”

See also federal advisory committee guidance on Internet research: Secretary’s Advisory Committee on Human Research Protections (SACHRP), “Considerations and Recommendations Concerning Internet Research and Human Subjects Research Regulations, with Revisions” (2013).