When Bad Actors Hijack Good Research

On May 14, 2022, a White gunman opened fire in a Tops supermarket in Buffalo, New York, killing ten Black people in a racist mass shooting. Soon after, he posted a 180-page diatribe online in which he attempted to justify the killings. In the document, the gunman cited scientific “evidence” he claimed backed up his views.

One paper he referred to was a 2018 study in Nature Genetics that explored possible links between genetic factors and educational attainment. The shooter cited the article to “prove” that White people are genetically predisposed to be smarter than Black people — but this was not remotely what the article said. The study didn’t compare racial groups at all, and it found that a person’s environment plays a far greater role than their genes in shaping their educational outcomes.

That Nature Genetics article was twisted to particularly violent ends, but scientific research is distorted and misrepresented all the time — and the problem is worsening. In a social climate in which pseudoscience and conspiracy theories are gaining traction more rapidly than ever before, misuse and manipulation of scientific data are major threats. Experts are exploring what they can do to fight their spread — and whose responsibility the battle should be.

Sensitive Research Areas

Plenty of fields have fallen prey to this kind of scientific hijacking. Conservative bloggers and news outlets such as the National Review and Breitbart News have misrepresented climate data, for example, using distorted charts to minimize the rise in global temperatures and to discount climate change. In medicine, the discredited former physician Andrew Wakefield authored the fraudulent 1998 article that purportedly linked the MMR vaccine to autism — and though his work was refuted and his license revoked, his claims have fueled the modern antivaccine movement.

“Science is littered with these kinds of examples,” says Michelle Meyer, PhD, a research ethicist at the Geisinger Center for Translational Bioethics and Health Care Policy. “Knowledge can almost always be used for good and for bad.”

Some areas are riskier than others, Meyer says, but there’s no simple definition of “sensitive research.” It usually refers to research with potentially harmful societal consequences or implications. That could be research on race, sexuality, personal privacy, civil rights, or dozens of other areas — usually within the social sciences.

The risks are not always obvious. For instance, sociologists might ask which characteristics are associated with support for labor unions. That may seem innocuous, but Meyer imagines a scenario in which a company applies machine learning to those attributes or demographics to predict whether somebody supports a union, then sells the model to human resources companies to screen out pro-union hires. “Any time you identify associations, people can use those in malevolent ways,” she explains.

Genetics carries an added layer of ethical liability. The field already has a long, dark history, says Robbee Wedow, PhD, a sociologist at the Broad Institute of MIT and Harvard who co-authored the 2018 study cited by the Buffalo shooter. The pseudoscientific eugenics movement, which misapplies genetic principles to claim that certain groups of people are “unfit” and that only “higher races” can be successful, dates back to the 1880s.

Any historically marginalized trait, any trait with current social importance, any trait linked to a stereotype — these are all “ethically fraught,” says Meyer. When you start comparing these traits between different genetic ancestry groups, she says — in other words, different races — that’s when researchers must tread the most carefully.

How Misappropriation Happens

“Misappropriation is multiple things,” says Aaron Panofsky, PhD, a sociologist of science at the University of California, Los Angeles. Some groups, he says — specifically, White nationalists — stoke the flames of the culture war, discrediting mainstream research by claiming that scientists can’t report honestly for fear of being “canceled.” Others put contrarian researchers on a pedestal, lionizing discredited figures such as Wakefield or the few scientists who question human-caused climate change.

Some create their own branch of fictional science, built on data both real and fabricated, he adds. These include activists pushing the antiparasitic medication ivermectin to treat COVID-19; creationist museums using their own so-called fossils and laboratories to contest evolutionary science; and “flat-Earthers” conducting far-fetched “experiments” to try to prove that the earth is shaped like a frisbee.

Often, says Panofsky, groups misappropriate research by cherry-picking particular ideas, images, or datasets and collecting them to build up a body of “research” that serves their own purposes. These bits and pieces are authentic but become totally decontextualized. “These curation efforts are really powerful mobilizing tools for these bad people, because they don’t have to do the work themselves,” he says.

The Buffalo gunman was heavily influenced by extremist groups on the online forum 4chan who had already amassed a slew of decontextualized facts and wholesale fictions to sell their cause — namely, the increasingly mainstream “great replacement theory,” which has existed for decades and claims there is a scheme to replace White Americans with immigrants. Its proponents’ goal is to incite violent action. Their methods include citing data on the decline in White American birth rates, the increase in multiracial families having non-White babies, and an aging White population. These data are real, but they have been reframed and wildly misrepresented in order to suggest a conspiracy and to justify extremism.

Do Scientists Bear Responsibility?

Some scientists believe in the purity of their research, says Andrew Beers, a fourth-year PhD student researching social media misinformation and disinformation at the University of Washington. It’s a long-cherished idea in the scientific community: that the data speak for themselves, that they need no commentary or contextualization. But many experts disagree, saying that scientists need to shift their mindset and embrace public understanding of their research as part of their job. For Meyer, that means trying to anticipate ways in which things might be misunderstood.

“The way the science hits the world is part of the science,” Panofsky says. Wedow agrees, adding that a scientist’s job is no longer to simply create the science without a stake in what happens afterward. “I think that’s incredibly irresponsible.”

Modern researchers should think about the social context of their work, says Beers, who leads a project on the misappropriation of mask-related science during the pandemic. A positionality statement, for example, is a blurb accompanying a published article that aims to clarify the researchers’ identities, potential biases, or social stances — for instance, if an all-White group of authors is drawing conclusions about Asian Americans.

As scientific conspiracy theories and misinformation loom larger in the public consciousness, says Beers, more researchers are advocating for the use of these statements in order to reduce the likelihood that readers will misinterpret an article’s intentions. For Beers, this means making it clear in his articles that no matter what else he’s reporting, he believes the overwhelming evidence shows that masks work.

Responsible Communication

Academic researchers are under great pressure to publish groundbreaking material. Ideally, says Meyer, a null result would be just as publishable as a positive one — but in the real world, that’s not the case. “There are built-in incentives for scientists to hype their results,” she says — and in the social sciences, that might make research more attractive to bad actors. “That is a powder keg when it comes to this type of research, which can be easily misunderstood and misappropriated,” she adds.

To defuse the risk, researchers have to be as clear as possible about their findings — not only for fellow experts but for laypeople as well. Meyer suggests that researchers embed plain-language summaries within articles, clearly outlining not only a study’s findings but also its limitations. They can also write FAQs that serve as a go-between for researchers and the media, such as the one written by the authors of the 2018 genetics article.

Notably, that didn’t prevent the Buffalo shooting. But the authors hope it might weaken the shooter’s message, delegitimize his views, or prevent others from being swayed.

Scientists should embrace opportunities to involve the communities they’re writing about, says Wedow. He and his team wrote a 2019 article on the genetics of same-sex sexual behavior. Before the article was published, they collaborated with several LGBTQIA+ advocacy organizations through Sense About Science, a nonprofit organization dedicated to improving evidence-based public discourse about science.

The organizations had the opportunity to see drafts of the paper and alert the researchers to how certain marginalized groups might be affected. The organizations advised the authors on the study’s limitations and nuances, which were then addressed in the published study.

Because of that advice, the authors included a statement explaining that they had to exclude from their analysis individuals whose biological sex and self-identified sex/gender did not match. That means the study excludes certain key groups within the queer community — transgender people, for example.

Wedow’s team also created a lay-friendly website and press release for the article, in which they state that there’s no “gay gene” that determines whether a person will have same-sex partners. It’s a lot of work on top of the actual science, but in theory, the extra steps make it harder for anyone to misrepresent the data.

Engaging With “Bad Actors”

The internet and social media, says Panofsky, add a layer of complication to the problem. “Facebook made everyone crazy,” he says. Nowadays, everyone has a platform that is basically unlimited, and the wilder and more contrarian someone’s claims, the larger an audience they can accrue.

The dilemma for scientists is whether to respond to false claims. If you engage — on Twitter, in a letter to the editor, or in any other way — you run the risk of lending credence to positions that are plainly wrong. “You know the ‘Streisand effect’ — denounce your enemy and it gives them attention,” Panofsky says.

People who haven’t yet formed strong opinions are an important piece of the puzzle. “Trying to convince a racist not to be a racist? They’re probably the wrong target,” says Panofsky. “But there’s a big group of ordinary people in the middle who don’t really know what to think about it [ie, a sensitive issue], or never gave it much thought.” These are the people who might be convinced by a confidently stated lie — but may also still be open to the truth.

Extremists, who are used to being in the vocal minority, often come armed with pseudoscientific tools and talking points to back up their arguments, Panofsky says. Scientific advocates, on the other hand, are less equipped for information warfare. “[Right-wing extremists] are arming their people with the tools to fight this ideological fight, and progressives don’t necessarily engage science in that same kind of serious way,” he says.

“There’s a tactical question about what’s the best way to do it, but I’m pretty certain the best strategy is to find ways of counteracting [extremism] that are more than just biology class in high school,” Panofsky says. A comprehensive approach might start early, with media literacy education in schools. Content moderation policies on social media are another powerful tool to limit exposure to extremism — although those policies have themselves become culture war flashpoints.

Restricting Data Access: A Double-Edged Sword

Recently, there’s been a big push for transparency in the sciences, Panofsky says. Everyone, the thinking goes, should be able to see the data. Just last month, the Biden administration directed federal agencies to eliminate paywalls on all taxpayer-funded research. On the one hand, this openness could be a powerful way to engender trust in the scientific establishment at a time when trust is at a low point. But openness comes with its own problems.

“Those values are great, and I’m all for them, but it turns out that openness can be debated and corrupted and turned against itself,” argues Panofsky. “I think that’s happening in the case of these [extremists].” Scientists working on sensitive topics should be cautious about open access, he suggests; after all, extremists can’t misuse data if they can’t see them.

But gatekeeping scientific data rubs some experts the wrong way. “Policing knowledge production and the flow of information — that makes me uncomfortable,” says Meyer. “And I worry about the message it sends. I feel it will play directly into the hands of people who already view science and the academy as far-left indoctrination machines.

“The more you start censoring things, especially if it’s publicly funded, the more you’re really giving them ammunition,” she adds.

It’s a double bind. The way out, Panofsky suggests, is to place scientific data in context. Misappropriation happens when people take data out of context, so the goal should be to make that harder. Perhaps, Panofsky says, the solution involves teaching people the critical importance of the context surrounding any given claim — maybe even requiring them to pass a test of their contextualizing skills before gaining access to a dataset. “How could we build systems that had some kind of ‘context competence’ as part of the requirements for accessing it?” he wonders.

In fact, restricted access could add to the problem of decontextualization, notes Beers. Someone making a false claim can link directly to a primary source, he says, but unless the full article is available via open access, one can only read the abstract, which is often written hyperbolically and without the nuance that may come later in the article. “For instance, some abstracts really seem to say that masks don’t work,” he says. “But when you read the paper, it’s actually heavily qualified, and they work in certain circumstances.”

The Role of Publishers and Funding Bodies

Once a researcher’s work is hijacked, says Beers, it can be difficult to undo the damage. Bad actors cite outdated research all the time, and there’s nothing anyone can do to put the toothpaste back in the tube. But publishers could make it easier for scientists to add context to old work, says Beers.

Epidemiologist Raina MacIntyre wrote an article in 2015 stating that cloth masks should not be used by hospital workers. Current conspiracy theories have decontextualized the article to undermine mask-wearing altogether.

But MacIntyre’s work since the pandemic shows that she’s very much in favor of mask-wearing to prevent COVID-19 transmission — including high-quality cloth masks. She’s written over a dozen articles saying that masks are effective in preventing infection. She’s sent letters to the editor qualifying her 2015 results, and she’s published an entire reanalysis that contextualizes the article, says Beers. And yet, the 2015 article remains what Beers calls the “Bible of antimask studies” — in fact, it has been tweeted over 46,000 times.

Right now, when you visit the page hosting the so-called antimask article, there’s no way to tell that MacIntyre has done any of that other work. Her more recent material, Beers says, is not clearly linked on the websites where it’s hosted (PubMed, for example).

Readers of the 2015 article should be immediately directed to a big red box alerting them that she has painstakingly reframed those results in the context of COVID-19, Beers argues. “These scientists don’t have a chance to go back and contextualize their work very easily.”

Journals, he says, should encourage authors embroiled in a controversy to submit public contextualizing statements. They should display those statements prominently, with updated website interfaces. It would be a huge undertaking, but worth the effort — and some technical work is already being done to make it a reality, he says.

But even if journals encourage researchers to contextualize articles and discredit misleading claims, the researchers may not want to; every moment spent fighting disinformation is a moment not spent actually doing research. “Traditional incentives are not well aligned with this kind of work,” says Beers. “You don’t get tenure for writing letters to the editor.”

Some researchers may be able to build funding for communication and mitigation into their grants. But structurally, says Wedow, researchers are not given enough resources for outreach and communication. “I think the NIH and NSF need to start taking these things really seriously and making money and room and time for these types of endeavors in the funding process.”

Teaching Science Differently

Many people learn early on that atoms are little round objects, that humans evolved from apes — or that our genes determine our traits. None of these things are entirely true, but simplification is a core element of science education.

It may be a convenient way to explain complicated concepts to children, but in the case of genetics, it might be setting children up to be misled later on. “You think of Mendel’s pea plant in high school — no human outcomes work like that,” says Wedow.

Unfortunately, he says, learning genetic determinism in school makes it hard for people to understand that genetics is more about probability and statistics than it is about causation. “Those Punnett squares are really good pedagogical devices, but now a lot of people think there’s a gene for IQ, or there’s a gene for who’s a better athlete,” he says. “Because of that, people can easily misuse the data to appropriate racist ideas. And indeed, that’s exactly what happened with the Buffalo shooter.”

Science education, just like scientific research, has a social context. A greater focus on scientific and digital media literacy in schools might better equip students for the real world.

A New Generation of Scientists

There’s no “slam dunk” solution for preventing research hijacking, says Panofsky. But experts hope that the research community — individuals and institutions alike — can make a difference through a multipronged approach. “If you have someone who is hell-bent on doing something, it’s really hard to stop them,” says Meyer. “But that doesn’t mean that you can’t make it harder and more costly for them.”

Equipping scientists to combat the misuse of their work is a long journey, says Wedow, and one that probably won’t end in this lifetime. But he is optimistic that the next generation of scientists will take research misappropriation more seriously. “In some ways, on the heels of #metoo and Black Lives Matter, scientists are being called into action in a different way than they were before,” he says.
