This article was also published by the NeuWrite SF column in Synapse, the UCSF student newspaper.
The chair squeaked quietly as I fidgeted, swiveling left, right, left. I sat toward the back of a long wooden table flanked by my fellow graduate students, while a pair of eminent biologists led a discussion on how to talk to skeptical non-scientists about evolution. Perhaps it was an intrinsic bias instilled by our families, or a side-effect of years spent steeping in the world of science; whatever the case, as we sat in that conference room we discussed those who dared to disbelieve as if they were at best misguided lambs to be converted and at worst heretical fanatics, frothing at the mouth and waving crosses wildly to fend off fish with feet.
The unspoken agreement around that table was that a person who didn’t trust or value the scientific consensus must be confused, overzealous, or just plain stupid. Except, I know these people. I’ve met them, I’ve talked to them—I’m even related to a handful of them, and they’re perfectly intelligent and totally sane. They just don’t have any reason to consider alternatives, because, as they understand it, evolution isn’t affecting them on a day-to-day basis. If I don’t need Jesus in my life, maybe they’re fine without Darwin.
That being said, there are topics that bear more relevance to everyday life than our familial relationship with monkeys. When it comes to climate change or vaccination, refusal to even consider scientific evidence can lead to large, sometimes costly problems.
That a significant segment of society rejects scientific evidence is a symptom of an even bigger issue: among the general public, there is a broad lack of understanding about science, while among scientists, few attempts are made to understand non-scientists’ diverse perspectives.
The canyon between scientists and non-scientists undermines productive conversation and limits society’s ability to solve pressing issues. It doesn’t seem like the gap is shrinking, so, short of some tectonic upheaval, we have to build a bridge. Scientists need to be capable of respectful discourse regardless of the audience, and the public needs to cultivate a better understanding of what science really is. Only by building this proverbial bridge from both sides will we be able to make progress at the intersection of science and society.
Easier said than done.
In the realm of science communication, recent years have seen a sharp uptick in “calls to action,” everyone’s favorite genre of think piece. These articles urge scientists to view communication as a “moral imperative” [1] and to address the ongoing “communication crisis.” [2] They have been effective to an extent, prompting changes to scientific training and a modest increase in communication to the public. [3] But sheer volume isn’t enough. The problem has been misidentified as one of quantity, when really, it’s an issue of quality that’s impeding scientific communication today. Whether intentional or accidental, we scientists tend to twist every science-related conversation with non-scientists into an “us versus them” situation.
Where does this come from? Well, it can be traced to at least two different sources. First, we work and live in an academic bubble, tailoring our writing and presentations toward an audience of other scientists. Among scientists, it’s standard practice to find the weaknesses in every argument, to see where the authors of a paper may have made a logical leap that their evidence doesn’t support. Scientists often struggle to compartmentalize this way of thinking, and it leaks into daily life. When a conversation veers toward the scientific, it becomes difficult to refrain from immediately pouncing on fallacies and generally acting like an insufferable know-it-all. Most people are unfamiliar with the constant skepticism inherent to scientific discourse, so it comes across as condescension, or worse, antagonism.
In scientific conversation, we try not to take it personally when our peers point out flaws in our thinking. As we advance in our careers, we increasingly appreciate trusted colleagues’ critiques. Early on, though, it hurts. It’s deeply uncomfortable to challenge the things we believe, in the context of science and elsewhere. Somewhere along the line, most scientists develop immunity to this discomfort, and so forget that they might be inciting it in others.
Scientists’ know-it-all attitudes are exacerbated by the common classification of scientists as arbiters of objective truth, and herein lies the second disconnect in science communication. Science is usually communicated to the public without nuance, creating the impression that scientists are like human fact-generators.
Really, scientists are more like detectives, gathering evidence, developing theories, and identifying potential suspects. Most importantly, scientists often disagree with each other, challenging each other’s findings and gradually meandering toward the truth. Despite the dynamic nature of the scientific process, any hint of dissent among the scientific community undermines credibility in the public eye due to the perception that scientists ought to be producing concrete facts like those in textbooks. It’s a perception that some scientists internalize, occasionally mistaking their interpretation of the evidence for fact. The certainty these scientists hold that they have the most valid interpretation prevents them from considering why non-scientists hold diverse beliefs in the first place.
We all tend to hang onto whatever angle of an issue makes the most sense to us, scientists and non-scientists alike. We’re evolutionarily hard-wired to learn from our experiences and act based on them, even when they aren’t representative of the experiences of an entire population.
For example, a person might feel persistently ill, then go on a low-gluten diet and see a marked improvement in their condition. There are many possible interpretations of this evidence: it could be a result of the placebo effect, it could be an overall healthier diet that incorporates fewer complex sugars, or it could be a form of gluten intolerance. Someone who has seen reports about the evils of gluten in the media might choose to interpret their condition as a gluten intolerance, which is totally understandable regardless of whether their diagnosis is correct. However, they should be equipped with the ability to assess the evidence behind all possibilities and come to an informed conclusion, rather than being forced to rely on whatever explanation is sitting in front of them. A gluten-free diet is at worst a personal inconvenience, but the importance of informed decision-making grows by orders of magnitude as issues start to have a greater impact on our lives and the people around us. As stakes increase, the tools that empower the public to take scientific decisions into their own hands become a necessity. The question is: how?
As scientists, a first step is to share our work with the public at every stage, even if it does not directly apply to everyday life. As it is, science is opaque, yielding neatly packaged stories that we ask the world to trust unquestioningly. But scientific evidence is rarely neat, and by glossing over failed experiments and complex results, we’re omitting important information, information that helps us rationalize our interpretations. Instead of giving people facts that are free of nuance, scientists need to be transparent about the process. Yes, we can tell people to read the studies and come to their own conclusions, but I’m a scientist and even I can’t make sense of upwards of 50% of the scientific literature. Rather than leaving the general public to struggle with opaque papers, we should make both our findings and our process public in a digestible format. Doing science is difficult, long, messy, and beautiful, a struggle to test every alternative and find evidence that will lead (a bit closer) to the truth. Understanding that process is vital to understanding why scientists believe their results.
We need to make ourselves available on an interpersonal level, whether it’s by volunteering at the local science museum, holding an Ask Me Anything on Reddit, or chatting with an Uber driver. Any opportunity to engage with a non-scientist is a chance to help them understand the nature of the work we do. Yes, you will get the same questions over and over (in my case, it’s some variation on “do we really only use 10% of our brains?”), and you may be tempted to correct without explanation (no, we use the whole thing). Instead, try to discuss why we think what we do (fMRI studies, mostly), and to figure out where misconceptions might come from (it turns out that the idea that we aren’t using our minds to their full potential suggests a miracle solution to everyone’s problems). The worst thing we can do is crush people’s curiosity. The best thing we can do is provide them with the tools to investigate for themselves.
Of course, the duration of an Uber ride or a few hours spent on Reddit aren’t enough to miraculously give someone a full grasp of the scientific process or the ability to evaluate journal articles. There is only so much that scientists can do to empower the public to take science-related decisions into their own hands. This is where the other half of that metaphorical bridge comes in. In order to gain a more complete understanding of science, the public needs to learn a set of skills only taught in graduate school—graduate school, that is, and grade school history classes.
More on that next time, in Failure to Communicate, Part 2: Primary School. We’ll take a closer look at how the school system contributes to the communication gap, and how it can be changed in ways that enable the public to take metaphorical bridge-building into their own hands.