Published September 1, 2021 at 10:00 a.m.
When Barbara Hofer and Gale Sinatra submitted the first draft of their book Science Denial: Why It Happens and What to Do About It to their publisher in February 2020, the number of known COVID-19 cases in the U.S. was fewer than 100. By the time their book was published, in July 2021, more than 600,000 Americans had died from the virus.
Hofer and Sinatra, both research psychologists, know that many of those deaths were preventable. Had more Americans been adept at evaluating what they knew, what they believed, and how and where they got information to support their views, they might have chosen masking, social distancing and vaccinations and could have vastly improved their odds against the virus.
Hofer, a professor emerita of psychology at Middlebury College, and Sinatra, a professor of psychology and education at the University of Southern California's Rossier School of Education, never claim in their book that science is perfect, infallible or a panacea for all of the world's ills.
But, as the events of the last year and a half have demonstrated with tragic clarity, America's dearth of scientific literacy and critical-thinking skills has reached dangerous levels. When that deficiency is combined with the proliferation of online misinformation and disinformation and social media algorithms that reinforce ingrained worldviews, the consequences can be fatal.
For more than two decades, Hofer and Sinatra have been researching and writing about science, scientific literacy, and how humans think and acquire knowledge. They have coauthored several articles on those subjects in peer-reviewed journals aimed primarily at other psychologists. But, as Hofer explained in a recent phone interview, it's never been more important to reach a broader audience with their message about countering science denial.
SEVEN DAYS: Humans have always been wary of things they don't fully understand. Has the problem of science denial gotten worse? Or are the stakes just so much higher today?
BARBARA HOFER: Both. In the book, we trace the history of science denial going back to Galileo and why people put him under house arrest for ideas that seemed so radical at the time. Think about Charles Darwin and how long it took for people to accept the idea of evolution. Even today, a large percentage of the population doesn't accept it. But the problem has definitely gotten worse, and the stakes are definitely higher than ever.
When we started writing this book a few years ago, I don't think we ever could have imagined that, by the time it came out, we would be seeing how deadly science denial can be. There are people who are denying to the grave what's going on, people on life support who are saying, "This couldn't possibly be COVID because it doesn't exist." That's just terribly disturbing. And here we are with climate change, at a point where we have to act now to make this work globally. Yet there are many people who still find that science problematic.
SD: There are current members of Congress who believe that 9/11 was a hoax and that a satanic cult operated an international pedophilia ring out of a pizza parlor. This feels like more than just a lack of critical-thinking skills. Is something more insidious at work?
BH: There are psychological reasons for why these kinds of ideas arise and for how they get amplified with the internet and social media. But we didn't want to make this an us-and-them issue. We also wanted to explain how we are all susceptible to some of these psychological tendencies.
Think about how some people are really troubled by the idea of eating genetically modified organisms, in spite of the lack of evidence that GMOs are unhealthy. Now, there are many other reasons why GMOs are problematic, as many Vermonters know well. Just as there are parents who don't vaccinate their kids because of their beliefs about natural parenting and not putting anything foreign in their kid's body.
This is not just a political issue. But many of these tendencies have been amplified by politicians and corporations that take advantage of the public's poor scientific literacy. The book Merchants of Doubt[: How a Handful of Scientists Obscured the Truth on Issues From Tobacco Smoke to Climate Change] talks about how the oil and tobacco industries figured out how to prey upon susceptibilities we all have to [get us to] question and mistrust science.
SD: Are some people more inclined to believe outlandish conspiracy theories than simpler and more reasonable explanations?
BH: I don't know if some people are more inclined than others, but certainly some people are more swayed by the psychological impulses. Or they're more invested in their social identity, which is a huge part of what's going on right now: "This is what my tribe believes. This is what my leaders believe. Therefore, I have to believe it."
We heard stories recently about people in Missouri who decided to get vaccinated in spite of their tribal identity as anti-vaxxers, and they were wearing disguises to the vaccination sites. We're all tribal people. That's part of being human.
SD: In the age of "alternative facts," how do we address science denial when people can't seem to agree on what is real?
BH: It's an enormous challenge. Some of this goes back to schooling and building scientific literacy. We have to help people develop what we call a "scientific attitude": What is a fact? How is it used? What evidence supports the fact I believe in? And is it accurate and correct? Those are fundamental tenets of science. We need to help people learn how to be open to new facts and be willing to change their minds in light of new evidence. It's a way of thinking and knowing that's critical for us to mentor in others, whether it's in the workplace or at school or at home with our children.
We also have to teach and model this with our kids so they have the courage to say, "Oh, look! I changed my mind about that," because scientists do this all the time.
SD: How do you recommend starting conversations with science skeptics?
BH: Having warm, honest and open conversations is key. And being a good listener. It would be very easy for me to hear a relative say, "Well, I'm just not going to get vaccinated" and be angry or just offer them a bunch of facts. That's pointless and won't change their mind.
I gave a public talk in Vermont on science denial a month before COVID hit. I had a person who contacted me afterwards who said, "I'd really like to meet with you. I'm the kind of person you described. I don't believe in a lot of things you talked about, and I'd like to explain why." And we had a lovely conversation over coffee about what blocked his acceptance of climate-change science and why he was so resistant to it.
SD: What was your approach?
BH: I tried to find common ground: Where are we connected? What values do we share? Because we're about the same age, I brought up the issue of grandchildren, and we each talked enthusiastically about our grandkids. And he softened. I said, "What about leaving them a healthy planet? I'm so worried about that." And he said, "I am, too."
And then he drilled down into the economics of it: "How are we going to fix this, and what's it going to cost?" There were deeper issues that I never would have gotten to if I had just written him off and thought, He's a climate-change denier, and I'm not, so what could we possibly have in common? We have a lot in common with everyone. We just have to find that place of commonality and work from there.
SD: Many of us know about confirmation bias, or the tendency to trust information that aligns with one's preexisting beliefs. With the internet and social media algorithms amplifying the problem, how do we check confirmation bias in ourselves?
BH: In his book Thinking, Fast and Slow, Daniel Kahneman talks about system-one and system-two thinking. System one is this quick, intuitive way of thinking. It's our gut-level response. System two is more reflective, analytical and logical. If you're making a quick judgment when you're driving about whether to brake or not, hurrah for system one. But if someone says, "Hydroxychloroquine will fix all your problems with COVID," you really want to slow down and think about what that means and whether there's evidence to support it.
Big Tech has a lot of responsibility here, as well, because social media is really problematic in reinforcing our confirmation bias. But I have been impressed by some of the changes that have already been made. For example, if you try to retweet something on Twitter that you haven't opened, you now get an automated prompt that asks, "Would you like to read this before you forward it?" I've mentioned this to a number of people lately who said that message made them stop and think.
So we do not want to put all this responsibility on individuals. We need to resolve some of what's gone wrong, notably, with Facebook and why it's such a place for misinformation, and how we get it to better conform with evidence-based thinking.
This interview has been edited and condensed for clarity and length.
The original print version of this article was headlined "Truth Decay | New book examines how to inoculate ourselves against the epidemic of science denial"