In 2004, Naomi Oreskes led an analysis of 928 papers on “global climate change” published in journals between 1993 and 2003, and she found that none of them disagreed with the consensus view that humans are causing global warming. Despite this finding, many Americans have been confused or doubtful about the scientific consensus on human-caused climate change over the years. Oreskes co-wrote Merchants of Doubt and The Collapse of Western Civilization, about the well-funded campaign against climate science and the catastrophic consequences of our own inaction, respectively. Her new book, Why Trust Science?, presents a case for the reliability of scientific consensus in a society where trust in institutions has fallen. She is a professor of the history of science and affiliated professor of Earth and planetary sciences at Harvard University.
The Saturday Evening Post: You talk about how our trust in science should be based less on the scientific method and more on the idea of consensus among scientists. Should we be teaching that in school?
Oreskes: I think — and there’s a large body of literature to support this — that the process of vetting claims is equally if not more important, because that’s, in a sense, where the epistemological rubber hits the road. Even if a scientist did follow the scientific method — had a hypothesis, did an experiment, the experiment worked out — there are so many possibilities for ways it could be wrong. Most egregiously, the scientist could be a fraud, but there are also a lot of less egregious things that could happen.
How do we know that a claim is legitimate? That’s where the rest of the scientific community comes in. A scientific claim is not accepted as scientific knowledge until it’s gone through this process of vetting by the rest of the community, which, in many cases, can be quite a lot of people. It’s that process that takes us to the point where we can say we’ve looked at a claim closely from a lot of different perspectives and we’re confident that it’s right.
Post: There’s this idea that “scientists are always changing their minds,” particularly when it comes to claims about diet and health. Where is the consensus on that?
Oreskes: This is a really important issue. First of all, the consensus on diet is much more robust than a lot of people think. I try not to beat up on journalists too much, but they have a lot of apologizing to do in this area. There has been so much irresponsible reporting on the issue of nutrition. We actually have a pretty clear idea about food. If you look at the typical American diet, we know the average American eats much too much meat, and that makes it more likely for people to have cardiovascular disease, type II diabetes, and colorectal cancer. There are a bunch of other things it may also contribute to, but those are the clearest ones. There is a huge body of evidence to support that claim. Now, we don’t have good evidence to say that you have to give up meat completely, but we do have very good evidence to say that if the average American were to eat much less meat they would, from a statistical standpoint, be much healthier. That’s almost unchallenged in the scientific community.
However, there are people who challenge it, and the media makes a big fuss about the people who do. Sadly, some of those people are supported by the meat industry. This is what we saw a few weeks ago. This encourages the exact sort of confusion you mentioned. If you’re an average person reading the newspaper, it seems like “last week red meat was bad, now it’s good. Who the heck knows, so I may as well just keep eating it.” In this case, I think we can say that’s exactly what the industry wanted. They want to create confusion — we know this from the history of tobacco — because confusion favors the status quo. So if you want people to keep eating a lot of red meat or to keep smoking or to keep driving big cars, one of the easiest ways to do that is to just confuse people.
Post: What would you say are some basic responsibilities of the media in reporting scientific stories?
Oreskes: The most important one, from my work, is knowing that science isn’t ever based on just one study. So, even if that study had been completely kosher, and there’d been no industry bias, it would still behoove journalists to say “new study questions…” instead of “new study refutes conventional wisdom.” No one study can refute conventional wisdom because science just doesn’t work that way. Science is about bodies of evidence, bodies of data, ideally collected by many different people — to avoid bias and groupthink — ideally using different methods. Let’s say you had a study that was really methodologically robust and seemed to say “what we thought about X might not be right.” You should dig in and talk to other people in the field about how it was done and how likely it is that this study will really challenge our thinking. The vast majority of new studies don’t. The emphasis should be on that robust body of evidence.
Post: It feels as though there is a trend of distrust in science, given climate change skepticism and the anti-vaccine movement. Has trust in science wavered significantly lately, or does it only seem that way?
Oreskes: Mostly it only seems that way. We actually have quite good data from public opinion polls that show this, and, by and large, the vast majority of American people still trust in science. It is true that trust in experts of all kinds has declined since the 1960s, if you take the long view, but trust in science has actually fallen less than almost any other area. Trust in business, in government, in journalism has fallen dramatically since the ’60s. Relatively speaking, as a sector of society, scientists are doing well.
There are these conspicuous areas where we see the rejection of scientific findings by certain groups of people. It’s not a general distrust of science overall; it’s a rejection of findings in areas where people perceive that the findings of science conflict with their worldviews (political views, religious views, in some cases their economic interests). This is what sociologists call implicatory denial. We deny things because we don’t like their implications, and people do this in all aspects of their lives. What is special about our current period is the exploitation of implicatory denial. We now have organized networks of people who deliberately try to stoke doubt about climate change, evolutionary biology, and vaccination for political or economic reasons. This makes it difficult for scientists, because most of us aren’t interested in getting involved in a big, public, messy debate. But if you don’t get involved and explain to people what is going on, the American public will hear a lot of disinformation. When people hear something many times, they often will believe it’s true even if it’s completely false. Some of these groups know this.
Post: Do scientists have an obligation to act as their own “PR?”
Oreskes: I am sympathetic to scientists’ desire to do science, because that’s what they’re trained to do. A lot of scientists would prefer to be left alone to do their work. But I think we have to embrace a slightly different version of what constitutes “our work.” We have to accept that part of the work is explaining what we do to people.
Most science in America is funded by the American taxpayer. If we expect the taxpayers to pay for what we do, then we should also expect to spend some time explaining it, and explaining why it’s worthwhile and how the American people get their money’s worth many times over from scientific research. It’s in our own self-interest to do that. In addition, I think there’s a kind of social obligation, because if we don’t do that other people are quite happy to step in and create confusion. That leads to damaging results, like people failing to vaccinate their children and innocent children dying from preventable diseases.
Post: Should scientists be politicians?
Oreskes: By and large, no, because most scientists are not good politicians and most politicians are not good scientists. In general, no, scientists should be scientists. They should do the work for which they are trained and for which they have talent, but I think some adjustment of our conceptualization of that work to incorporate a greater component of communication and outreach would be in order.
Post: As far as climate science is concerned, do you notice a trend of public figures accepting that we face a climate crisis while offering solutions that fail to address the scale of that crisis?
Oreskes: I think that’s correct. You can think about denial as not simply being an on-or-off switch, but as being a sort of spectrum. A lot of the work I’ve done is on hardcore denial, people who are completely rejecting scientific findings and generally doing it for motivated reasons. There is also what we could call “soft denial.” We’re seeing this now with politicians who will propose solutions that are nowhere near good enough or ambitious enough to actually address what’s going on. It’s not as pernicious as the hard denial that I’ve written about, but it is still damaging.
If you propose something really ambitious, you can be dismissed as being “unrealistic.” I think people who propose things that are inadequate are being really unrealistic too. There’s a way in which people like to pretend that they’re the grown-ups in the room. A lot of the time, that’s incredibly unrealistic, right? They’re pretending they’re being realistic, but they’re actually not because they’re not addressing the severity of the problem.
Featured image by Kayana Szymczak and Why Trust Science? © 2019 Princeton University Press. All Rights Reserved.