
Identity | Dan Kahan

Dan Kahan's identity-related work has focused on “identity protective cognition”, which refers to the tendency of individuals to unconsciously dismiss evidence that does not reflect the beliefs that predominate in their group. This is also sometimes called “motivated reasoning”. More broadly, he studies various topics related to risk perceptions and science communication in the context of decision-making. You can watch this short video or download the full interview transcript below.

"It’s really an important thing to study how the different kinds of reasoning styles interact. Is it the case that people who have a cultural world view have different attitudes about science when they’re in their science use domain? I think everybody has that to some extent."

Dan Kahan, Professor of Law and Psychology, Yale University

2017 Interview Highlights:

For the Cultural Cognition project that you’re doing, how does the measurement of cultural cognition interface with what constitutes identity?
I have a pretty broad understanding of the concept of identity. It’s any kind of affinity group in which the members are connected, unconsciously as often as consciously, by their adherence to some sort of distinctive set of values that then will systematically shape the way that they assess information about science.

And what kind of questions do you use to measure identity?
It’s really a two-dimensional scheme. One of the dimensions is: how individualistic or communitarian are you? And the other is hierarchy-egalitarianism: how do you respond to different kinds of authority?

How is curiosity related to the concept of a STEM identity, and why is that important?
Lots of people are only modest in their abilities to comprehend scientific information but still have very high levels of science curiosity. There is something called the principle of perplexity, a kind of conservation of complexity: every time you think you’ve gotten some data that helps you answer one question, you realize you have at least one more question that you don’t have any answer to, and so you have to keep going. I’m not sure what the relationship is between science curiosity and going into STEM programs. But if curiosity does motivate people to do that, then they’re going to have this resistance. But it might be just because they’re curious people, not because they happen to be scientists.

What do you think about situations when people choose to use a particular identity, and what factors make them likely to draw on one identity versus another?
Well, people do different things with scientific information. One thing they might do is cherry pick it or read it in ways that support the kinds of factual claims that have become identified with their cultural view. But at least some of those people may do something else in their life in which they use science to guide them in making accurate decisions. There is literature on this. There are scientists who don’t believe in evolution but still study it or use parts of the evolutionary theory in their work. Maybe they’re doctors looking at things like genetic propensity to diseases. In effect—and this is work by Salman Hameed—they think one way about the theory of evolution when they’re at work, where they use it to be good doctors, and another way about it at home, being good Muslims, where that role is enabled by a certain kind of skepticism.

Can you talk about how sometimes identity and science knowledge interact in a way where they have the opposite effect from what you might think?
A perfectly plausible hypothesis would be that the reason we have disputes and confusion and conflicts about risks like climate change is that people don’t know a lot of science and don’t think the way that scientists do. Members of the public tend to think of things in a rapid, intuitive, largely unconscious way; it’s the fast way of thinking that Daniel Kahneman describes as “System 1.” In contrast, scientists are more analytic, deliberate, and conscious. They use the slow kind of reasoning that Kahneman describes as “System 2.” You might think that people who use System 2 understand the evidence better and can reason about it, so they’re going to basically go with the evidence, whereas the people who don’t understand how to think that way are likely to rely on heuristics like “What do my friends think?” and become polarized.

But we have done, as have others, lots of studies that show that the people who are the most polarized are also the most cognitively proficient—the ones who are most likely to be using System 2. At that point, you have to reevaluate what you thought was going on when people were forming these identity-expressive or identity-protective kinds of risk perceptions. They’re not making a mistake. The problem isn’t that they’re irrational; the problem is that they’re too rational. They’re engaging with this information in a way that’s most relevant to their lives, and nobody, no individual member of the public, can do anything about climate change. But if they make a mistake in their own community, given that climate change has now become a symbol of group loyalty, they could be in a lot of trouble. It’s a kind of System 2 motivated reasoning. That’s something we’ve been studying about identity, including with experiments that catch people in the act of using these remarkable and beautiful capacities to reason well in a way that perpetuates their belief in the position that’s dominant in their group.

Download full interview
