People acquire their scientific knowledge by consulting others who share their values and whom they therefore trust and understand. Usually, this strategy works just fine. We live in a science-communication environment richly stocked with accessible, consequential facts. As a result, groups with different values routinely converge on the best evidence for, say, the value of adding fluoride to water, or the harmlessness of mobile-phone radiation. The trouble starts when this communication environment fills up with toxic partisan meanings — ones that effectively announce that ‘if you are one of us, believe this; otherwise, we’ll know you are one of them’. In that situation, ordinary individuals’ lives will go better if their perceptions of societal risk conform with those of their group.
Yet when all citizens simultaneously follow this individually rational strategy of belief formation, their collective well-being will certainly suffer. Culturally polarized democracies are less likely to adopt policies that reflect the best available scientific evidence on matters — such as climate change — that profoundly affect their common interests.
Overcoming this dilemma requires collective strategies to protect the quality of the science-communication environment from the pollution of divisive cultural meanings. Psychology — along with anthropology, sociology, political science and economics — will play a part. But to apply the insights that social science has already given us, we will have to be smart enough to avoid reducing what we learn to catchy simplifications.