Technology, AI and ethics.

Society will come to some new understanding, new norms, and maybe even new technologies to help us move out of the current phase of distrust

An interview with Jeff Hancock, Professor in the Department of Communication at Stanford University

Alexander Görlach: The United States is headed into a presidential election. This opens up a huge spectrum of topics around social media and fake news.

Jeff Hancock: I worry that we have just gone through a bit of a moral panic about fake news. I think it’s really important that we have discussed it and been concerned with it. However, the research coming out over the last several months has shown that most fake news was targeting a very small part of the population, and an even smaller part of that population did the majority of the sharing of that fake news.

It was targeted at older people and people who lean slightly conservative. I think we may have been overly concerned, or concerned in a way that was partly useful, because now we are much more aware of it, but also partly problematic. To some degree, we have undermined trust in the media. That is the number one rule in the authoritarian handbook: undermining trust in other institutions. I am a little concerned about that, but I think the benefit is that we are now much more aware of being thoughtful about things we see on social media and online.

But I hope we have not overdone it to where people just don’t trust anything that the media says and are unwilling to hold governments accountable through the media.

Alexander Görlach: How do these sentiments come together with new technology? It has always been part of the populist strategy to denounce elite journalists as enemies of the people. There are arguments around the globe that this is much easier to do with technology now than it was in the past. Is this assessment correct?

Jeff Hancock: I think the decline in trust in elites and expertise has been driven more by a populist movement. A good number of people feel they have been left behind by the world order. I think that is much more important than technology.

However, technology has made it so that everybody can have some kind of voice. This is the obvious thing, and it was sort of the goal of Internet technologies. I think we are in a bit of a flux phase now, where everybody has the capability to say anything they want to almost any audience. We’re learning how to make sense of that new world.

One of the things I’m often struck by is that we have a very strong presentism bias. We often think that everything is changing and very bad right now; how could it get better? But we’ve seen these disruptions many times before, going back to the printing press, which was super disruptive and really did change how society was organized. We saw it with yellow journalism, which led to the rise of journalistic practices. And I think we are seeing it again. I’m not saying it’s not disruptive, but I think we as a society will come to some new understanding, new norms, and maybe even new technologies to help us move out of the current phase of distrust.

Alexander Görlach: It is interesting what you said about people feeling that they have been left behind. But suppose I respond that Steven Pinker, for instance, has said the world has never been as good as it is now, or point to how many people in China have been lifted out of poverty in the last 25 years. In these times when elites and journalists are denounced, the facts are out there and we have interpretations of those facts; that is what communication is about. Why can we not find a consensus anymore on the easiest topics, such as vaccinating our children?

Jeff Hancock: One of my concerns is that, because these various things are happening at the same time, we confuse correlation with causation. The anti-vaccine movement is astonishing, and I think that when we look a little deeper into it, it does not come from ignorance or from polarization, which I think is a much bigger issue than technology.

When we look at the anti-vaccine movement, there’s this new form of parenting in which children are the obsession and parents want to give them everything possible. It comes from fairly well-educated people who have a lot of resources and put a lot of attention on their kids. So that’s a really unusual thing. And yes, I think the Internet plays a role there, because they can find information that fits that parenting style, but it’s very different from, say, refusing to believe what a Republican thinks because you’re a Democrat, or vice versa. That, I think, is polarization and an increased sense of tribalism.

My colleague Shanto Iyengar has a study pointing out that in the 1960s, if you asked people whether they would let their child marry someone of another race, almost all of them would say, ‘No, I would feel very uncomfortable with that’. The good news is that now the reverse is true, and most people would be fine with their child marrying someone of another race. But if you ask whether they would be okay with their child marrying someone of the opposite political party, you see the same levels: over 80% would say they are really uncomfortable with that.

So I think we are seeing polarization play a really big part on political issues, but there are other things going on, like this different parenting approach and style, that can be driving something else. Your point that technology plays a role in all of those is fair. I think it’s possible that we can now find confirming kinds of information, but there are lots of dynamics at play happening at the same time.

Alexander Görlach: When you look at grouping, in terms of what we have seen in the last fifteen years, you get different outcomes. For instance, if you grow up gay in a small conservative town, you can now realize you are not alone. We see new groups and new forms of belonging. But on the other hand, if you were the only Neo-Nazi in a small village, people would say you were an idiot and you might refrain from what you were doing. Nowadays, this is no longer the case. Is there a new quality to that?

Jeff Hancock: This is one of the things about technology that I find the most regrettable. There are probably very few people in Palo Alto, for instance, who are Nazi sympathizers, but they no longer need to be connected physically in order to communicate. That is one of the central challenges, and it brings up a topic that I think is incredibly important.

My colleague Tarleton Gillespie has raised the issue of content moderation. There are tens of thousands of content moderators who are asked to look at things like Nazi content, racist content, anti-Semitic hate and violence, and so on. It has been kept in the margins and done quietly. His argument is that deciding what is allowable in our community and what is not is actually central to what it means to be a society. He has argued that we need to make content moderation much more public and much more central.

Alexander Görlach: I am interested in your outlook. We have spent the last three years looking at things like election interference, Cambridge Analytica, and Facebook. Have we learned enough to avoid a repeat of that situation, or at least to improve it?

Jeff Hancock: I’m optimistic because I’m Canadian, but my view is that there are as many resources paying attention to this as I could ever have hoped for, from academia, tech companies, and the media. I think the only group not supportive is the Trump administration, but many other agencies within the US government, and lots of other countries, are fully on board. This will be an ongoing and evolving form of information warfare.

I don’t think it’s a battle that will be decisively won so that we never have to worry about it again, but there are enough resources and assets being thrown at this problem that I would like to think they will be a match for the worst and most egregious forms of information warfare.

Jeff Hancock

Jeff Hancock is a Professor in the Department of Communication at Stanford University and Founding Director of the Stanford Social Media Lab. His work is concerned with the psychology of technology, especially social media and AI, and how technology affects the way we think, feel, talk and relate to one another.
