Technology, AI and ethics.

“It’s not about writing some new code or a new piece of policy. Change also involves shifts in human behavior.”


Interview with Deb Roy, Associate Professor at MIT

Alexander Görlach: We have been seeing a lot of discussion lately about the extent to which social media platforms such as Twitter are responsible for opinion silos and the fragmentation of society. What do you make of this claim in general?

Deb Roy: We can generalize the category to other social media platforms: Facebook, Instagram, etc. My belief is that they are a causal factor. They are not the whole story, but they are an important part of the story. They interact with other, more important parts of the narrative, and those interactions make it difficult to disentangle social media’s exact role. My own experience analyzing social media data and its interaction with news, as well as my own lived experience and common sense, suggests that social media is playing a crucial role, for better or for worse, in what is happening. It is simply changing, in a significant way, how we communicate with one another and understand each other.

There are new mediators and new voices that have gained prominence in explaining to us what is happening in the world around us. In some cases, the change social media has brought to our lives is dramatic. How can it not affect our politics and our culture? Everything that permeates through society is going to be affected.

Alexander Görlach: Technology always changes society; this is a given. So let’s try to make sense of it. Perhaps only a quarter of a century ago, there were opinion silos too. Someone may have read only a conservative or left-leaning newspaper, belonged to a union, or been highly active in a tight-knit church group. Your social circles were very confined, just as today’s digital habitats are.

Can we compare these two circumstances?

Deb Roy: A key question here is: to what degree are we self-sorting? Another, more loaded way to put it: to what degree are we self-segregating? Here the word ‘self’ is key because, certainly in an American context, ‘segregation’ is a historically loaded term that alludes to the darkest elements of United States history. This idea of ‘self-segregation’, self-sorting, cocooning, goes by many different names.

There has always been some friction, and there is a very natural instinct among people to self-sort. Why? We want to feel like we belong; we want to feel like we are with others who share our values and who appreciate us.

So there are a lot of natural and good reasons to want to self-sort. It makes total sense for us to want a sense of belonging. Even being loyal to a sports team, for instance, is considered pretty healthy unless you start attacking the other team’s fans. What has changed in the last 25 years? The technology that lets us self-sort in ways that were physically impossible before the Internet is now upon us. You can’t fully control the politics and cultural interests of your neighbor or your colleague at work. There were various ways we would brush up against others and have a certain level of interaction that cut across ideological or cultural divides.

Alexander Görlach: We read a lot about things being fast-forwarded by digitalization, globalization, and the current axiomatic shift. Were we just more used to coping with one another before the technological shift, in the ‘good old days’?

Deb Roy: Well, two things. I would not use the phrase ‘good old days’, because there was a lot that was problematic in the ‘old days’. However, I think something important has changed. It’s a mistake to say it’s simply new technology with the same old social interactions. Yes, the medium has changed, but these are more than surface-level changes that we need to adapt to. There is a change in deep structure.

More and more is becoming automated. The digital age has AI-powered connection-making capabilities that determine which messages and connections end up on your radar. They are tuned first and foremost to your expressed interests. This is machine-assisted self-sorting, and it increasingly allows you to simply tune out and not be exposed to contrasting viewpoints. Learning to deal with other viewpoints does not mean that you have to accept them, appreciate them, or change your own.

However, if you significantly decrease your exposure to those viewpoints altogether, there is less to adapt to. That sounds to me like more than a surface-level shift. It goes beyond the physical.

Alexander Görlach: Since we have things catered to us, we may be losing our decision-making abilities. Are we unlearning how to function in the social sphere, which involves learning how to balance different points of view?

Deb Roy: Sure, and as usual, since the change is being driven by technology, the shifts in technology race ahead of human understanding. But we humans do tend to catch up with the impact of our technology. There may be a gap, and in my opinion we are currently in one of those gaps, where technology has had personal and societal impacts that we are just now beginning to make sense of and understand.

We have to ask ourselves: What is happening? What have we lost? What have we gained? What do we think of that balance? If we don’t like the balance how can we make further changes and try to re-negotiate that balance?

Say some segments of the population miss having those cross-cutting views, but the machinery we have built does not optimize for that. We can change that in various ways. If there is a demand, the market will meet it with new ways to connect. Some existing communication platforms may also shift in order to meet that demand.

There is a lot of conversation and activity about regulation from governments. Germany is trying some pretty strong things with Facebook. There is conversation happening in the United States which is likely to lead to regulatory experiments. They are experimental because there are still a lot of open questions about how to even regulate this new technology.

Back to the keyword ‘self’, and self-regulation. Will individual people opt out or start using certain platforms differently? Will we, as groups, make decisions about how we do and don’t want to communicate and interact? My guess is yes. We are not the only ones having this conversation; there is a growing awareness that something is not working.

Alexander Görlach: The new data-driven economy depends on us consciously or unconsciously contributing our data. Would that be put on hold?

Deb Roy: I think we need to experiment. Every time the EU puts out a new constraint or policy for how the data-rich companies may or may not use their data, that is an experiment. The world is watching. When Germany puts new regulations in effect regarding hate speech on Facebook, the rest of the world is watching to see if it works and makes things better and healthier. Is this a culture-dependent solution? Do we copy and paste the solution?

Alexander Görlach: Yes, if one society creates a blueprint, you can easily roll it out to other cultures and countries. One more thing I want to discuss is incidents like Cambridge Analytica. With these new technologies, data can enable abuses of power. Is there any chance of overcoming this in, say, a second Brexit referendum or the next United States presidential election?

What have we learned in the past few years, and are we ready to confront and battle those perpetrators on the web?

Deb Roy: Cambridge Analytica was a wake-up moment for many people. A lot of people who understand how this industry works were not so surprised by Cambridge Analytica itself, but were surprised by the scale of the public reaction to it. What exactly was so attention-worthy here? The answer: it was new and stunning for a large part of the population, who now understood what the industry’s downsides and threats are.

Can I imagine doing things differently? Yes; I might say that I am an optimist. People who build technologies will start to compare them with what we value and continue to strive for improvement. That means we will continue to try new things and learn lessons from what is working and what is not. This isn’t the end. We aren’t stuck.

The technologies we design obviously have incredible power and impact, which tells me both to be careful and to understand the implications. But it also tells me that if we design new technologies, and new understandings of how to use them, we can continue to change the environment that we all live in.

The way I think about timelines is this: if you just focus on the impact of digital media, that took only one human generation. The Web is thirty years old. It’s hard for me to see how you effect deep change, to counteract something that took thirty years, in less than two years. You have to be realistic. It’s not about writing some new code or a new piece of policy. Change also involves shifts in human behavior. It has to permeate through human social networks. Shifts of that kind take longer.

On the other hand, you could say that thirty years is the blink of an eye. Look how different our mediated world is today than it was thirty years ago. I would argue that there has been a deep structural change that happened incredibly fast. I don’t think we will have to wait forever to see shifts happen again.

The old saying, “The future is already here, it’s just not evenly distributed,” has some truth to it. We should look at the experiments that are happening in policy and at how people, as groups, are deciding to shift their behavior around communication technologies. As we survey the landscape, we will already see pockets of change happening. Maybe we can catch little glimpses of the future. We won’t see wholesale change in two years, though; that is too short a timeline.

Deb Roy

Deb Roy is an Associate Professor at MIT, where he directs the Laboratory for Social Machines (LSM) based at the Media Lab. His lab conducts research in applied machine learning and human-machine interaction, with applications in children’s learning, social listening, and understanding large-scale media ecosystems. Roy is also co-founder and Chairman of Cortico, a social venture that develops scalable media technologies and services to foster a healthy public sphere. He was co-founder and CEO of Bluefin Labs, a media analytics company that analyzed the interactions between television and social media at scale; Bluefin was acquired by Twitter in 2013, Twitter’s largest acquisition at the time. From 2013 to 2017, Roy served as Twitter’s Chief Media Scientist. The author of over 150 academic papers, he is known for his popular TED talk Birth of a Word, which presents his research on his son’s language development and led to new ideas in media analytics. A native of Canada, Roy received his Bachelor of Applied Science from the University of Waterloo and his PhD in Media Arts and Sciences from MIT.
