Interview with Richard Gingras, Vice President of News at Google
Alexander Görlach: As you know, ConditioHumana.io is about technology but with a humanist approach. This relates well to your work in journalism, media, and ethics.
My first question is, with your work at Google and beyond, what is your biggest finding or most interesting assessment about ethics?
Richard Gingras: First of all, congrats on the magazine! I think it’s a great idea; the impacts of these evolving technologies are significant and far-reaching. They go far beyond the more visible players like the Googles of the world. They’re in our medical industries; they’re in our financial industries. The technology is pervasive, and it’s crucial that we not only recognize those impacts but also look to make sure these technologies are properly applied with the broad interest in mind.
Why do I work at Google? I love the company, but I have been working in the digital media space since 1979, that’s 40 years. I’ve long been fascinated with the evolution of technology and media ecosystems. My efforts make sense to me because our platform is the open environment of the web and Internet.
Among the many challenges I feel we face: How do we properly evolve those ecosystems, particularly given the difficulties before us? How do we preserve an open ecosystem of access and publishing? Free expression can present its own challenges and problems. The very openness of these markets can challenge those who held far more powerful positions in the relatively constrained distribution environments of the past.
How do we balance the evolution of technologies like machine learning and artificial intelligence to make sure they deliver true and deep benefits to our societies rather than becoming rife with nefarious efforts?
Even folks in the technology space I think often have an inflated sense of what they’re trying to do, sometimes appropriate and sometimes not. Your technology has value, but it doesn’t have values.
Alexander Görlach: Previously you have said that the most inspiring and invigorating aspect of tech and media over the past 40 years is the accessibility of the ecosystem. The markets have opened tremendously. In this market, there must be rules and there must be values. That’s already a value in itself. There has been a debate between Apple and Google about having a closed internet or an open source internet. This is a struggle about values, isn’t it?
Richard Gingras: It is, it’s a struggle about values and norms. In a sense, one of the challenges of living in a democracy is: how do democracies survive and thrive in an environment of unfettered free expression? I live in the United States. We’ve got the First Amendment, which is as strict a codification of free expression as we have. I am a board member of the First Amendment Coalition. So I don’t say that to suggest free expression should be constrained, but to suggest that institutions, from tech companies to governments, politicians, and the media ecosystem, need to be cognizant of what norms we are trying to establish.
There’s this interesting metaphor that somebody raised recently that I find incredibly poignant: is the internet to America’s First Amendment what the AK-47 is to the Second Amendment? The point being, we have laws that allow broad free expression. We have laws that allow people to own guns in most parts of the world, right? But then the question becomes: how do we evolve the societal norms that allow society to balance what is allowed against what one should actually do? Not all of these things can be specifically regulated and controlled by law, nor do we want them to be.
That’s the big problem with free expression: people say, well, let’s look to see how we can control that. But as you know, that’s enormously tricky and has secondary consequences. I think it has enormous ethical consequences in terms of how our societies work, because who decides what expression should be allowed?
Alexander Görlach: There are plenty of controversies coming from the offline space into the online space, such as Facebook wanting you to use your real name rather than remain anonymous, because that has implications for how you behave. There is an increasing machinery of algorithms that suggests things to me based on aggregated data. This is triggering new questions about new levels of ethical endeavor.
Richard Gingras: I think it’s clear there are no simple answers and there are no binary answers. In fact, this is one of my bigger concerns as I travel the world on these matters, because part of my work is speaking with public policy people. I think probably the most significant damage of the whole notion of ‘fake news’ is the prospect of laws, many well-intended, some less so, that have secondary consequences impacting free expression and open markets. These are very tricky questions. So how do we manage that? How do we manage what we can clearly see is nefarious behavior in terms of the inappropriate use of data? How do we advance better approaches? The answer to how we manage data may or may not be regulatory. How do we drive broader adoption and awareness with regard to algorithmic and machine learning approaches?
There is a set of three core principles that algorithmic players should abide by. In this case I’m speaking very specifically about things like personalization: is the user aware of what is being personalized for them? Does the user have an opportunity to control the degree of personalization? Are there methods for third parties to assess how these systems are performing? I talk about the algorithmic efforts of search engines to say that it’s important we be clear about our principles, and as clear about our methods as security allows. There you get into combinations of potential regulation and questions of how we drive norms and set examples of approaches that others can follow.
Alexander Görlach: The technological advances of the last 10 or 20 years have led to a crisis of the political system. The critique of democracies is partly driven by media biases. To me this is a bit of a ‘chicken and egg’ problem that I want to disentangle.
Media reflects the diversity of societies. People who like monolithic ideas of society will never really like the idea of a pluralist media. There is an intrinsic mechanism in technology and media that seems to be fostering polarization in society to one degree or another.
Richard Gingras: As humans, we have this intellectual weakness of preferring affirmation to information. If there is a truth about the internet, it is that it has provided such a vast array of sources of expression that it is very easy to find information that will support whatever you believe, from the most heinous to the most angelic. You will find it online. This isn’t just the result of nefarious players. This is to some extent the result of, as the MIT studies pointed out, partisan content and a partisan press. No one can settle on what makes good journalism, but it is up to these institutions to decide how we drive a reconsideration of practices, approaches, and norms.
My favorite definition of journalism is to give citizens tools and information to be good citizens. Help them understand how to think, not tell them what to think. Journalism in many ways is not doing that today.
Alexander Görlach: Why do you think that is? What is different compared to analogue times? You can read one newspaper that is more conservative or another that is more left-leaning, so that choice already existed.
Richard Gingras: In many ways there is nothing new here. We’ve had a partisan press forever. For instance, in the United States I can look to the Federalist Papers. It’s nothing new. What is new is the breadth of free expression that the internet enables and the opportunity to amplify. These issues were issues before; just look at the history of tabloid newspapers in relation to politics.
Alexander Görlach: The German tabloid Die Bild used to put question marks at the end of its headlines when it knew it was just messing with the facts. You used to have large disagreements between different political spheres about how to interpret the facts, but they would still adhere to the same set of facts. Now, however, you have people against vaccines and the Flat Earth Society. To me that is a key point of difference.
Richard Gingras: That goes back to the point that the internet allows you to find the most extreme views. It has made the problem worse and created both the opportunity and the reality of silos of alternative fact-based realities around an issue.
If we want our societies to work, if we don’t want them to drift toward winner-take-all, then I think it is up to our institutions, including the political space, to step up and say that we cannot operate that way, whether that is realistic or not. We have to think: how do we evolve these approaches? Sometimes I think I am naive saying that, but that doesn’t mean we can’t strive towards it.
How do we evolve models of journalism that can give people the information they need to understand the issues and form an opinion? One of the challenges with journalism is that it focuses on anomalous events. Something bad happened, and it’s something that doesn’t happen often. But as you know, it’s not hard to use that information to incite fears. It’s not hard to drive perceptions beyond reality just based on disproportionate news coverage.
Alexander Görlach: There was a Swedish doctor named Hans Rosling who discussed this in his book Factfulness. He explained that we seem to be drawn much more to the negativity of an event than to the positive, for intrinsic, evolutionary reasons. A negative event is a call to action to ensure safety.
Richard Gingras: I once listened to a science radio podcast about the whole notion of predatory risks, explaining why you don’t sleep as well while you’re traveling in hotels because all the cues in your environment say it’s unfamiliar and could be dangerous. It’s innate and understandable, but if we want to be advanced human beings we have to find a way to temper that. We have to learn how to tell ourselves, “I’m just in a hotel and hearing New York street sounds that I never hear in Los Altos”.
Alexander Görlach: You were stressing freedom of expression, and I find this quite compelling. Of course, a lot of people are currently discussing the idea of politically correct terminology. It’s important, in universities especially, to make sure there isn’t language that’s insulting or derogatory.
Richard Gingras: With ‘political correctness’, where did the term even come from? It probably came from people who disagreed with its core principles. There’s an understandable core principle and norm that asks, “How do we treat others from different cultures and genders?”
Some say there should be hard rules about what to say; rather, we should be addressing how to think. How do we all become part of a solution? It’s a nuance between telling people what to do and going back to core principles about how societies work.
Thank you Richard Gingras!