by Jan-Felix Schneider
Upbeat, brilliant scales, groovy rhythms – I really love this jazz band from Brooklyn. I want to share this with my friends – so I like them on Facebook. But how much have I just revealed about myself? Only that I like that band? Since Cambridge Analytica, we know that I have revealed much more. Our like profile reveals our personality, sexual preferences, and socioeconomic status. Even though recent scandals as well as research findings have made this evident to a wider public, many people are still indifferent to privacy. But the data we leave behind when we shop, click, chat, browse, and binge reveals information not only about ourselves, but also about our relationships to other people. New algorithms can infer and exploit these relationships for even more effective manipulation, which erodes our democratic principle of independent voter preferences. To deal with this new manipulation potential, we need to develop new democratic institutions and cannot treat privacy as an ancillary responsibility.
One of the most important uses of big data is manipulation, or as others may call it: advertising and marketing. Many people have written about how this affects our individual experience online and how more data about someone creates more appealing products and digital services. While I, too, love my individual experience on Netflix and YouTube, and appreciate that personalization algorithms filter out irrelevant items for me, the knowledge about my preferences and emotional triggers can also be used for political targeting. It can be used to discourage voting, tailor campaign messages to appeal to single-issue voters, or sabotage political debate. This can be done through online advertising that appeals to the vulnerabilities and emotions of the recipient, or even through real-world interventions such as targeted door-to-door campaigning.
Although people spend more and more time in the online world and online manipulation is becoming more ubiquitous, a major source of our behavior and opinions is still our friends, families, and colleagues. In a famous experiment, psychologist Solomon Asch presented people with questions that had a very obvious answer, e.g. which of two lines is longer. He observed that people would frequently give the wrong answer if enough other people in the room also gave the wrong answer (which Asch had instructed them to do beforehand). This conformity experiment shows how strong the influence of others can be on our own behavior and opinions.
With modern big data analysis, one can exploit this peer influence quite effectively for manipulation, because we reveal not only personal information but also information about our relationships. Specifically, one can use personal data to create influence maps. These maps show how likely it is that one person who shows a certain behavior passes this behavior on to another person. The difficulty in deriving these maps is that we often only know when a person shows a certain behavior (likes a page on Facebook, watches a video on YouTube, or purchases a product on Amazon), but not who influenced that person. With some statistical modeling, we can derive relationships between people from these behavioral sequences. For example, if one person watches a series on Netflix and another person watches the same series a week later, the first person might have told the second about it. One such sequence proves little, but if we intersect it with sequences about new workout classes, recently published books, or purchases of new food items, the inference about the relationship becomes more robust. Additionally, one can take into account where a person lives, who they communicate with, and whether a behavior deviates from their routine – at large internet companies, most of that data is already present.
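To make the idea concrete, here is a minimal sketch of that time-lag heuristic. The data, the two-week window, and the evidence threshold are all hypothetical, and real systems would use far richer statistical models, but the principle is the same: pairs of people who repeatedly adopt the same behaviors shortly after one another become candidate influence edges.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical adoption log: behavior -> list of (person, day_of_adoption),
# e.g. who started watching which series, class, or book on which day.
adoption_log = {
    "series_A":  [("alice", 0), ("bob", 7), ("carol", 30)],
    "workout_X": [("alice", 2), ("bob", 6)],
    "book_Y":    [("alice", 10), ("dave", 11), ("bob", 15)],
}

def infer_influence_edges(log, max_lag_days=14, min_evidence=2):
    """Count how often one person adopts a behavior shortly before another.

    Pairs that repeat this pattern across several independent behaviors
    are kept as candidate influence edges (earlier adopter -> later adopter).
    """
    evidence = defaultdict(int)
    for behavior, events in log.items():
        events = sorted(events, key=lambda e: e[1])      # order by adoption day
        for (p1, t1), (p2, t2) in combinations(events, 2):
            if 0 < t2 - t1 <= max_lag_days:              # p1 may have influenced p2
                evidence[(p1, p2)] += 1
    return {edge: n for edge, n in evidence.items() if n >= min_evidence}

if __name__ == "__main__":
    for (src, dst), n in infer_influence_edges(adoption_log).items():
        print(f"{src} -> {dst}: supported by {n} behaviors")
```

In practice, these pairwise counts would then be combined with the location, communication, and routine data mentioned above to turn weak correlations into a fairly reliable influence map.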
These influence maps elevate manipulation to a new, severe level. First, with influence maps one can detect non-trivial influence dynamics and leverage them for personalized interventions. For example, one could identify influential people, i.e. people who act as role models for a large number of others, and optimize their promoter score in order to spread advertising or information quickly to a larger audience. Additionally, one can identify so-called social clusters. These are densely connected people, such as a group of friends. Because of conformity, these clusters are usually fairly stable in their behavior – however, if a number of people in a cluster adopt a new behavior, they draw the whole cluster with them, and even nearby clusters if they are well connected to them. This makes clusters even more powerful than individual influential nodes, and one can focus targeted interventions on such a group where unfocused interventions would be ineffective – this has been called the “threshold effect”: a few small factors on the fringe, in this case a few more convinced people, can make all the difference.
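As an illustration of how little machinery this requires, the sketch below takes an influence map like the one inferred above and extracts both role models and clusters. The graph is hypothetical, and PageRank on the reversed graph plus modularity-based community detection are merely stand-in heuristics for whatever proprietary methods a large platform would actually use.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical influence map: a directed edge A -> B means A tends to pass
# behavior on to B (e.g. inferred as in the previous sketch).
edges = [
    ("alice", "bob"), ("alice", "carol"), ("alice", "dave"),
    ("bob", "carol"), ("carol", "bob"),        # a densely connected trio
    ("erin", "frank"), ("frank", "grace"), ("grace", "erin"),
    ("dave", "erin"),                          # a bridge between two clusters
]
G = nx.DiGraph(edges)

# Role models: people whose influence reaches many others. Approximated here
# by running PageRank on the reversed graph, so that score flows toward the
# sources of influence rather than its targets.
influence_score = nx.pagerank(G.reverse(), alpha=0.85)
role_models = sorted(influence_score, key=influence_score.get, reverse=True)[:3]
print("Most influential:", role_models)

# Social clusters: densely connected groups, found here with modularity-based
# community detection on the undirected version of the influence map.
for i, cluster in enumerate(greedy_modularity_communities(G.to_undirected())):
    print(f"Cluster {i}: {sorted(cluster)}")
```

Under conformity, convincing a few members of one of these clusters can be enough to flip the whole group – which is exactly the threshold effect described above.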
Another reason why these influence maps are so powerful is that influence through peers is much less transparent and more difficult to withstand. While some people are unaffected even by personalized advertisements, their influential friends could still be susceptible, and thus cause them to be manipulated indirectly. Maybe I am suspicious of this new diet promoted to me on Facebook – but if a good friend tells me how amazing it is, I might believe him or her. This kind of manipulation is much more difficult to withstand because of our social nature and because of this lack of transparency. Manipulation works best when we do not think we are being manipulated and have let our mental guard down.
This new potential for social manipulation touches the core of our democratic society: the principle that voting preferences are independent and built on individual opinions formed from unmanipulated information. In our understanding of democracy, these preferences then determine democratic representation and policy decisions. But what if this direction of influence is reversed? If the people in power are able to manipulate these preferences in a substantial way, can we still assume that elections are an expression of the democratic will?
One way to avoid this problem is to reduce the manipulative power of big data and influence maps. The main fuel for influence maps is the data they use for inference. This includes data that may seem harmless at first but from which powerful information can be inferred. Therefore, protecting our data, avoiding excessive concentration of personal information, and making the use of personal data transparent are effective ways to constrain manipulation. Because this manipulative power affects not only the people who share their information, but also those who do not, privacy is not a personal choice but a civic duty. As part of this duty, we should also be aware of the power of our surroundings and use this knowledge to cultivate our own social network. The problem of indirect manipulation through peer pressure could be alleviated if people were aware of who influences them and made conscious choices about whom to engage with.
While privacy is one option, we could also embrace influence maps, but in a democratically legitimized way. Some may argue that influence maps can also be very “efficient” and useful for the public. China experienced several positive outcomes, such as fewer fatal traffic accidents, when it introduced the social credit score, and gained popular support for it. Influence maps could also be used for the public good. For example, they could be used to promote eating less sugar and moving more, to encourage taking the subway instead of an Uber ride, and to instill other behaviors that are beneficial for society. However, it is important that these manipulations are legitimated by a democratic process, and not – as in China – misused to control the population. That’s why democracies might have to create new democratic institutions that decide which manipulations are desired and give them democratic legitimacy.