How the algorithm boosted our destructive potential
When the Stanford historian Robert Proctor pulled a dusty file on the tobacco industry’s first public-relations campaign out of the archive, he had no idea that what he was holding was, in effect, a bomb-making tutorial.
In short: in 1953 the tobacco industry faced the possibility of total ruin. Scientific work had proved beyond reasonable doubt that smoking causes lung cancer. In a last desperate attempt, the industry’s representatives hired an unorthodox public-relations man named John Hill. Hill delivered a plan built on creating so many versions and facets of the truth about what else might cause cancer that, within a year or two, people would lose attention and interest. After “smoking kills” had been debated for several years, the general reaction became: “Cigarettes kill, so what?” Doubt is human. Ignorance is human. Dismissing, projecting, compensating and downplaying are human too. Hill launched a campaign in which popular television personalities “doubted” and “mocked” the scientific conclusions. Nobody said it was a lie. Doubt and distraction are enough to keep people from insisting on science.
Sound familiar? Any bells ringing?
Public-relations and marketing methods kept improving until they met so-called behavioural science, the branch of psychology that examines which stimuli affect human behaviour and decision-making. Behavioural science quickly became the darling of marketers, offering insight into the dark side of the human brain and soul, where often everything but critical thinking takes place. It teaches us that in the vast majority of cases we have neither the time nor the desire to go through the whole rational process of weighing pros and cons, as critical thinking requires; instead we follow the urge of “experience and shortcut”, a kind of heuristics. Simply put, we act first and rationalize our decision afterwards. Marketers and propagandists know this emotionally driven shortcut very well, and they use it.
Then social networks invaded the Internet, allowing the human soul to be examined through the billions of data points collected. Cambridge Analytica became the pioneer of this research, which should never have crossed out of the commercial world. The company, once a British military sub-contractor, used that knowledge to analyze vast amounts of our psychological data and tailor its communication to individual groups according to our weaknesses and inner “demons”, in order to trigger the desired electoral decisions. In 2013–2016 they tested this work all over the world, then applied it in the Brexit referendum and the 2016 US elections, funded privately by Robert Mercer, a wealthy sponsor of the Republican Party’s extreme wing who wanted more influence after being ignored by powerful Republicans. As the “stuffing”, Cambridge Analytica used the new ideology of “populism” promoted by Steve Bannon to replace the classic conservative rhetoric of traditional centrist Republicans. With new rhetoric came new values. Some sources say that, to describe how easily the American soul could be manipulated, Bannon remarked: “American religion? Whiteness, gun rights and abortions, it’s not deeper than that.” I cannot verify the quote, but the content of their propaganda machine was exactly that.
Because Facebook, Twitter, Google and the rest are the world’s largest and dominant distributors of information, the striking power of modern propaganda has changed. Disinformation and manipulative content used to take years to change our behaviour; today it needs only hours or days. On top of that there are the algorithms: mathematical models of information distribution on online platforms. Programmers have tuned them to preferentially disseminate information that evokes emotion: outrage, anger, hatred, but also joy or laughter. This “short circuit” is the social networks’ main source of earnings. We are fed the information we like, not the information we might need to protect our health, our communities or our democracy. The window to the world has shut, and instead we are surrounded by mirrors. How do I look? What do people think of me? Am I playing my role well, does it bring a social benefit? Who else is echoing my opinions?
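The mechanism described above can be sketched as a toy ranking function. Every weight and field name here is a hypothetical illustration for the sake of the argument; no real platform’s ranking code is public, and actual systems are vastly more complex:

```python
# Toy sketch of engagement-weighted feed ranking (illustrative only;
# all weights and fields are invented, not any platform's real model).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    outrage: float    # predicted chance the post provokes outrage (0..1)
    joy: float        # predicted chance it amuses or delights (0..1)
    relevance: float  # predicted informational value to the reader (0..1)

def engagement_score(post: Post) -> float:
    # Emotion-evoking content is weighted far above plain relevance,
    # mirroring the "short circuit" described in the text.
    return 3.0 * post.outrage + 2.0 * post.joy + 0.5 * post.relevance

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first, not highest usefulness.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local council publishes budget report", 0.05, 0.05, 0.9),
    Post("You won't BELIEVE what they said about you", 0.9, 0.1, 0.1),
])
print(feed[0].text)  # the outrage bait outranks the useful report
```

Under this kind of objective, the informative post loses to the outrage bait even though its relevance score is nine times higher, which is the whole point the paragraph makes: the optimization target is our attention, not our benefit.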
Thanks to our willingness to be drawn into emotional games, Facebook is now one of the most powerful companies in the world, deciding whose information we see and whose we don’t. But what about us? What happened to the values through which we learned to recognize the truth, the values every society embeds in its constitution and its unwritten ethical laws? Can we as humans even survive on nothing but the information we like, never exposed to the information we need? Technology caught us by surprise. It is time to start worrying about our democratic order, culture and traditional values. And by that I really don’t mean skin colour, abortion bans or the Second Amendment. I mean truth, science, respect for one’s neighbour, cohesion and mutual help. Those values disappear in every prologue to war. It’s time to stop. We all deserve a fair information environment. Let’s break the algorithmic mirror and look out of the window again.