The honey-trap of your social bubble's love is a deadly pandemic
Humanity is like a bee swarm. When you distort the perception of reality of a critical mass of bees, the whole swarm dies.
When, in the earliest days after Facebook’s founding, the Russians put billions into its development, inexperienced Americans barely noticed. So what? Yuri Milner is a billionaire like any other. What they didn’t know was that his money, like the money of any Russian billionaire, could only be used with the blessing of the state. Louis XVI and Putin would probably understand each other. Milner’s money had been kindly lent by the Russian gas giant Gazprom and the state-owned VTB bank, aka “Putin’s piggy bank”.
Milner also purchased shares in Zynga, Stripe, Twitter, Flipkart, Practo, Spotify, Zocdoc, Groupon, JD.com, Xiaomi, OlaCabs, Alibaba, Airbnb, WhatsApp, Nubank, Wish, and later many others. Influence is everything; there is no room for coincidence here.
As time went on, Sheryl Sandberg, a Harvard alumna from Florida and the daughter of Russian immigrants, was hired at Facebook as its top business executive. She invented Facebook’s entire business model, which is based on the unethical mining of our psychological data and AI-driven predictions built on it. Some of its products (Facebook has gradually swallowed every strong competitor as early as possible) are used daily by three and a half billion people on the planet. Tens of millions of behavioral predictions take place on Facebook every second, determining which psychological class we belong to in the database and how we are likely to behave in the next few minutes of our lives. These predictions are supposedly used to better map and predict our purchasing behavior. It has become an open secret that this knowledge primarily gives its owner and Facebook’s clients the greatest power in the history of mankind.
Facebook has become the hegemon of our human swarm’s behavior. We are entrapped. It can not only predict the behavioral triggers of our decision-making, but also influence them, tailored to each psychological group. The dystopia we experience today has “bigly trumped” the scariest parts of the Book of Revelation. We have become prisoners of the honey-trap of our social value.
The British-American-Russian-Saudi-Iranian subcontractor Cambridge Analytica was the first to show us that Facebook, and the chosen few with access to the databases it has accumulated (in effect, the enriched uranium of the 21st century), can manipulate human fear and anger to trigger short-circuited behavior in elections or for political or military purposes. It took years of top-notch behavioral research from Cambridge, UK, and applied it at scale. The company could run experiments on populations whose behavior it then toyed with (“We can find your hidden demons,” bragged CA director Alexander Nix at a presentation after Donald Trump’s election win).
The recent leak of internal conversations between Facebook teams and managers revealed what analysts had so far only whispered about on Twitter. Facebook knew about every human and political tragedy caused by its algorithms as they distributed information, and disinformation. But it did nothing, because doing something would hurt the business. The company knew that slaves were trafficked through its platform. Facebook was an important tool of the Rohingya genocide in Myanmar. Big Brother also knew that its Instagram app was behind a frightening rise in depression among teenage girls. It knew that its algorithms were deliberately tweaked to cause real addiction to its services. It knew Facebook was the world’s dominant spreader of misinformation about Covid and vaccines. And the giant did nothing but fire those who brought all this to light.
Facebook is not the sole perpetrator of the most serious epistemic crisis in history, but it is the most influential one. If we don’t understand that the distortion of our information space has cost us our collective ability to agree on an interpretation of reality, we will perish like that bee swarm. The clock is ticking.