No one today would dare deny that the rapid spread of fake news and hate speech on the World Wide Web is a scourge in need of resolute counter-strategies. The leaders of Facebook, Google, Twitter and other online services and social media platforms are at the forefront of this battle, and have already given proof of a proactive stance by creating technology and services to combat the problem. But another phenomenon is taking on the character of an entirely analogous emergency, one that receives neither the same attention from the media nor anything like the same commitment from the Web’s giants to eradicate it. The reason is simple: this phenomenon is the fundamental ingredient that has made the Web economy so marvellously successful up to now: our addiction to the Internet and the smartphone, the growing difficulty of tearing ourselves away from devices that are making us easier to manipulate, less inclined to use the Web in accordance with our own objectives, and decidedly less free.
The evolution of the instruments we use daily toward a model of this type was not foreseen by their own creators, although it must be said that Mark Zuckerberg, the head of Facebook, is no stranger to how the human mind works, having studied psychology and computer science at Harvard. Rather, it is the extreme result of a business model based on the “attention economy”. Contrary to what happens with traditional media such as radio, TV and newspapers, the Web does not solicit the attention of an undifferentiated audience, but of each individual viewer, whose personal tastes and preferences are carefully recorded and used to propose advertising, products and services tailored to individual interests.
On the Web’s enormous market, attention is in short supply. It is vied for by myriad competing proposals, making it extremely precious. To capture our attention and guide it, the most elaborate strategies are put into play, to the extent that we can speak of an “engineering of addiction”, in which programmes are deliberately devised to entice us to use technological instruments (above all, smartphones) for an ever-increasing amount of time. Our relationship with our devices is now based on automatism: we check our cell phones an average of 150 times per day and connect to Facebook without even realising it, or when we had intended to do something entirely different.
The problem of Internet dependency has deep roots. Since the beginning of the Web, our participation has deeply engaged our structures of thinking and our perception of the world. The first to recognise Internet addiction was the American psychologist Kimberly Young, who in 1998 published Caught in the Net: How to Recognize the Signs of Internet Addiction and a Winning Strategy for Recovery, based on the results of a three-year study. At the time, Zuckerberg was still in middle school, Wi-Fi did not exist, the Web had yet to become a mass phenomenon, and smartphones were still far from being imagined. Yet that first study brought three characteristics of potentially addictive behaviour in our relationship with the Web to the foreground. The first is the Web’s ability to favour virtual escape into cyberspace. Here there is a comparison with drugs that has always been part of technology’s history. (For instance, the January 23, 1990 edition of The Wall Street Journal published a lengthy, well-researched piece by staff reporter G. Pascal Zachary describing virtual reality as “electronic LSD”.) The second characteristic is the enormous power of connecting with other people, a connection that can be taken up and broken off like rounds on a gigantic pinball machine. The third is the possibility of playing endlessly with one’s online identity, which can generate an unhealthy sense of omnipotence. In those same years, B.J. Fogg, the director of the Stanford Persuasive Technology Lab, was focusing on “methods for creating habits, showing what causes behaviour, automating behaviour change and persuading people via mobile phones”.
Today the march in this direction has accelerated, and some people have decided to denounce what is happening. The most recent (in the first week of November 2017) is Sean Parker, who served as the first president of Facebook. He has become a “conscientious objector” to social media, which he says were built to addict us all. A month earlier, on 6 October, The Guardian published a well-documented article titled “Our minds can be hijacked”: the tech insiders who fear a smartphone dystopia. The reporter interviewed the engineers and programmers who invented the “like” button (among them the Facebook engineer Justin Rosenstein) and the pull-to-refresh mechanism, whereby users swipe down, pause and wait to see what content appears (created by Loren Brichter in 2009 for his start-up, which was soon acquired by Twitter). These and other tech insiders are worried about the consequences of their own inventions, and explain their strategies for limiting their own use of them.
The main lines along which the “hijacking” occurs are known. Adam Alter, who teaches marketing and psychology at New York University, published the book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked earlier this year. He explains that the first mechanism involved is social feedback, delivered through “likes” for a posted photograph or text, and through comments, shares and retweets. Receiving feedback makes us feel good: studies have shown that it stimulates the same brain circuits activated by gambling or drinking alcohol. The release of dopamine creates the desire to recreate the experience as often as possible. Dependency on social feedback is enhanced by its unpredictability: we know neither when a “like” will arrive nor how many we will receive, which makes us check back with increasing frequency.
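The unpredictability described above is what behavioural psychologists call a variable-ratio reinforcement schedule: rewards arrive after a randomly varying number of attempts, which is the most compulsion-forming schedule known. A minimal, purely illustrative sketch in Python (the names check_feed and checks_until_reward are hypothetical, and the 30% reward probability is an arbitrary assumption) shows how the interval between rewards varies from one streak of checks to the next:

```python
import random

def check_feed(p_reward, rng):
    """Simulate one check of a social feed: with probability
    p_reward, some new 'likes' have arrived since the last check."""
    return rng.random() < p_reward

def checks_until_reward(p_reward=0.3, seed=None):
    """Count how many checks occur before a reward appears.
    Because the count varies unpredictably from streak to streak,
    the next check always *might* be the rewarding one."""
    rng = random.Random(seed)
    n = 1
    while not check_feed(p_reward, rng):
        n += 1
    return n

# Sometimes the very first check pays off; sometimes many checks
# go unrewarded. It is this variability, not the average payoff,
# that keeps the checking habit going.
trials = [checks_until_reward(0.3, seed=s) for s in range(10)]
print(trials)
```

Run repeatedly, the streak lengths scatter widely around an average of about 1/0.3 ≈ 3 checks per reward, mirroring the experience of never knowing when the next “like” will land.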
Nothing was left to chance in the design of the screens we see on our smartphones. Multitudes of engineers wake up every morning with the sole aim of keeping us scrolling and clicking, thus augmenting the probability that a form of addiction will develop in us. The mechanism of notifications is a good example. Until 2016, Tristan Harris was a design ethicist and product philosopher at Google (his job was to direct users online to a particular website or service). Now he is a co-founder of Time Well Spent, a non-profit movement to align technology with our humanity. A friend at Facebook told Harris that designers initially decided the notification icon, which alerts people to new activity such as “friend requests” or “likes”, should be blue. It fit Facebook’s style and, the thinking went, would appear “subtle and innocuous”. “But no one used it,” Harris says. “Then they switched it to red and of course everyone used it.” Red transmits a signal of alarm to the brain, something that must be tended to immediately. In the same way, the presentation of updates offered by Instagram is designed to make us think we are missing out, so that we feel we must remedy the situation right away. The automatic playing of videos, which makes it harder to stop watching, is part of the same strategy. Click after click, the risk of developing a behavioural addiction grows. For now, no regulation or limit applies to these techniques. If the Web is swarming with fake news and hate speech, that is a big problem for people selling advertisements; but millions of users unable to stop staring at their screens represent no problem at all.