Is the Internet making us stupid?

How positive is the computer's impact on our knowledge and thinking patterns? The most recent books on the matter take a step back from the internet and invite rebellion.

Professor Joseph Weizenbaum had his first suspicions when he saw his secretary waiting for him to leave his desk, then leaping at the computer keyboard to embark on a frenetic dialogue with ELIZA – only ELIZA was not her best friend but a computer program Weizenbaum had invented in 1964 to test the machine's ability to recognise natural language. The software conversed with users, asking questions and giving answers in the style of a Rogerian psychotherapist. For many it even became a sort of confidant, to such a degree that some psychologists suggested adopting it to cover hospital-staff shortages. While the debate on artificial intelligence raged and computer experts were busy studying systems that would teach machines to think like humans, Weizenbaum dissociated himself from the distorted applications of his creation and wrote an essay ("Computer Power and Human Reason") predicting that computers would quickly spread to a wide range of contexts in our lives, which would be transformed – not always for the better – by their presence.

The story of ELIZA is told by American journalist Nicholas Carr, who in his recent Is the Internet making us stupid? confirms Weizenbaum's worrying prophecies and documents – backed up by neuroscience data – the irreversible changes that the use of computers and the Internet has already made to the way our minds work. If technological tools really are not neutral but alter our knowledge and thinking patterns deep down, as Marshall McLuhan predicted, then the impact of the computer is far greater than that of all its predecessors – from the plough to the TV – because it does not simply extend the capacity of our senses, it simulates mental activity. Inevitably, it implicitly imposes its models on us through the use of certain programs that have an imperceptible influence on the way we develop our thoughts. The most obvious example is PowerPoint, the ubiquitous presentation software that has irritatingly come to standardise the style of reports. Creativity is no longer inventing something original; it is choosing from a 'menu' of existing options, in exactly the same way as we have to reduce our complex personas to a scanty set of details to insert in the fields of a database when describing who we are on Facebook. We are reminded of this by Jaron Lanier, the inventor of virtual reality, a brilliant programmer and today a fierce critic of the excesses of Internet use, in his You Are Not a Gadget (Alfred A. Knopf).

This gradual adaptation of our thought to computer-imposed patterns is not pain-free; linear reading is its first and most illustrious victim. "Immersing myself in a book or a lengthy article used to be easy... – writes Nicholas Carr – I'd spend hours strolling through long stretches of prose. That's rarely the case anymore... What the Net seems to be doing is chipping away my capacity for concentration and contemplation." Because – and each and every one of us could say the same thing – when you access the Internet, you find yourself right in the middle of an "ecosystem of interruption technologies", to use the words of writer Cory Doctorow, where every centimetre of screen has to attract our attention and invite us to click on and chase new links. Our movement between pieces of information is no longer an in-depth immersion but more like surfing on the sea's surface. Alessandro Baricco had long guessed this, and last year he expanded on his book I barbari ("The Barbarians"), published in 2006, with a long article for Wired that focused on Google and the profound, irreversible changes its use was already bringing. "The value of information on the Internet is based on the number of sites that steer you to it and, therefore, the speed with which it is found."

But far from translating only into an extraordinary demonstration of technical efficiency, this cornerstone of a search engine's function has significant repercussions on what we mean by knowledge. We should remember that the declared aim of Larry Page and Sergey Brin, the inventors and owners of Google, is to create a sophisticated form of artificial intelligence that will keep getting better at understanding what we are searching for and offering it to us as quickly as possible. However, Google does not really care whether we dwell on what we have found. On the contrary, its advertising revenue rises when we keep clicking, constantly opening new links. So, Carr argues, it is in Google's "economic interest to drive us to distraction". Or, to use Baricco's words, its use gives us the "instinctive conviction that the essence of things is not in a point but in a trajectory, not hidden deep down but scattered over the surface; it does not live inside things but slithers out of them." When we explore the huge spider's web that is the Internet, it is the links that matter: not so much remembering single pieces of information as reconstructing the route towards them. Everything is there on the Web; we simply have to find the right link. So our mental efforts are increasingly concentrated on this activity, and our thought process inevitably changes – "a thought that no longer asks about the causes but about the correlations", as Frank Schirrmacher points out in Payback.
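The ranking principle Baricco describes – a page's value measured by the number of sites that steer you to it – is the intuition behind Google's original PageRank algorithm. As a purely illustrative aside, it can be sketched in a few lines of Python; the tiny three-page "web" and the damping value 0.85 below are hypothetical choices for demonstration, not details drawn from the article.

```python
# A minimal power-iteration sketch of link-based ranking, in the spirit of
# PageRank. Hypothetical example: the graph, damping factor and iteration
# count are illustrative assumptions, not the real algorithm's parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # share rank among link targets
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Three hypothetical pages: both A and C point to B, so B ends up ranked
# highest -- its value comes from the pages that steer you towards it.
web = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # prints "B"
```

The point of the sketch is simply that importance is computed from the link structure alone, with no regard for how long anyone dwells on a page once they arrive – which is exactly the tension Carr and Baricco describe.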

We have already borrowed a number of procedures from the computers that sift and manage information, and we are also dangerously delegating to the machine the capacity to separate what is important from what is not, just as Google does with its lists of websites. "We love certainty, and the greater the certainty the greater our sense of control. It is our way of managing risks, so we develop routines that resemble those used by computers", writes Schirrmacher. It is this desire for certainty that throws us into the arms of the computer: no longer asking the machine to simulate our intelligence, we instead try to become like it, with a thought that does not think but 'calculates'.

So, what do we do? The German expert suggests a dogged defence of all the territory that machines are unable to access – that of uncertainty and imprecision – and that we never forget that "no computer can operate with as low a degree of accuracy as that of the human brain". It is no coincidence that sites often ask us to read lettering that is intentionally blurred and hard to decipher in order to tell us apart from software: so-called CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart). Accept error and manage imperfection: this is something we really cannot ask of a computer.

Stefania Garassini lives in Milan. She is a journalist and an expert in new media.
