Sunday 23 June 2024

TAMPERING WITH REALITY

Dick Pountain /The Political Quarterly/ 15 Mar 2024 03:34

The Eye of the Master: A Social History of Artificial Intelligence by Matteo Pasquinelli; Verso 2023; pp264; £16.99

It feels as though a storm is gathering around Artificial Intelligence (AI): just about everyone believes it’s set to change our world, but in which direction is a matter of great controversy. Will it be a mob of peasants storming Frankenstein’s castle with pitchforks and burning brands, or eager prospectors with mules, spades and pans scurrying to make their fortunes in the California hills?

Matteo Pasquinelli, an Associate Professor in Philosophy of Science in Venice, has written an excellent social history of AI in which he treats these technologies as the most recent stage in the historical process of the Division of Labour: “In the industrial age, the supervision of the division of labour used to be the task of the factory’s master”, and AI is just another, more powerful technology for measuring, organising, spying on and controlling the workplace and its workers, the latest phase in the exploitation of Labour by Capital. Pasquinelli is no techno-utopian.

Pasquinelli’s book covers an ambitious span of historical time, which he found necessary in order to identify the origins of what he calls ‘algorithmic thinking’, i.e. the creation of rules for solving problems via a sequence of discrete steps. His introductory chapter, ‘The Material Tools Of Algorithmic Thinking’, locates the earliest recorded examples of such thinking in the Vedic rituals of India around 800 BCE. Those rituals involved building ‘Fire Altars’ whose design was prescribed by stepwise methods intended to reassemble the fragmented body of the god Prajapati (but they also served to teach a system of geometry useful in building). From that beginning a path winds through the Babylonian creation of counting, and hence accounting, on clay tablets, through the Greek geometers and the Arabic algebraists, and finally to the Europe of Pascal, Leibniz, Newton and Descartes.

Pasquinelli’s book is organised into four sections, the second of which, ‘The Industrial Age’, contains four chapters beginning with one devoted to Charles Babbage, to whom he attributes the mechanisation of mental labour. In 1822, during the UK’s industrial revolution, Babbage designed his Difference Engine, a machine to automate numeric calculation. It would employ then state-of-the-art technologies like metal cogs and steam power to generate tables of logarithms – needed by astronomers and the military – more quickly and cheaply than hand calculation. Babbage only ever produced a partial prototype, but his efforts in effect inaugurated modern computer science, as well as influencing Karl Marx toward a labour theory of the machine. Babbage saw that all machines imitate and replace some previous division of labour, and that a calculating machine in particular automates the derivation of labour costs. In the three following chapters Pasquinelli traces the effect of this ‘Babbage principle’ on Marx’s idea of the ‘general intellect’ and a labour theory of knowledge, at a depth which may prove a slog for all but professional scholars of Marxism.
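The principle the Engine mechanised is easy to demonstrate in a few lines of modern code. What follows is a minimal Python sketch of my own, not from the book (the function name difference_table is invented for illustration), of the method of finite differences: once the opening values and differences of a polynomial have been worked out, every further table entry needs nothing but repeated addition, exactly the sort of work that cogs and steam could do.

# A toy illustration of the method of finite differences (my own, not
# Babbage's notation): tabulate a polynomial using nothing but addition
# once the opening values and differences have been set up.
def difference_table(poly, start, count):
    """poly[k] is the coefficient of x**k; returns poly(x) for
    x = start, start+1, ..., start+count-1 using only additions."""
    degree = len(poly) - 1
    # Opening values p(start), ..., p(start+degree), computed directly.
    seed = [sum(c * (start + i) ** k for k, c in enumerate(poly))
            for i in range(degree + 1)]
    # Successive columns of finite differences, down to the constant one.
    diffs = [seed[:]]
    while len(diffs[-1]) > 1:
        prev = diffs[-1]
        diffs.append([b - a for a, b in zip(prev, prev[1:])])
    column = [d[0] for d in diffs]   # current value plus its differences
    values = []
    for _ in range(count):
        values.append(column[0])
        # Each new table entry comes from cascaded additions, as in the Engine.
        for i in range(len(column) - 1):
            column[i] += column[i + 1]
    return values

# Toy usage: tabulate p(x) = x**2 + x + 41 for x = 0..9.
print(difference_table([41, 1, 1], 0, 10))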

A third section, ‘The Information Age’, arrives at what we’d now recognise as Artificial Intelligence, and here Pasquinelli excels in his choice of crucial technological pivot-points and personalities, as well as in explaining the complex ideas involved. He zips through Alan Turing, the invention of the digital electronic computer during WWII, the rise of cybernetics and interest in ‘self-organising’ systems after the war, and the work of John von Neumann and Donald Hebb aimed at emulating the human nervous system via electronics. An early practical goal was pattern recognition, needed both for reading text and for industrial control, and its pursuit created a split between those who sought solutions through symbolic mathematics and those who pursued them by statistical induction from large amounts of sample data.

The most intriguing chapter is ‘Hayek and the Epistemology of Connectionism’, which reveals that Friedrich Hayek, co-founder of neoliberal economics, put forward a theory of human cognition that depicted our nervous system as an ‘instrument of classification’, in accordance with the ideas of Hebb’s new ‘connectionist’ school of AI. This is seldom remembered because Hayek had no wish to see it implemented in hardware, believing that it already existed in the operation of markets as aggregators of knowledge about prices. Here Pasquinelli touches upon the ‘socialist calculation’ debate, initiated in the 1920s by Ludwig von Mises, who argued that economic planning would prove impossible under a socialist bureaucracy for lack of commodity prices as units of account. He was opposed by the Marxist economist Oskar Lange, who in the 1960s went on to propose the use of increasingly powerful computers in socialist planning.

Pasquinelli follows the split between the ‘symbolic’ and ‘connectionist’ AI schools into a final chapter on Frank Rosenblatt’s Perceptron of 1957, the first proper artificial neural network of a kind that points toward today’s ‘deep learning’ machines. Using a camera with a 20x20 pixel grid, the Perceptron could recognise simple patterns like alphabetic characters, and with its arrival the thrust of Pasquinelli’s argument is complete, as the route to today’s functioning AI becomes visible: emulated human neurons connected into ever wider and deeper networks and trained on huge amounts of real-world data. That route spanned a decades-long ‘AI winter’ of tepid results, hampered by lack of computer processing power and training data – AI research only started sprinting again in the late 1990s with the advent of VLSI (Very Large Scale Integration) silicon microprocessors and the Internet.
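To give a flavour of what Rosenblatt’s machine was doing, here is a toy Python sketch of my own of perceptron-style learning (the function name train_perceptron and the tiny 2x2 ‘pixel’ patterns are invented for illustration, and bear no relation to the Mark I hardware): a single artificial neuron nudges its weights whenever it misclassifies a training example, until it finds a boundary separating the two classes of pattern.

# A toy sketch of Rosenblatt-style perceptron learning (my own illustration,
# not the Mark I hardware): one artificial neuron adjusts its weights
# whenever it misclassifies a training example.
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """samples: list of feature vectors (e.g. flattened pixel grids);
    labels: +1 or -1 for each sample."""
    weights = [0.0] * len(samples[0])
    bias = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            activation = sum(w * xi for w, xi in zip(weights, x)) + bias
            prediction = 1 if activation >= 0 else -1
            if prediction != y:   # misclassified: move the boundary towards y
                weights = [w + lr * y * xi for w, xi in zip(weights, x)]
                bias += lr * y
    return weights, bias

# Toy usage: tell 'top row lit' from 'bottom row lit' on a 2x2 pixel grid.
samples = [[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 1], [0, 0, 0, 1]]
labels = [1, 1, -1, -1]
print(train_perceptron(samples, labels))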

Google was founded in 1998 and quickly monopolised internet search and advertising, in the process amassing huge quantities of users’ data for free, which it deployed in AI research to automate natural language translation. Pasquinelli has of necessity to skim lightly (but accurately) over this period, as it would require a whole second volume to describe in detail the way these developments led to ‘deep learning’ algorithms, generative pre-trained transformers (GPTs) and a sudden blossoming of AI power after 2020 which surprised even its own inventors, and to some extent still does.

In his concluding chapter, ‘The Automation of General Intelligence’, Pasquinelli confronts the politics of AI directly: its monopolistic ownership, and also its extractive nature, which ‘scrapes’ (i.e. steals) a whole corpus of human culture from the internet for training data, without payment, along with the racial and other biases that creep in as a result. His own position appears broadly aligned with the Italian ‘operaismo’ movement of the Hardt and Negri strain, and he never deviates from a hostile view of AI as the ultimate tool of control and surveillance over labour: “The first step of technopolitics is not technological but political. It is about emancipating and decolonising, when not abolishing as a whole, the organisation of labour and social relations on which complex technical systems, industrial robots, and social algorithms are based”.

My own inclination is more ‘culturalist’ than Pasquinelli’s, forcing me to wonder whether the positive powers of AI couldn’t be tamed and harnessed to help implement schemes of participatory socialism of the sort imagined by Eduard Bernstein, Thomas Piketty and many others. Then I reflect further on the threat that deepfaked images pose to truth, privacy, democracy and even to personal identity, and wonder whether perhaps pitchforks and burning brands might not be such a bad thing after all…
