Testing Turing – Summaries

A critique of linguistic capitalism (and an artistic intervention)
Pip Thornton (Royal Holloway, University of London, UK)

In an age of ubiquitous digital technology and information exchange, the selling of words has never been more lucrative. Digitised words are capable of carrying far more than linguistic meaning, and as such are valuable commodities in the advertising marketplace. Nobody knows this better than Google, which made its fortune from the auctioning of words through AdWords: a form of ‘linguistic capitalism’ (Kaplan, 2014) in which the contextual or linguistic value of language is negated in favour of its exchange value. But what are the residual cultural or political effects of this algorithmic exploitation of language? As the linguistic data we create and upload is tailored to court search algorithms, and keywords take on referential values unanchored to narrative context, digitised language has perhaps reached peak performativity (Lyotard, 1984): linguistic input narrowed and restricted to achieve maximum financial output. This talk explores what happens to creativity in language when its passage through digital space is necessarily directed through certain obligatory passage points. More than that, the talk is also an attempt to make the side-effects of linguistic capitalism visible through artistic intervention, exposing the politics lurking within the algorithmic hierarchies and logic of the search engine industry. I will therefore explain and demonstrate my own attempt to reverse this performative logic of production in the form of a research/art project called {poem}.py, which I hope goes some way towards rescuing language from the clutches of the market, re-politicising it (Benjamin, 1999) and reclaiming it for art.
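
To give a concrete, if deliberately crude, sense of the logic the project pushes against, the sketch below reduces a line of verse to a price tag. It is a minimal illustration only, not the actual {poem}.py code: the per-word prices are invented placeholders standing in for AdWords-style keyword values.

    # Illustrative sketch only: invented placeholder prices, not real AdWords bids,
    # and not the {poem}.py implementation itself.

    HYPOTHETICAL_PRICES = {   # assumed cost-per-click values, in GBP
        "i": 0.02,
        "wandered": 0.08,
        "lonely": 1.10,
        "as": 0.03,
        "a": 0.01,
        "cloud": 4.73,        # 'cloud' is priced by cloud computing, not by Wordsworth
    }

    def price_line(text: str) -> float:
        """Reduce a line of poetry to its exchange value: sum a price for every word."""
        total = 0.0
        for word in text.lower().split():
            word = word.strip(".,;:!?'\"")
            total += HYPOTHETICAL_PRICES.get(word, 0.01)  # unknown words get a token price
        return total

    if __name__ == "__main__":
        line = "I wandered lonely as a cloud"
        print(f"'{line}' -> £{price_line(line):.2f}")

The point of such a reduction is exactly its poverty: everything that makes the line a line of poetry disappears, leaving only the figure the market would pay for it.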

Kaplan, F. (2014). Linguistic capitalism and algorithmic mediation. Representations, 127(1), 57–63.
Lyotard, J. F. (1984). The Postmodern Condition: A Report on Knowledge (Vol. 10). University of Minnesota Press.
Benjamin, W. (1999). ‘The Work of Art in the Age of Mechanical Reproduction’, in Illuminations. Pimlico.

Biography
Pip Thornton is a PhD candidate in the Centre for Doctoral Training in Cybersecurity at Royal Holloway. With a professional background in the police and the military, she has BA degrees in History & Politics and English Literature, and a Master’s in English from King’s College London. She is currently completing her thesis, titled ‘Language in the Age of Algorithmic Reproduction’. @Pip__T

 

Imitating Ada Lovelace: creative computing and experimental humanities
Pip Willcox (Oxford Bodleian Libraries, UK)

Had Babbage built his Analytical Engine, and had Lovelace lived beyond the age of 36, how might she have developed her ideas about the machine as a partner in the creative process?

In the 200 years since Ada Lovelace’s birth, she has been celebrated, neglected, and taken up as a symbol for a number of causes. This talk traces some of the paths that ideas of Lovelace have taken, and revisits her imagination of Charles Babbage’s proposed but never constructed steam-powered general-purpose computer, the Analytical Engine. Her imaginative engagement with science, and especially mathematics, led her to a deep understanding of the Engine, including her widely cited description that “the engine might compose elaborate and scientific pieces of music of any degree of complexity or extent”.

This claim, alongside composer Emily Howard’s operatic ‘Ada sketches’, is the inspiration for a series of experiments we conducted into the process of co-creativity with machines, producing music from maths. Using the mathematics of Lovelace’s time and a software simulation of the Engine, our experiments included creating a web application, Numbers into Notes, and constructing an ‘Arduino orchestra’ to simulate multiple, connected Analytical Engines.
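
By way of illustration, the sketch below shows the general shape of such an experiment: a number sequence of the kind the Analytical Engine could tabulate is mapped, term by term, onto pitch classes. It is a toy example under assumed choices (the Fibonacci sequence, a modulo-12 pitch mapping), not the Numbers into Notes code itself.

    # Toy sketch of turning a number sequence into notes; the choice of the Fibonacci
    # sequence and the modulo-12 pitch mapping are assumptions for illustration, and
    # this is not the Numbers into Notes implementation.

    NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

    def fibonacci(n):
        """Return the first n Fibonacci numbers."""
        seq = [0, 1]
        while len(seq) < n:
            seq.append(seq[-1] + seq[-2])
        return seq[:n]

    def to_notes(sequence):
        """Map each term, reduced modulo 12, onto a pitch class name."""
        return [NOTE_NAMES[term % 12] for term in sequence]

    if __name__ == "__main__":
        print(to_notes(fibonacci(16)))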

This talk recounts some of these experiments that play at once into generative design and into alternative histories of algorithms and mechanisms. Through making, through prototyping and co-design, we close-read and follow the thought processes Lovelace and Babbage recorded, extending beyond what was practicable in the nineteenth century. Our exploration also touches on creativity, as anticipated by Lovelace, examined by Turing, and recast in Boden’s “Lovelace questions”, and manifest today in the fields of computational creativity and creative computing.

Biography
Pip Willcox is the Head of the Centre for Digital Scholarship at the Bodleian Libraries, University of Oxford, and a Senior Researcher at the Oxford e-Research Centre. Between these two departments, their work includes projects such as Fusing Audio and Semantic Technologies for Intelligent Music Production and Consumption and SOCIAM: The Theory and Practice of Social Machines. They also co-run the multidisciplinary TORCH Critical Visualization network. @pipwillcox

 

On intuition: machine learning and posthuman ethics
Louise Amoore (University of Durham, UK)

Two images are juxtaposed on the screen. The images are data visualizations of the extracted features of the movements of surgeons’ hands on the console of the da Vinci surgical robot. The da Vinci robot (by Intuitive Surgical Inc.) promises the augmented vision and precise angles of movement of an enhanced composite being of surgeon and machine. This composite being is made possible by a series of machine learning algorithms acting on a cloud database. What form of learning takes place in the neural net algorithms that couple the surgeon to her da Vinci? As the hand and eye movements of the surgeon are tracked via the API of the robot, the future cognitive processes of the robot are ‘optimized’ and the future actions of surgeons are modified. Intuition belongs to a posthuman composite that has agency and acts upon future states of being.
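
For readers unfamiliar with the mechanics being critiqued, the sketch below shows the bare shape of such learning: a model fitted to feature vectors standing in for tracked movements, then asked to act on new ones. The data, features and model here are entirely invented; this is not Intuitive Surgical’s pipeline, only a generic scikit-learn stand-in for “learning from ingested past data”.

    # Generic stand-in for learning from ingested past data: synthetic movement
    # features and a small scikit-learn neural network, not the da Vinci pipeline.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Pretend feature vectors extracted from tracked hand movements
    # (e.g. velocity, grip angle, path smoothness), with an invented skill label.
    X_past = rng.normal(size=(200, 6))
    y_past = (X_past[:, 0] + X_past[:, 3] > 0).astype(int)

    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    model.fit(X_past, y_past)            # past movements train the composite's future judgement

    X_future = rng.normal(size=(5, 6))   # new movements the system will act upon
    print(model.predict(X_future))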

Whether in the da Vinci surgical robots, or in the geopolitics of automated weaponry, drones, and intelligence gathering, machine learning algorithms and operatives are trained for future action via the patterns of ingested past data. What kind of ethics is possible in the context of the intuitive learning of a posthuman composite? Can this form of cognition and action be meaningfully called to account?

Biography
Louise Amoore researches and teaches in the areas of global geopolitics and security. She has particular interests in how contemporary forms of data, analytics and risk management are changing the techniques of border control and security. Louise has been awarded a Leverhulme Major Research Fellowship (2016-18) for work on the Ethics of Algorithm. @AmooreLouise