Being Borged by Artificial Intelligence
Some comments on the data theft inherent in the current business model of AI
‘All art is plagiarism.’
‘These days, if your book is not circulating as a pirated PDF, it's failed.’
—McKenzie Wark
I found out my book, Why the Future Is Workless, is one of the many millions illegally scraped to help train ChatGPT, and I had a bit of an Oscar Wilde reaction, I’m afraid. The only thing worse than being scraped by a bot to train an AI app is to not be scraped by a bot to train an AI app.
I’m being a bit flippant about this, but I do feel some reactions to these revelations about data scraping are a bit, well, precious, and in fact this is something I have been talking about since, as it happens, I wrote the very book of mine that was scraped.
In Workless, and at greater length in The Future of Everything, I argue for the need to reconfigure the relationship between the owners of these digital technologies and the people who produce the data that they monetise. In short, we all should be paid for our data. My preferred way of doing this is via a basic income that everyone receives, but the key point is this:
We are all working for these tech companies for free by providing our data to them in a way that allows them to hide our contribution while benefiting immensely from it. It is way past time that we were paid for this hidden labour, potentially using that income to offset reductions in our formal working hours.
These latest concerns about the “illegal” use of people’s data have a different complexion because they focus on works of fiction appropriated by the Silicon Valley Borg, and already a few court cases are under way to protect author copyright. Writers like Richard Flanagan have explained why they feel violated by this data scraping, with Flanagan saying, “I felt as if my soul had been strip mined and I was powerless to stop it.”
It is good for people to explain that sense of violation if they feel it; nonetheless, I think Flanagan is on stronger legal, and even moral, ground when he describes what has happened as “the biggest act of copyright theft in history.”
I don’t think authors, or other artists, need to surrender to this theft even when they believe that the economic conditions in which they are operating should change. That may well be true, but in the meantime we all have to live in the world as we find it, and technologists don’t just get to steal our work.
So, let’s try and separate a few strands.
Creative work like writing a novel or choreographing a dance or even publishing a work of non-fiction is a form of labour, and in the debate about apps like ChatGPT appropriating these works, I would prefer to concentrate on the labour end of the spectrum, not the creative end, in making the case for control of, and payment for, work that is used in this way.
Arguing from a creative perspective gets you into the complex territory of artistic creation and originality, and I am of the view that all such creative work is irreducibly derivative, a social project realised via individual talents.
We are overfed on the “great man” view of history (and art), and the distortions of what we might call the US constitutional view of freedom and individuality (not to mention the capitalist idea of the market as the sole arbiter of value). In this understanding, society is a collection of individuals, each endowed with inalienable rights, whereas I think we might be better served by the perspective that humans are irreducibly social and that individuality arises from that sociality, so that it is sociality itself we need to protect in order to establish individual freedom.
Art, then, is created, not in the enactment of individualist originality, but in the shared limitations of being human. What we think of as “great art” achieves its greatness, its beauty, even its transcendence in the limitations of the human body and it is those limitations—physical and mental—that are the stuff from which meaning emerges.
Just as games require rules, art requires these embodied limitations, and we draw on our shared experience of the world to produce it. This is not to diminish the particular abilities of individuals, or the notion of talent or even of relative worth. But it is to recognise that even that exceptionalism ultimately depends on the human limitations of the body and the body politic.
On this level, the problem with art created by artificial intelligence apps is that they lack these human limitations—particularly embodiment—and can therefore never produce, or even reproduce, works that will move us in the way that human art can. When you lack physical, mental and social limitations, nothing is at stake in the creation of art, so that, in the end, no-one will care about the art machines make.
I’m using the concept of “great art” here and I’m confident you have a clear idea of what I mean without me defining it in detail, but really, however you define it, it is a distraction in this discussion, as much a creation of capitalist work and value structures as anything that might be produced by AI.
Australian artist Ian Millis has long argued for the invisibility of the artist and the recognition of their social nature. In this definitive essay on his work, Wendy Carlson sets out his worldview and it includes this quote from Millis:
Real creative activity is so natural and unselfconscious as to be invisible. The true artist is unrecognized even by his or her self. It would be nice to say that everybody is an artist in the real sense, but given the nature of capitalist society, it is not true, although in certain circumstances or in certain other societies it may be true. In our society, almost everyone, worker or boss, leads a life of sterile alienation, but there are exceptions. They are the people who directly tackle basic problems of everyday life, and come up with simple, beautiful, workable alternatives, solutions which are radical whether analysed in sound political, economic or aesthetic terms. If we work in this way to destroy not only art, but industrial technology and formal hierarchical politics, we can create a real culture.
In this sense, then, what AI lacks and will never be able to reproduce are the social relationships in which art is produced. Again, nothing is at stake when AI “makes art” absent these constraints, and so there is really nothing to fear artistically from AI.
It’s like, it would be pretty easy to build a robot that runs faster than Usain Bolt, but who cares? Nothing is at stake.
Literary critic George Steiner argued in his now unfashionable book, Real Presences, that art is critical engagement with what has come before. I read the book a hundred years ago when it first came out, but this section has always stayed with me, no matter how diminished the reception of this sort of criticism might have become:
The Divine Comedy is a reading of the Aeneid, technically and spiritually 'at home', 'authorized' in the several and interactive senses of that word, as no extrinsic commentary by one who is himself not a poet can be.
The presence, visibly solicited or exorcized, of Homer, Virgil and Dante in Milton's Paradise Lost, in the epic satire of Pope and in the pilgrimage upstream of Ezra Pound's Cantos, is a 'real presence', a critique in action. Successively, each poet sets into the urgent light of his own purposes, of his own linguistic and compositional resources, the formal and substantive achievement of his predecessor(s).
…What the Aeneid rejects, alters, omits altogether from the Iliad and the Odyssey is as critically salient and instructive as that which it includes…
…Joyce's Ulysses is a critical experiencing of the Odyssey at the level of general structure, of narrative instruments and rhetorical particularity. Joyce (like Pound) reads Homer with us.
His argument speaks to the derivative, social nature of art, and there are any number of more recent examples of “plagiarism” (here, here, here) that raise questions about originality, creativity and art, but never settle them.
Maybe we should get over it?
I’d be interested to hear more views of this, as I know many readers here are more au fait with these matters than I am.
Speaking of which, this interview with McKenzie Wark, talking about her books on the Situationists, deals with these issues, and I found much of it interesting and worthwhile. She talks about Debord’s notion of détournement, which captures the idea that “the whole of culture is a commons that belongs to everyone. That’s how it actually works in its normal state — there's no such thing as authorized statements.”
That “everyone copies and corrects.”
This is a healthy way to understand art and creativity, I think, but nonetheless, the idea of “great art” being the exclusive product of the “great artist” is hard to shake, and I wonder if what people fear from AI—even from the internet—is that technology’s ability to repurpose data of all sorts into recognisable songs, paintings, stories, essays—no matter what we think of their worth—makes us see for the first time how irreducibly social art, and other creative work, actually is?
The problem AI is presenting us with is not new, but its scale is, and that speaks to a deeper problem with the way in which the logic of late capitalism—its extractive, destructive, and dominating tendencies—is changing the nature of work in general.
In the end, all the technologies of work under capitalism are tools for disciplining workers, and that is what we should, in the first instance, be seeking to fix.
(PS: There’s some bonus content behind the paywall below: a couple of other things that came up as I was writing this that wouldn’t fit in the main body of the piece but that I thought were worth mentioning.)