I like it all - but immense thanks for the Paul Simon last words!
Such a fan!
Hm, I thought Pierre Menard was the author of The Future of Everything.
*ahem*
I like the various threads you've connected here - thanks!
Lol, thanks.
Have you been through similar things with music?
AI is certainly one of the threats to musicians, producers et al. at the moment, but a lot of it is bullshit at maybe a higher level than the way the written word or the visual arts are affected.
Generative music is certainly booming, and Spotify is no doubt using it in its bullshit playlists (mood-related, say) so they don't need to pay anyone royalties.
But generative music usually comes from pre-created elements (drones, phrases), or comes out of MIDI or similar ways of getting (virtual) synths and drum machines to spit out patterns and structures.
It's just that I don't think it works in quite the same way as writing or visual art - feeding someone's recordings or compositions in and generating more. The musical equivalent of the pixelation, weird jumps and gaps of AI video is, I think, more noticeable, and the Uncanny Valley will be strong for a long time yet. I don't think anyone could feed FourPlay's oeuvre into an LLM and get back anything compelling (and I know the models are very large, but I'm pretty confident the same goes for the entire corpus of classical music or blues).
To me, even other automations, like AI DJing, will miss enough of the physicality and emotional engagement of real people to work as anything more than background music. Like Spotify's lucrative playlists, that will hurt us anyway.
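For what it's worth, by "patterns and structures" I mean something as simple as the toy sketch below: a rule-based generator picking notes from a pre-chosen scale and rhythm grid and writing them out as MIDI. It's only an illustration (it assumes the Python mido library, and the scale, probabilities and step length are arbitrary choices of mine, not how any particular commercial system works), but it shows why this kind of generative music doesn't need anyone's recordings fed into it.

```python
# Toy rule-based pattern generator: no training data, no scraping,
# just pre-chosen rules (a scale, a rhythm grid) rendered to MIDI.
# Assumes the third-party "mido" library (pip install mido).
import random
import mido

SCALE = [60, 62, 65, 67, 70]      # C minor pentatonic, as MIDI note numbers
TICKS_PER_BEAT = 480              # mido's default resolution
STEP = TICKS_PER_BEAT // 2        # eighth-note grid

mid = mido.MidiFile(ticks_per_beat=TICKS_PER_BEAT)
track = mido.MidiTrack()
mid.tracks.append(track)

pending = 0                       # accumulated rest time in ticks
for _ in range(32):               # 32 steps of "composition"
    if random.random() < 0.25:    # occasional rest: just let time accumulate
        pending += STEP
        continue
    note = random.choice(SCALE) + random.choice([0, 12])   # sometimes jump an octave
    track.append(mido.Message('note_on', note=note,
                              velocity=random.randint(60, 100), time=pending))
    track.append(mido.Message('note_off', note=note, velocity=0, time=STEP))
    pending = 0

mid.save('generated_pattern.mid')  # open in any DAW or soft synth
```

Swap the scale, the probabilities or the rhythm grid and you get endless "mood" material with no artist in the loop, which is exactly the cheap-playlist incentive I'm talking about.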
That's so interesting, thanks.
What gets me, though, is how willing people are to settle for "background music".
I've had a few experiences lately where someone has music playing, I ask about it, and I get a shrug. "It's just a Spotify playlist. I don't know what's on it."
So different from the sort of engagement I grew up with. The desperate need to know everything about what we were listening to.
Makes me wonder what the literary equivalent of "background music" is?
I totally agree - I think it probably works both ways. People are listening to music more in the background because Spotify facilitates and encourages that, and of course they know the dark patterns that work to push people that way. Why? Because they pay lower royalties on automatically-generated playlists.
But on the flipside, there's a cohort (perhaps the greater cohort) of people who always treated music as a background thing - it was radio before, now it's more likely to be Spotify. But I wonder whether, in the decades when, to choose music, you had to purchase an album or single on vinyl, cassette or CD (and that brief period of digital purchases, predominantly in iTunes, before streaming became easy), maybe that pushed a sub-cohort of those people into more deliberative listening? And now that's not necessary, they default to non-interactive mood music?
This represents a different tech-mediated change from the AI content now slipping out, but perhaps in all media (including written word, video etc) the lack of deliberation and engagement is a big driver for AI being acceptable? If you're not paying attention, do you not really care about the dumb flubs and inaccuracies?
It annoys the @@@@ out of me that the likes of Zuckerturd would pursue people for piracy etc. and whine about IP breaches, then go and do the same thing wholesale and in secret, denying the authors of the work they're using to make billions an income they're entitled to.
That's the business model. Nice work if you can get it. Ugh.
Enlightening exploration, Tim.
I have had conversations like this a lot of late with both other creative artists and with academics/professionals in IT, and am still formulating my thoughts on it.
When I looked at what work of mine has been "scraped" by Zuckerberg, I discovered that I had two different emotional responses, depending on the material. (I do like the word "scraped" in this context. It conveys an appropriate level of mechanistic violence.)
Learning that my novel, The Second Cure, has been plundered (another apt word) left me viscerally repulsed. As I said to another novelist, my blood turned to lava. I would feel the same way if I learned my librettos or screenplays were in the list, but it seems they're not (maybe they have been elsewhere. Who knows? ...which is in itself galling). When Richard Flanagan says, "I felt as if my soul had been strip mined and I was powerless to stop it", I completely get it.
But I also discovered that a scientific paper on the genetics of a species of Australian lizard that I co-wrote has also been stolen. My response to that is anger, but nothing like the feeling of betrayal that I feel about the novel.
Both entailed copious labour, physical and mental. Both were original works with original ideas. But my emotional response is different, and I don't really know why.
Is it because of the types of creativity entailed? Is it because scientific knowledge exists for the common good, for everyone to benefit from, while fiction is intensely personal?
Like I say, I don't know, but I found my two different responses fascinating.
Thanks for the thoughtful piece.
Yes, interesting and useful distinction to draw, Margaret. I have a similar feeling in that, on one level, I am glad the ideas are being "shared", circulating, in some form, to maybe influence the thoughtsphere. Maybe I'm just rationalising.
I ultimately do feel that there is some value in AI as a way of organising data and giving people access to information and happily admit I use it for that purpose. But sheesh, the "business" model.
One of the things AI can't do is recognise 'an idea whose time has come'.
Some of the massive leaps in human understanding were built on a base of ideas that could never be recognised until the idea took form.
AI can repeat trends, and if the information it scrapes contains some quality input instead of GIGO then it will improve, but I can't see how it can 'understand' that new little bits make a stronger whole. It will never get the 'eureka' moment, but it will explain the displacement of fluids.
I think that's true, Graeme.
I think it is.
I'm still trying to get my head around it all.
Tim
Thanks, it gives me hope.
I'm annoyed but not much because culture has always been about others "scraping" your work, isn't that what influence is? As an artist I have always wanted to influence people and events, I don't care if that happens via AI summarising me, with or without attribution. Surely scraping is the sincerest form of flattery, if I may scrape Oscar Wilde.
One of the comments really hits the nail on the head: because what AI does is summarise, it may be able to extrapolate what might come next, but a lifetime as an artist tells me that the future never ever happens like that. The future is always about to be sideswiped by something unpredictable, just as open source, free, socialist AI just decimated the entire US plan to use AI for world domination. I don't think the implications of that have really sunk in yet for most people.
The critical issue now is to clarify the difference between AI as a tool for sharing the commons versus AI as a tool of capitalist control and exploitation. There is hope there, given that socialist models (i.e. China) are clearly succeeding more than neoliberal models (i.e. the sinking USA).
Totally agree about "scraping", with maybe a slight proviso about acknowledgement where possible. But sometimes we do not realise what we scrape; we just live in that water.
But yeah, the point about capital exploitation v. socialisation of influences is where the real argument is for me.