If you made a painting for me, and then I started making copies of it without your permission and selling them off, while I might not have stolen the physical painting, I have stolen your art.
Just because they didn't rip his larynx out of his throat doesn't mean you can't steal someone's voice.
We're getting into semantics, but it's counterfeit, not stolen.
It would be more like if you made a painting for me, and I then used it to replicate your artistic style, made new paintings without your permission, and passed them off as your work.
I think it's important to remember how this used to happen.
AT&T paid voice actors to record phoneme groups in the 90s/2000s and have been using those recordings to train voice models for decades now. There are about a dozen AT&T voices we're all super familiar with because they're on all those IVR/PBX replacement systems we talk to instead of humans now.
The AT&T voice actors were paid for their time and not offered royalties, but they were told that their voices would be used to generate synthetic computer voices.
This was a consensual exchange of work. It wasn't super great long term, since there were no royalties and it was really just "work for hire" that turned into a product, but that aside, the people involved all agreed to what they were doing and what their work would be used for.
The problem at the root of all the generative tools is ultimately one of consent. We don't permit the arbitrary copying of things that are perceived to be owned by people, nor do we think it's appropriate to do things without people's consent with their image, likeness, voice, or written works.
Artists tell politicians to stop using their music all the time, etc. But ultimately, until we get a real ruling on what constitutes a "derivative" work, nothing will happen. An AI is effectively a derivative work of all the content that makes up the vectors that represent it, so it seems a no-brainer, but because it's "radio on the internet" we're not supposed to be mad at Napster for building its whole business on breaking the law.
I think a more interesting (and less dubious) example of this would be Vocaloid and, to a greater extent, CeVIO AI.
Vocaloid is a synth bank where, instead of the notes being musical instruments, they're phonemes which have been recorded and then packaged into a product you pay for, which means royalties are involved (I think there might also be a thing with royalties for big performances and whatnot). CeVIO AI takes this a step further by using AI to better smooth together the phonemes and make pitching sound more natural (or not: it's an instrument, and you can break it in interesting ways if you try hard enough). And obviously, the voice providers consented to that specific thing and get paid for it. They gave Yamaha/Sony/the general public a specific character voice and permission to use that specific voice.
(There are FOSS voicebanks too, but that adds a different layer of complication, since I think a lot of them were recorded before the idea of an "AI bank" was even a possibility. And while a paid voice bank is a proprietary thing, the open source alternatives are literally just a big folder of .WAVs, so it's much easier to take them outside their intended purposes.)
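Mechanically, a classic (pre-AI) voicebank is just that folder of labeled phoneme recordings, and the simplest thing an engine can do with it is splice clips end-to-end. Here's a minimal sketch in Python using only the stdlib `wave` module (the function name and file names are hypothetical); a real engine like Vocaloid also crossfades and pitch-shifts at the joins, which this deliberately omits, and that missing smoothing is exactly what makes naive concatenation sound robotic and what CeVIO AI's model improves on.

```python
import wave

def concatenate_phonemes(phoneme_paths, out_path):
    """Naively splice phoneme WAV clips end-to-end into one output WAV.

    All clips are assumed to share the same sample rate, sample width,
    and channel count (true within a single voicebank). No crossfading
    or pitch adjustment is done at the joins.
    """
    out = None
    try:
        for path in phoneme_paths:
            with wave.open(path, "rb") as clip:
                if out is None:
                    # Copy format (channels, width, rate) from the first clip.
                    out = wave.open(out_path, "wb")
                    out.setparams(clip.getparams())
                out.writeframes(clip.readframes(clip.getnframes()))
    finally:
        if out is not None:
            out.close()  # Finalizes the WAV header with the real frame count.
```

An AI bank starts from the same kind of recordings but learns transitions instead of butting clips together, which is why the consent question ("what may this voice be used for?") matters more once a model generalizes beyond the recorded phonemes.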
I don't think permission and consent alone can govern a labor relationship, because of the unbalanced position of power between employees and employers. Could the workers really negotiate better working conditions? They really can't, not without a union anyway.
Studios basically want to own the personas of their actors so they can decouple the actual human from it and just use their images. There's been a lot of weird issues with this already in videogames with body capture and voice acting, and contracts aren't read through properly or the wording is vague, and not all agents know about this stuff yet. It's very dystopian to think your whole appearance and persona can be taken from you and commodified. I remember when Tupac's hologram performed at Coachella in 2012 and thinking how fucked up that was. You have these huge studios and event promoters appropriating his image to make money, and an audience effectively watching a performance of technological necromancy where a dead person is re-animated.
Among those warning about the technology’s potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.
Speaking at a news conference as the strike was announced, union president Fran Drescher said AI “poses an existential threat” to creative industries, and said actors needed protection from having “their identity and talent exploited without consent and pay.”
As AI technology has advanced, doctored footage of celebrities and world leaders—known as deepfakes—has been circulating with increasing frequency, prompting warnings from experts about artificial intelligence risks.
At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told “in no uncertain terms” that a studio would keep his image and do what they liked with it.
Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year’s Dreamforce conference that he had concerns about the rise of AI in Hollywood.
A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industry’s official collective bargaining representative, was not available for comment when contacted by Fortune.
The original article contains 911 words, the summary contains 213 words. Saved 77%. I'm a bot and I'm open source!
Get your head out of your ass. Their voices are their art, and to replicate that is not only disturbing, it's morally wrong. Especially if you do so for profit.
Nobody complained about copyright when Microsoft had the only image AI in the game; only when the open source Stable Diffusion came out did they start screeching about how AI was "stealing their jobs."
AI can very easily be abused, and I don't see how this is related to the tech being open sourced or not. Fighting to ensure you aren't exploited is fine, and I support anyone who fights against exploitation.
I don't get why so many people feel the need to defend big corporations this much. It's not like the corporations are going to share the profits with the people defending them, nor do they likely care.
If anything, the industry will just use whatever ammo they can to exploit more people.
Without maintaining and creating protections, they will roll back until there are almost none. Our current labor rights didn't come for free, they were fought for.
It's funny how all (or at least most) of the parents of those "artists" told them to do or learn something real, and now they're getting their comeuppance for their bad choice.
I've argued with someone about how pictures made by Stable Diffusion are not Art, while there are literally "paintings" where the "artist" just jizzed on the canvas and it got declared Art. I trolled him by sending him several generated anime pictures and asking which one was "Art," because he said he could recognize Art. He chose one and fell into the trap.
ChatGPT is why the public is scrambling about AI. AI art has been around a while, and there's always been complaining because it's lame compared to real artists. This has fuck all to do with it suddenly being open source AI.
See, I'm pulling the smartest move right now: AI can't take your job if you use AI to take your own job first.
Besides, I think Hollywood is pretty behind on tech overall. Current state-of-the-art voice generators are still pretty bad, and it'll be a very long time before they can match actors in quality (if ever): if you train an AI voice on audiobooks, the generated voice is going to sound like someone narrating an audiobook, which really doesn't sound natural for dialogue at all.
I think the key point, then, isn't to ban generative transformer-based AI: once the tech is out of its box, you can't exactly put it back in again. (heh) The real question to ask is: who should own this technology so that it does good and helps people in the world, instead of being used to take away people's livelihoods?
Wrong. The real question is why we presuppose that the output of creatively driven individuals must generate profit for a capitalist economy in order to have sufficient value that those people are permitted the basic necessities of life. Frankly, I suspect most of our most valuable contributors to culture are never given the opportunity to be bad for long enough to develop into their potential.
This whole "oh no, AI is going to take away our livelihoods" notion fundamentally accepts the false premise that people only deserve a functional life so long as the primary activities of that life ultimately contribute to increasing the wealth of a tiny percentage of individuals.
It's the same mistake that leads us to massively undersupport educators and carers and will have people freaking out about how they'll "earn a living" once robots are able to do everything we practically require to be done.
People are fundamentally entitled to a living. If someone is being denied one, then look at the system that causes that, not the specifics of that particular flavour of how it's happening.
Since it is paywalled I can only guess from the title.
I don't understand the problem. He was paid for reading books, and now we all have his voice. What did he expect?
Is there an AI imitating his voice and making money? Is it being presented under his name? If not, what would be the difference from some person imitating his voice? Would that be stealing too?
Basically, I don't see any problem with me buying those books, training a local model, and giving it other books to read. That can't be illegal, right?
Giving it to other people while using his name would definitely be fraud. But stealing? I don't know.
Selling it to other people under another name... I don't see a problem.
But then we come to AI-generated images, and I do start thinking along those lines. Though if they can find someone who looks like him, and another person who sounds like him... they're all good?
This is from a guy who advocates for Linux because it's open source!
The only violation here would be if someone used that voice while claiming it was Fry. That would be fraud. Otherwise there is no issue.