It would harm the A.I. industry if Anthropic loses the next part of the trial on whether they pirated books — from what I’ve read, Anthropic and Meta are suspected of getting a lot off torrent sites and the like.
It’s possible they all did some piracy in their mad dash to find training material, but Amazon and Google have bookstores, and Google even has a book text search engine (Google Books), plus Google Scholar and probably everything else already in its data centers. So I'm not sure why they’d have to resort to piracy.
I agree that we need open source and to emancipate ourselves. The main issue I see is: the entire approach doesn't work. I'd like to give the internet as an example. It's meant to be very open, to connect everyone and enable them to share information freely. It is set up to be a level playing field... Now look at what that leads to: trillion-dollar mega-corporations, privacy issues everywhere, and big data silos. That's what the approach promotes. I agree with the goal. But in my opinion the approach will turn out to lead to less open source and more control by rich companies. And that's not what we want.
Plus nobody even opens the walled gardens. Last time I looked, Reddit wanted money for data. Other big platforms aren't open either. And there's kind of a small war going on between scrapers and crawlers on one side and the countermeasures on the other. So it's not as if it's open as of now.
A lot of our laws are indeed obsolete. I think the best solution would be to force copyleft licenses on anything using publicly created data.
But I'll take the wild west we have now, with no walls, over any kind of copyright dystopia. Reddit did successfully sell its data to Google for 60 million. Right now, you can legally scrape anything you want off Reddit; it is an open garden in every sense of the word (even if they don't like it). It's a lot more legal than using pirated books, but Google still bet 60 million that copyright laws would swing broadly in their favor.
I think it's very foolhardy to even hint at a pro-copyright stance right now. There is a very real chance of AI getting monopolized, and this is how they will do it.
I agree a copyright dystopia wouldn't be any good. Just mind that the wild west, or the law of the jungle, amounts to the right of the strongest. You're advantaging big companies and disadvantaging smaller players, people with ethics, and those who are more open and transparent.
And I don't think the legality of web scraping is the biggest issue. Sure, maybe I could do it if it were actually possible. But I occasionally do some weird stuff, and most services have countermeasures in place. In reality I just can't scrape Reddit. Lots of bots and crawlers just don't work any more. I'm getting rate limited left and right by all the big platforms. Lots of things require an account these days, and services are quick to ban me for "suspicious activity". It's barely possible to download YouTube videos these days. So, no. I can't. While Google can just pay for it and have the data.
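To make the "rate limited left and right" point concrete, here's a minimal sketch in Python (using the requests library) of what a small, unauthenticated scraper against Reddit's public .json listing tends to look like. The endpoint, User-Agent, and backoff numbers are my own illustrative assumptions, not anything the platform documents or guarantees.

```python
import time
import requests

# Minimal sketch: unauthenticated listing fetch with naive rate-limit handling.
# Endpoint, User-Agent, and backoff values are illustrative assumptions.
URL = "https://www.reddit.com/r/programming/new.json"
HEADERS = {"User-Agent": "small-hobby-scraper/0.1"}  # default UAs are often blocked outright

def fetch_listing(after=None, max_retries=3):
    params = {"limit": 100}
    if after:
        params["after"] = after
    for attempt in range(max_retries):
        resp = requests.get(URL, headers=HEADERS, params=params, timeout=10)
        if resp.status_code == 200:
            return resp.json()
        if resp.status_code == 429:
            # Rate limited: honor Retry-After if present, otherwise back off exponentially.
            wait = int(resp.headers.get("Retry-After", 2 ** attempt * 10))
            time.sleep(wait)
            continue
        if resp.status_code in (401, 403):
            # Blocked or login required: exactly the wall described above.
            raise RuntimeError(f"Blocked: HTTP {resp.status_code}")
        resp.raise_for_status()
    raise RuntimeError("Gave up after repeated 429s")

if __name__ == "__main__":
    page = fetch_listing()
    print(len(page["data"]["children"]), "posts fetched")
```

Even something this polite tends to hit 429s or outright 403s quickly without an account and API credentials, while a paying licensee just gets the data delivered in bulk.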
Also, Reddit isn't really the benevolent underdog here. They're a big company as well. And they're not selling their data... they're selling their users' data. They're mainly monetizing other people's creations.
If you aren't allowed to freely use data for training without a license, then the fear is that only large companies will own enough works or be able to afford licenses to train models.
Companies like record studios, which already own all the copyrights, aren't going to pay creators for something they already own.
All the data has already been signed away. People are really optimistic about an industry that has consistently fucked everyone they interact with for money.
It is entirely possible that the entire construct of copyright just isn't fit to regulate this, and that a "right to train" (or a right to opt out of training) needs to be formulated separately.
The maximalist, knee-jerk assumption that all AI training is copying is feeding into the interests of, ironically, a bunch of AI companies. That doesn't mean that actual authors and artists don't have an interest in regulating this space.
The big takeaway, in my book, is that copyright is finally broken beyond all usability. Let's scrap it and start over with the media landscape we actually have, not the eighteenth-century version of it.
I'm fairly certain this is the correct answer here. Also, there is a separation between the judiciary and the legislature. It's the former that is involved here, but we really need to bother the latter. It's the only way, unless we want to use 18th-century tools on the current situation.
Yes. But then do something about it. Regulate the market, or pass laws which address this. I don't really see why we should do something like this then; it still kind of contributes to the problem, as free rein still advantages big companies.
(And we can write whatever we like into law. It doesn't need to be a stupid and simplistic solution. If you're concerned about big companies, just write that they have to pay a lot and small companies don't. Or force everyone to open their models. Those are all options which can be formulated as new rules, and they would address the issue at hand.)
I've been pirating since Napster, never have hidden shit. It's usually not a crime, except in America it seems, to download content, or even share it freely. What is a crime is to make a business distributing pirated content.
I know, but you see what they're doing with AI: a small server used for piracy and sharing is punished, in some cases, worse than theft. AI businesses are making bank (or are they? There is still no clear path to profitability) on troves of pirated content. This, for small guys like us, is not going to change the situation. For instance, if we used the same dataset to train some AI in a garage, with no business or investor behind it, things would be different. We're at a stage where AI is quite literally too important to fail for somebody out there. I'd argue that AI is, in fact, going to be shielded for this reason regardless of previous legal outcomes.
Agreed. And even if it were, it's always like this. Anthropic is a big company. They likely have millions available for good lawyers, while the small guy doesn't. So they're more able to just do stuff and brush aside some legal restrictions, or pay a fine that's pocket change for them. Big companies always have more options than the small guy.