I do understand why so many people, especially creative folks, are worried about AI and how it’s used. The future is quite unknown, and things are changing very rapidly, at a pace that can feel out…
Statistical analysis of existing literary works is certainly not the same sort of thing as generating new literary works based on models trained on old ones.
Almost all of the people who are fearful that AI is going to plagiarize their work don't know the difference between statistical analysis and generative artificial intelligence. They're both AI, and unfortunately in those circles it seems anything even AI-related is automatically bad without any further thought.
I wouldn't characterize statistical analysis as "AI", but sadly I do see people (like those authors) totally missing the differences.
I'm generally hesitant about AI stuff (particularly with the constant "full steam ahead, 'disrupt' everything!" mindset that is far too prevalent in certain tech spheres), but what I saw described in this article looks really, really cool. The one bit I'm hesitant about is where actual pages are presented (since that is actually presenting a segment of the text), but other than that it's really sad to see this project killed by a massive misunderstanding.
And yet it was attacked. The reality is content creators have only contempt for the concept of fair use. Another example is copyright strikes on unfavorable reviews.
AI people just love to disingenuously claim that anybody who criticizes AI "fears" the technology. This is their way of dismissing all critics or skeptics as Luddites, and it is usually based entirely on their desire to profit somehow off of the trend.
Artists don't "fear" AI... They simply want big tech billionaires to stop stealing their copyrighted art works or other intellectual property in the hopes of generating infinite junk "content".
If you want artists to embrace AI, then you'd better be willing to start paying them to license their artwork for AI training.
Your comment doesn't appear to apply to this article at all. It explicitly states that this tool was neither stealing copyrighted art nor a billionaire-funded venture.
In this case it really was the unfounded fear of AI that killed a useful tool via misplaced outrage.
My brother in Christ, if I steal all of your writings and art when you're not looking, chop them up, eat them, and shit them out, they are still your creations-- just now covered in shit, garbled up, and without your original thoughts and intentions put behind them. If I then sell the pile of shit to someone, I am profiting from your labor.
I would be less inclined to hate this if I got some form of royalty or even some form of compensation for the hours and hours I've spent planning, creating, editing, and studying to make my things.
There are also financial incentives to oppose the adoption of content generating AI. As the spinning jenny replaced hand spinning and electric trolleys replaced horse drawn streetcars, there was always strong financially motivated opposition. How is it different this time?
Mechanical inventions of the past were invented, designed, and implemented by people who had a unique idea for how to better accomplish some task. If parts of their invention were already patented by someone else, they would be required to either license those patents or find another novel approach.
Machine learning AI doesn't work that way. In order to produce any result (let alone a good one) it must be "trained" on a dataset of other people's works, or people's faces, or whatever (depending on the desired result). All I ask is that people (artists, writers, musicians, etc.) are fairly and regularly compensated when their copyrighted work is used to train AI.
Anything else is exploitation on an industrial scale.
darn, this is kinda sad. This is like research on existing works, rather than generating new ones and potentially exploiting them without attribution. It’s like another way of consuming and interpreting the content, much like how we read/watch books/movies and interpret them. We really are moving too quickly and it’s hard to have these conversations in a meaningful way.
So THIS is the article that has all those writers on Bluesky ranting.
For me, I don't see HOW this is a useful tool at all. It's... a word counter. It counts the number of times you use a word. Someone had a screencap of his "vividness" rankings on words, and it gave "wintery" a higher score than "permafrost." Why? How does it know that one word is more vivid than the other? What's the standard here? This sort of thing is very subjective.
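For what it's worth, a tool like that is probably little more than a frequency count plus a per-word lookup table. Here's a minimal sketch of what I mean; the function names, the lexicon, and the scores are entirely my own guesses, not how his tool actually works:

```python
from collections import Counter
import re

# Hypothetical "vividness" lexicon. The scores are arbitrary, which is exactly
# the point: if someone (or some model) assigned "wintery" a bigger number
# than "permafrost", that's the whole standard.
VIVIDNESS = {"wintery": 0.8, "permafrost": 0.6, "said": 0.1}

def word_counts(text: str) -> Counter:
    """Count how many times each word appears in the text."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def vividness_report(text: str) -> list[tuple[str, float]]:
    """Rank the distinct words in a text by their lexicon score, highest first."""
    counts = word_counts(text)
    return sorted(
        ((word, VIVIDNESS.get(word, 0.0)) for word in counts),
        key=lambda pair: pair[1],
        reverse=True,
    )

if __name__ == "__main__":
    sample = "The wintery wind said nothing to the permafrost."
    print(word_counts(sample).most_common(3))
    print(vividness_report(sample)[:3])
```

If that's roughly the mechanism, then the rankings are only as meaningful as whoever built the lookup table, which is why they read as arbitrary.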
And he starts with Vonnegut's shape of stories, but an LLM can't recognize rising and falling action, so how could it do such a comparison?
Honestly, the WHOLE thing sounds like he's trying to create a formula for good writing, and you can't pin down good writing like that.
This is not a useful tool. It's a tool that will get people caught in the weeds like they do with narrative outlines like the Hero's Journey and lists of tropes. It will churn out a bunch of writers people don't like who can't understand why they don't catch on when they are following all the rules.
This is sad but understandable. Authors, most of whom don't make enough money to call it a career, are being kicked from every side. In just the last handful of years, you have AI companies training on their works, companies demonstrating they're open to replacing writers with AI, the Internet Archive giving their books away for free, states trying to ban more and more books, etc. When you're kicked enough, everything looks like a threat.
Similar to music, I imagine there's going to need to be some shift in the industry but I don't think we've seen what that is yet. Patreon, physical merch, and live performances just don't seem to work as well for authors as they do for musicians.
That being said, this particular site is clearly fair use and I'm surprised AI was even mentioned anywhere in the conversation.
Maybe an unpopular opinion, but it's not the industries that need to change, but rather the paradigm of our economic system. Advanced technologies are not going anywhere, are only going to get more advanced, and are only going to be regulated in ways that continue funneling money to the wealthy. Anybody who says technology will never be able to do {x} is fighting a losing battle.
@jrburkh Thing is, you can't just tell a guy who's trying to scrape together enough for food that "We need to change the paradigm of our economic system." That's not a thing that can be done quickly or effectively right now, and writers need to protect their income NOW. The only thing that can be done is for them to aggressively protect their rights while lobbying the governments so they don't die while waiting for reform.
The tool is kinda bad for writers, rather than good, but it was totally done right. It didn't do anything to republish or redistribute at all. The complaints are akin to someone objecting to a critic rating their book by objective means.
Shit, if anyone gave a shit about my books, I'd goddamn volunteer them for the guy to use.
I don't really understand the tool itself, especially as I am not a writer, but if you're going to make this argument, you have to actually make the much harder argument that, after a decade of the gig economy, we should trust these tech bros not to lobby against mitigating the downsides, or trust institutions to disregard their lobbying.
I would even go as far as saying: an AI that trains on released books and can write new texts shouldn't be seen as bad either. Yes, there is a lingering question about compensating writers in some way, but looking at how these tools work makes you realize it will never generate a text as good as an original writer on its own. It can only ever be less than the median of all the works collected. And it will not store the original works; it only learns the style they are written in.
And I'm kind of scared as well. If we don't make AI happen and figure out the right monetization systems for it, another country will, and they might give zero fucks and start crawling the works anyway. We'll just lose the upper hand on the development.
And I am saying that as someone who will be on the other end too, soon. I am a music producer, developer and I do 3d compositions. I am a bit scared of what's about to change but mostly just stoked to see what different aspects of my work will become more and less important.
I still believe change is good. And I always will.