In an interview with Rolling Stone, Scott, who has directed several movies featuring AI, was asked if the technology worried him. He says he's always believed the...
Blade Runner director Ridley Scott calls AI a "technical hydrogen bomb" | "we are all completely f**ked"
I'm sure that a film director is an expert on the technical underpinnings of large language models, which are primarily used to generate blocks of text that have the appearance of being coherent.
Several departments where I work had massive layoffs in favour of implementing customized versions of GPT-4 chatbots (both client-facing services and internal stuff). That’s just the LLM end of AI.
That’s not even considering the generative image side of AI. I fear for my company’s graphics, web design, and UX/UI teams, who will probably be gone by this time next year.
I work freelance but occasionally needed to partner with artists and others. But now I use various “AI” projects and no longer need to pay people to do that work, as the computer can do it well enough.
I’m not some millionaire, I’m just a guy trying to save money to buy a house one day, so it’s not like a large economic impact, but I can’t be the only one.
I can tell you now that AI won't come for UX/UI teams, at least not in the near future. Clients are rarely able to really articulate what they need out of software, and until AI is smart enough to suss that out, we're good. That being said, I'm sure there will be companies that try to go that route, but I doubt it will work, again, in the near term.
I use Copilot in my work, and watching the ongoing freakout about LLMs has been simultaneously amusing and exhausting.
They're not even really AI. They're a particularly beefed-up autocomplete. Very useful, sure. I use it to generate blocks of code in my applications more quickly than I could by hand. I estimate that when you add up the pros and cons (there are several), Copilot improves my speed by about 25%, which is great. But it has no capacity to replace me. No MBA is going to be able to do what I do using Copilot.
As for prose, I've yet to read anything written by something like ChatGPT that isn't dull and flavorless. It's not creative. It's not going to replace story writers any time soon. No one's buying ebooks with ChatGPT listed as the author.
Saying this is like saying you're a particularly beefed-up bacterium. In both cases they operate on the same basic objective (survive and reproduce for you and the bacterium; guess the next word for the LLM and autocomplete), but the former is vastly more complex in how it achieves that goal.
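The "same objective, different complexity" point can be made concrete with a toy sketch. This is an illustrative bigram autocomplete, not anything from a real product: it predicts the next word from frequency counts, which is the same training target an LLM optimizes with billions of parameters instead of a lookup table.

```python
from collections import Counter, defaultdict

# Toy autocomplete: predict the next word from bigram counts.
# An LLM shares this objective (predict the next token) but learns it
# with a vastly more complex model than a frequency table.
corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict_next(word):
    """Return the word that most often follows `word` in the corpus."""
    counts = bigrams.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # "cat" follows "the" twice, more than any other word
```

The gap between this and GPT-4 is exactly the commenter's point: identical objective, wildly different sophistication in achieving it.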
Yes, I thought he was talking about the film industry ("we're fucked") and how AI is or would be used in movies. In which case he would be competent to talk about it.
But he's just confusing science-fiction and reality. Maybe all those ideas he's got will make good movies, but they're poor predictions.
Seriously, he's a director that made sci-fi movies. He has no qualifications whatsoever to answer this question. Of course, this will still rile up the critical thinking challenged crowd.
I used to think he had completely lost it when he had the characters act so dumb in his recent Alien universe films, for example when the crew of Prometheus took off their helmets, but after watching how large parts of society acted during COVID, I'm no longer sure.
Humans repeatedly make bad choices; somebody is going to be really, really dumb with their AI implementation once it gets to the level of actually being able to manage things.
And yet 90% of the population still has an anchoring bias from the projections about AI that people like him, Cameron, and the rest of the sci-fi contributors have made over the years.
I may not be a computer scientist in real life, but I directed a movie based on a short story written by someone else who isn't a computer scientist in real life.
Yes, because we should all take note of what the art student says about AI. This guy is, essentially, a clown in this field. Why should we listen to him?
He may not be an expert, but he has been thinking through different future scenarios resulting from technology for probably about 60 years. Although society has developed in that time, the basics remain the same. His thoughts on AI aren't new; what's new is how fast AI is moving now.
Christ, a good litmus test is that anyone who says "I'm afraid of AI because..." and then describes the end of modern civilization/the world can be dismissed.
This man's argument is literally "you could ask AI how to turn off all the electricity in Britain and then it would do it." Goddam.
I'm less afraid of the tech itself, and more afraid of how it can be wielded by power-hungry human beings. As long as it never has desires of its own, we're fine.
When the camera was invented, a lot of commercial artists lost their jobs. Why print an ad featuring a realistic drawing of your car, when you could just run a photograph?
People say they hate modernism, but it's a direct result of the photograph. Artists had to create things a photographer couldn't. What's the point of realism if it can be recreated effortlessly with the press of a button?
I do wonder what jobs AI will replace and what jobs it will create. How will this change the art world? Will artists start to incorporate text and hands with the right number of fingers into everything they do? Maybe human artists will cede all digital media to AI, instead focusing on physical pieces.
Simpler jobs. There's doomerism in the discussion of AI, and it seems like there are three camps: people who are scared of AI based on movies, people who have technical knowledge and know machines are still stupid, and people with technical knowledge who know some fuckwit is going to give the AI too much access and too much power. Just don't give them access to nuke codes.
Are you saying that Philip K Dick stole the story for Electric Sheep? Or that the film took its story from somewhere else and credited it as a Philip K Dick story?
AI will probably be the final and ultimate achievement of humanity. When we have created true strong AI, the path is clearly towards the irrelevancy of human kind.
It's not that we will cease to exist, but we will not remain top of the ladder for long after that. Our significance will be comparable to dogs.
AI is just humanity evolved. Why be afraid of a better humanity? We don't need to be flesh beings thrust out into this world from a wet slimy torn vagina or incision in the abdomen of a woman who severely regretted getting pregnant.
How is this existence better than what humanity will be through AI?
I never claimed any emotional attachment to human kind being dominant.
I think chances are good AI will still help us, even when machine intelligence doesn't need humanity anymore. Kind of like how we try to preserve history and nature we find worthy.
The above is by no means a doomsday prediction, but rather my understanding of how things will naturally evolve.
We may prolong our relevance with implants, but ultimately that too will be inferior to self improving AI.
We can absolutely call AI our children, and our children will surpass us.
By "Bladerunner", do you mean the movie that stole its plot and characters from previous books without giving any acknowledgement to the authors? That "Bladerunner"?
They tried to hide the fact it's just a movie adaptation of Do Androids Dream of Electric Sheep? Never heard that before. That seems weird, especially since a lot of the copies of the book sold now use the Blade Runner name.
I don't think they tried to hide that fact, and also it's very different from DADoES too. They're generally the same story with characters using the same names and stuff, but they have different focuses.
Mr. Jeff Walker,
The Ladd Company,
4000 Warner Boulevard,
Burbank, Calif. 91522.
Dear Jeff:
I happened to see the Channel 7 TV program "Hooray For Hollywood" tonight with the segment on BLADE RUNNER. (Well, to be honest, I didn't happen to see it; someone tipped me off that BLADE RUNNER was going to be a part of the show, and to be sure to watch.) Jeff, after looking --and especially after listening to Harrison Ford discuss the film-- I came to the conclusion that this indeed is not science fiction; it is not fantasy; it is exactly what Harrison said: futurism. The impact of BLADE RUNNER is simply going to be overwhelming, both on the public and on creative people -- and, I believe, on science fiction as a field. Since I have been writing and selling science fiction works for thirty years, this is a matter of some importance to me. In all candor I must say that our field has gradually and steadily been deteriorating for the last few years. Nothing that we have done, individually or collectively, matches BLADE RUNNER. This is not escapism; it is super realism, so gritty and detailed and authentic and goddam convincing that, well, after the segment I found my normal present-day "reality" pallid by comparison. What I am saying is that all of you collectively may have created a unique new form of graphic, artistic expression, never before seen. And, I think, BLADE RUNNER is going to revolutionize our conceptions of what science fiction is and, more, can be.
Let me sum it up this way. Science fiction has slowly and ineluctably settled into a monotonous death: it has become inbred, derivative, stale. Suddenly you people have come in, some of the greatest talents currently in existence, and now we have a new life, a new start. As for my own role in the BLADE RUNNER project, I can only say that I did not know that a work of mine or a set of ideas of mine could be escalated into such stunning dimensions. My life and creative work are justified and completed by BLADE RUNNER. Thank you...and it is going to be one hell of a commercial success. It will prove invincible.
Yes, the systems that we created and control are running rampant. Did you see the Spanish model? There'll be an army of incels worshipping ChatGPT by week's end! RUN!
I think AI advances will continue to be just fast enough to have occasional "punctuation points" of short-lived buzz in the media. For example, I can see it getting good enough (and easy enough to use) that average normies will be able to create their own movies and games with it.
But, AI advances will remain slow enough to lull people into apathy about it (like global warming). It will very gradually encroach into more and more embedded systems, infrastructure, and cloud resources.
And at some point after that, it will accelerate in sudden and unexpected ways. I don't know if it will be a good thing or a bad thing when that happens. But considering how many tech bros and executives are sociopaths with no ethics, I'm not very optimistic it will be a good thing.
I think that this has been grossly overblown with regard to the available ‘AI’ related stuff. Sure, some of it is cool, but a lot of it isn’t ready to be a real product. It amazes me that all these companies are willing to make themselves liable for what these things will undoubtedly say.
A lot of AI systems are just tools: good when used well, bad when used badly.
I think the problem is how fast this is moving, in regard to both software and hardware, and how accessible something that could easily be weaponized is to normal people.
That's the potential for trouble I can see. For example, Auto-GPT, with improvements to the tool itself and its dependencies, could make for something pretty powerful that could be set loose with a very wide remit and absolutely no limit on what it will do to achieve its goal.
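The pattern behind tools like Auto-GPT can be sketched without any of its actual code. This is a minimal, hypothetical agent loop, not Auto-GPT's real implementation: the model proposes an action, the harness executes it and feeds the result back, and nothing in the loop itself bounds what the actions do. `stub_model` here is an invented stand-in for the LLM call.

```python
# Generic agent-loop sketch (hypothetical, not Auto-GPT's actual code):
# propose an action, execute it, feed the result back, repeat.

def stub_model(goal, history):
    """Invented stand-in for an LLM: propose the next (action, argument)."""
    if len(history) < 3:
        return ("search", f"step {len(history) + 1} toward: {goal}")
    return ("finish", "goal reached")

def run_agent(goal, max_steps=10):
    """Loop until the model says 'finish' or the step budget runs out."""
    history = []
    for _ in range(max_steps):
        action, arg = stub_model(goal, history)
        if action == "finish":
            return history
        # Execute the action. Here it is only recorded; a real harness
        # runs searches, shell commands, API calls, etc. -- and that
        # unbounded execution is exactly the "wide remit" concern.
        history.append((action, arg))
    return history

steps = run_agent("summarize a topic")  # three recorded actions before the stub finishes
```

The only safety rails in this sketch are `max_steps` and whatever the execution step refuses to do, which is the commenter's worry: the loop itself imposes no limit on what actions are taken in pursuit of the goal.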
Time will tell, of course. But this is probably one of the fastest-moving topics in tech we've ever had, if not the fastest.