"I want to live forever in AI"
"I want to live forever in AI"
cross-posted from: https://lemmy.ml/post/14869314
"I want to live forever in AI"
"I want to live forever in AI"
cross-posted from: https://lemmy.ml/post/14869314
"I want to live forever in AI"
Consciousness and conscience are not the same thing; this naming is horrible
This just makes it more realistic
Hey, just be glad I changed it from asdf_test_3, okay?
If anyone's interested in a hard sci-fi show about uploading consciousness they should watch the animated series Pantheon. Not only does the technology feel realistic, but the way it's created and used by big tech companies is uncomfortably real.
The show got kinda screwed over on advertising and fell to obscurity because of streaming service fuck ups and region locking, and I can't help but wonder if it's at least partially because of its harsh criticisms of the tech industry.
Just FYI, content warning for Pantheon: there is a seriously disturbing gore/kill scene in the first season that is animated too well. Anyone who has seen the show knows which scene I'm talking about. I found it pretty upsetting and almost didn't finish the show; I'm still a little upset that the scene is burned into my memory.
I really thought you were going to mention "Upload" on Prime. Same creator as The Office.
Yes, I just finished watching Pantheon and absolutely loved it!
Totally agree that it deserved more attention. At least it got a proper ending with season 2.
Also, the voice acting talent they got was impressive. Paul Dano was fantastic as one of the leads.
Checking in to see if this show was mentioned. Highly recommend! Well written
The game SOMA represents this case the best. Highly recommended!
Yes, I immediately thought about SOMA after reading the post. recommendations++
Soma is so fucking bleak and I love it
Did they ever allow for turning off head bob and blur? That game makes me motion sick to an insane degree.
Soma is a wonderful game that covers this type of thing. It does make you wonder what consciousness really is... Maybe the ability to perceive and store information, along with retrieving that information, is enough to provide an illusion of consistent self?
Or maybe it's some completely strange system, unknown to science. Who knows?
I don't think anything gave me existential doom quite as much as the ending of that game.
I think the definition of consciousness needs to not be solely about abilities or attributes; it needs to account for the active process of consciousness. A hair dryer can burn things, but a fire is things burning. Without that active nature, it's simply not conscious.
The comic sans makes this even deeper
What if you do it in a ship of theseus type of way. Like, swapping each part of the brain with an electronic one slowly until there is no brain left.
Wonder if that will work.
If I remember right, the game The Talos Principle calls that the Talos principle
The tv show Pantheon figures it will work, but it will be very disturbing.
Was looking for the Pantheon reference in this thread! Just finished that show and loved it. Of course it takes plenty of liberties for the sake of the storytelling, but still, at least it explores these interesting topics!
Anyone reading this thread, do yourself a favor and check out Pantheon!
Right? Like what if as cells die or degrade instead of being replaced by the body naturally they are replaced by nanites/cybernetics/tech magic. If the process of fully converting took place over the course of 10 years, then I don't see how the subject would even notice.
It's an interesting thing to ponder.
would've made more sense if it was rust
(or is the copy intentional here?)
Plottwist: consciousness is : Copy
It's pinned and !Unpin
, and only has private constructors.
Uploading is a matter of implementing Clone
```rust
#[derive(Clone, Copy)]
struct Consciousness { ... }

fn upload_brain(brain: Consciousness) -> Result<(), Error>
```
If we're gonna have a dystopian future, then damn it, it's gonna be memory safe.
The semantics in Rust would be completely out of whack. What does ownership mean?
I guess the point of the joke is that consciousness is a shallow value.
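To make the joke concrete, here's a minimal sketch (the type and field names are made up for illustration): if consciousness derives `Copy`, then "uploading" it by value just duplicates the bits, and the original stays behind, fully usable.

```rust
// Hypothetical sketch: a Copy type is duplicated on every
// pass-by-value; the original is never consumed.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Consciousness {
    id: u32,
}

fn upload(brain: Consciousness) -> Consciousness {
    brain // the "uploaded" instance is a bitwise copy
}

fn main() {
    let me = Consciousness { id: 42 };
    let copy = upload(me);  // `me` is NOT moved, because Copy
    assert_eq!(me, copy);   // two identical, independent values
    println!("original still here: {:?}", me);
}
```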
A copy is fine. I can still seek vengeance on my enemies from beyond the grave.
It's definitely an improvement to just being plain old dead
throws UserNotPaidException
the plot of ::: spoiler spoiler SOMA ::: in a nutshell?
Related book recommendation!!
Kiln People by David Brin - it's a futuristic murder-mystery novel about a society where people copy their consciousnesses into temporary clay clones to do mundane tasks for them. It has some really interesting discussions about what constitutes personhood!
Some of the concepts in this book really stuck with me, but I had no idea what the title was! Thanks!
"Some days you're the original, some days you're the copy" or something like that
There's a cool computer game that makes this point as part of the story line... I'd recommend it, but I can't recommend it in this context without it being a spoiler!
Guy's probably talking about
Lost the coin flip.
There's also a book with a similar concept. It's not the focus until later in the book though. It's called
I've had this thought and felt it was so profound I should write a short story about it. Now I see this meme and I feel dumb.
I saw a great comic about it once, one sec
Edit: more focused on teleportation, but a lot of the same idea. Here https://existentialcomics.com/comic/1
That's why I'm going for brain in a jar.
void teleport(Person person);
The best part is, unless that function name is misleading, it doesn't matter how the data is passed; a copy is being sent out over TCP/IP to another device regardless.
I had to turn my phone sideways and go cross-eyed to spot the difference.
Sorry Dave, I'm afraid I can't do that
I don't get it
The joke is that there are some people who think that by uploading themselves into a machine "to live forever," their consciousness will also be transferred, like when you travel by bus from one city to another. In reality, you "upload yourself," but that upload is not you; it's a copy of you. So, once the copy is done, you will still be in your original body, and the copy will "think" it is you, but it's not you. You continue to live in your body until you die, and, well, for you, that's it. You're dead. You're not living. You're finished. Everything is black. Void. Null. Done. Unless you believe in the afterlife, in which case you'll be in heaven, hell, purgatory or whatever, but the point is, you're no longer on Earth "living forever." That's just some other entity who thinks it is you, but it's not you (again, because you're dead).
This is represented by the parameters being passed by value (a copy) instead of by reference (same data) in the poster's image.
It wouldn't be you, it would just be another person with the same memories that you had up until the point the copy was made.
When you transfer a file, for example, all you are really doing is sending a message telling the other machine what bits the file is made up of, and then that other machine creates a file that is just like the original - a copy - while the original still remains in the first machine. Nothing is ever actually transferred.
If we apply this logic to consciousness, then to "transfer" your brain to a machine you would have to make a copy, which exists simultaneously with the original you. At that point in time, there would be two different instances of "you"; and in fact, from that point forward, the two instances would begin to create different memories and experience different things, thereby becoming two different identities.
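A toy model of that fork (struct and strings invented for the example): the two instances are identical at the instant of copying, then diverge as each accumulates its own memories.

```rust
// The original and the upload are equal at copy time,
// then become two different identities.
#[derive(Clone, Debug, PartialEq)]
struct Mind {
    memories: Vec<String>,
}

fn main() {
    let mut original = Mind { memories: vec!["childhood".to_string()] };
    let mut upload = original.clone();
    assert_eq!(original, upload); // identical at the instant of copying

    original.memories.push("watched my copy boot up".to_string());
    upload.memories.push("woke up inside a machine".to_string());
    assert_ne!(original, upload); // two different identities now
}
```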
The first line passes the argument by reference, ie, the object itself.
The second line passes the object by value, ie, a copy.
Also, in Rust that would be the opposite, which is funny but confusing.
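Right: for a non-`Copy` type, Rust's pass-by-value is a move, not a copy, so "upload" would actually consume the original. A small sketch (names invented for illustration):

```rust
// A non-Copy type is *moved* when passed by value:
// the original binding becomes unusable, like a real transfer.
struct Consciousness {
    memories: Vec<String>,
}

fn upload(brain: Consciousness) -> Consciousness {
    brain // ownership moved in, then out to the caller
}

fn main() {
    let me = Consciousness { memories: vec!["first day of school".to_string()] };
    let uploaded = upload(me);
    // println!("{}", me.memories.len()); // compile error E0382: `me` was moved
    assert_eq!(uploaded.memories.len(), 1);
}
```

So in Rust the meme's by-value signature would be the version that works: after the call, there is exactly one instance, and it's in the machine.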
Thank
Are you sure the roon of today is a reference to yesterday's roon?
What needs to happen for it to actually work:

```cpp
bool uploadConsciousness(Consciousness&& Conscience) {
    // rvalue reference: the original is moved from, not copied
    Consciousness uploaded = std::move(Conscience);
    return true;
}
```
everyone watch this clip and tell me what you think
https://www.youtube.com/watch?v=szzVlQ653as
what if it's year 3000 right now and we're all playing a game?
Intellisense commands you to fix your method
Even if it were possible to scan the contents of your brain and reproduce them in a digital form, there's no reason that scan would be anything more than bits of data on the digital system. You could have a database of your brain... but it wouldn't be conscious.
No one has any idea how to replicate the activity of the brain. As far as I know there aren't any practical proposals in this area. All we have are vague theories about what might be going on, and a limited grasp of neurochemistry. It will be a very long time before reproducing the functions of a conscious mind is anything more than fantasy.
Counterpoint, from a complex systems perspective:
We don't fully know, nor can we fully model, the details of neurochemistry, but we do know some essential features that we can model: action potentials in spiking neuron models, for example.
It's likely that the details don't actually matter much. Take traffic jams as an example: there are lots of details involved, driver psychology, the physical mechanics of the car, etc., but you only need a handful of very rough parameters to reproduce traffic jams in a computer.
That's the thing with "emergent" phenomena, they are less complicated than the sum of their parts, which means you can achieve the same dynamics using other parts.
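The traffic-jam point can be reproduced with the classic Nagel-Schreckenberg cellular automaton. Here's a rough sketch (road length, car count, and braking probability are arbitrary choices): three rules per car, accelerate, don't hit the car ahead, occasionally brake at random, and stop-and-go waves emerge with zero driver psychology in the model.

```rust
// Nagel-Schreckenberg-style traffic sketch on a ring road.
const ROAD: usize = 100;  // cells on the ring
const V_MAX: usize = 5;   // speed limit, cells per step
const N_CARS: usize = 35; // density high enough for jams

// Tiny linear-congruential RNG so the sketch needs no external crates.
fn lcg(state: &mut u64) -> u64 {
    *state = state
        .wrapping_mul(6364136223846793005)
        .wrapping_add(1442695040888963407);
    *state >> 33
}

fn step(pos: &mut [usize], vel: &mut [usize], rng: &mut u64) {
    let n = pos.len();
    for i in 0..n {
        // Empty cells between car i and the car ahead of it.
        let gap = (pos[(i + 1) % n] + ROAD - pos[i]) % ROAD - 1;
        vel[i] = (vel[i] + 1).min(V_MAX).min(gap); // accelerate, keep distance
        if vel[i] > 0 && lcg(rng) % 100 < 30 {
            vel[i] -= 1; // random braking (p = 0.3)
        }
    }
    for i in 0..n {
        pos[i] = (pos[i] + vel[i]) % ROAD;
    }
}

fn main() {
    // Start evenly spaced; `pos` stays ordered so car i+1 is ahead of car i.
    let mut pos: Vec<usize> = (0..N_CARS).map(|i| i * ROAD / N_CARS).collect();
    let mut vel: Vec<usize> = vec![0; N_CARS];
    let mut rng: u64 = 42;
    for _ in 0..200 {
        step(&mut pos, &mut vel, &mut rng);
    }
    // Sanity check: no collisions, all cars on distinct cells.
    let mut cells = pos.clone();
    cells.sort();
    cells.dedup();
    assert_eq!(cells.len(), N_CARS);
    println!("velocities after 200 steps: {:?}", vel);
}
```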
Even if you ignore all the neuromodulatory chemistry, much of the interesting processing happens at sub-threshold depolarizations, depending on millisecond-scale coincidence detection from synapses distributed through an enormous, and slow-conducting dendritic network. The simple electrical signal transmission model, where an input neuron causes reliable spiking in an output neuron, comes from skeletal muscle, which served as the model for synaptic transmission for decades, just because it was a lot easier to study than actual inter-neural synapses.
But even that doesn't matter if we can't map the inter-neuronal connections, and so far that's only been done for the 300 neurons of the C. elegans ganglia (i.e., not even a 'real' brain), after a decade of work. We're nowhere close to mapping the neuroscientists' favorite model, Aplysia, which only has 20,000 neurons. Maybe statistics will wash out some of those details by the time you get to humans' 10^11-neuron systems, but considering how badly current network models predict even simple behaviors, I'm going to say more details matter than we will discover any time soon.
I heard a hypothesis that the first human-made consciousness will be an AI algorithm designed to monitor and coordinate other AI algorithms, which makes a lot of sense to me.
Our consciousness is just the monitoring system of all our body's subsystems. It is most certainly an emergent phenomenon of the interaction and management of different functions competing or coordinating for resources within the body.
To me it seems very likely that the first human made consciousness will not be designed to be conscious. It also seems likely that we won't be aware of the first consciousnesses because we won't be looking for it. Consciousness won't be the goal of the development that makes it possible.
We don't even know what consciousness is, let alone if it's technically "real" (as in physical in any way.) It's perfectly possible an uploaded brain would be just as conscious as a real brain because there was no physical thing making us conscious, and rather it was just a result of our ability to think at all.
Similarly, I've heard people argue a machine couldn't feel emotions because it doesn't have the physical parts of the brain that allow that, so it could only ever simulate them. That argument has the same hole in that we don't actually know that we need those to feel emotions, or if the final result is all that matters. If we replaced the whole "this happens, release this hormone to cause these changes in behavior and physical function" with a simple statement that said "this happened, change behavior and function," maybe there isn't really enough of a difference to call one simulated and the other real. Just different ways of achieving the same result.
My point is, we treat all these things, consciousness, emotions, etc, like they're special things that can't be replicated, but we have no evidence to suggest this. It's basically the scientific equivalent of mysticism, like the insistence that free will must exist even though all evidence points to the contrary.
Also, some of what happens in the brain is just storytelling. Like, when the doctor hits your patellar tendon, just under your knee, with a reflex hammer. Your knee jerks, but the signals telling it to do that don't even make it to the brain. Instead the signal gets to your spinal cord and it "instructs" your knee muscles.
But, they've studied similar things and have found out that in many cases where the brain isn't involved in making a decision, the brain does make up a story that explains why you did something, to make it seem like it was a decision, not merely a reaction to stimulus.
This right here might already be a flaw in your argument. Something doesn’t need to be physical to be real. In fact, there’s scientific evidence that physical reality itself is an illusion created through observation. That implies (although it cannot prove) that consciousness may be a higher construct that exists outside of physical reality itself.
If you’re interested in the philosophical questions this raises, there’s a great summary article that was published in Nature: https://www.nature.com/articles/436029a
Consciousness might not even be “attached” to the brain. We think with our brains but being conscious could be a separate function or even non-local.
I read that and the summary is, "Here are current physical models that don't explain everything. Therefore, because science doesn't have an answer it could be magic."
We know consciousness is attached to the brain because physical changes in the brain cause changes in consciousness. Physical damage can cause complete personality changes. We also have a complete spectrum of observed consciousness, from the roundworm C. elegans with ~300 neurons to the chimpanzee with 28 billion. Chimps have emotions, self-reflection and everything but full language. We can step backwards from chimps to simpler animals and it's a continuous spectrum of consciousness. There isn't a hard divide; there's only less. Humans aren't magical.
Thank you for this. That was a fantastic survey of some non-materialistic perspectives on consciousness. I have no idea what future research might reveal, but it's refreshing to see that there are people who are both very interested in the questions and also committed to the scientific method.
I think we're going to learn how to mimic a transfer of consciousness before we learn how to actually do one. Basically we'll figure out how to boot up a new brain with all of your memories intact. But that's not actually a transfer, that's a clone. How many millions of people will we murder before we find out the Zombie Zuckerberg Corp was lying about it being a transfer?
What's the difference between the two?
ChatGPT is not conscious, it's just a probability language model. What it says makes no sense to it and it has no sense of anything. That might change in the future but currently it's not.
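A toy illustration of the "probability model" point (corpus and function invented for the example): even a bigram counter that just picks the most frequent next word produces plausible-looking continuations, with zero grasp of what any word means. Real LLMs are vastly bigger, but the output is still a next-token probability.

```rust
// Bigram "language model": count what follows each word in a corpus,
// then pick the most frequent successor. No meaning involved.
use std::collections::HashMap;

fn most_likely_next<'a>(corpus: &'a [&'a str], word: &str) -> Option<&'a str> {
    let mut counts: HashMap<&str, usize> = HashMap::new();
    for pair in corpus.windows(2) {
        if pair[0] == word {
            *counts.entry(pair[1]).or_insert(0) += 1;
        }
    }
    counts.into_iter().max_by_key(|&(_, c)| c).map(|(w, _)| w)
}

fn main() {
    let corpus = ["i", "think", "therefore", "i", "am", "i", "think", "so"];
    // "think" follows "i" twice, "am" once, so the model picks "think",
    // not because it understands anything, but because of the counts.
    assert_eq!(most_likely_next(&corpus, "i"), Some("think"));
}
```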
🥱
The only people with this take are people who don't understand it. Plus, growth and decline are an inherent part of consciousness; unless the computer can be born, change, then die in some way, it can't really achieve consciousness.