We do not understand genetic code as code. We have merely developed some statistical relations between some parts of the genetic code and some outcomes, but nobody understands the genetic code well enough to write even the equivalent of "Hello World!".
Gene modification consists of grabbing a slice of genetic code and splicing it into another genome. Impressive! Means we can edit the code. Doesn't mean we understand the code. If you grab the code for Donkey Kong and put it into the code of Microsoft Excel, does that mean you can throw barrels at your numbers? Or will you simply break the whole thing? Genetic code is very robust and has a lot of redundancies (that we don't understand), so it won't crash like Excel. Something will likely grow. But tumors are also growth.
Remember Thalidomide? At the time, they had better reason to think it was safe than we have today for thinking gene-editing babies is safe.
The tech bros who are gene-editing babies (assuming that they are, because they are stupid, egotistical, and wealthy enough to bend most laws) are not creating superbabies; they are creating new and exciting genetic disorders. Poor babies.
there's been some (what appears to me to be) remarkable progress in the field, in that I know it's possible to create intentional structures. it's very much not my field so I can't speak to it in detail; the best way I can describe where I understand it to be is that it's like people building with lego, if that makes sense?
but yeah it's still a damn far way off from what we'd call "gene programming" as we have "computer programming"
I wouldn’t say that modern computer programming is that hot either. On the other hand, I can absolutely see “no guarantee of merchantability or fitness for any particular purpose” being enthusiastically applied to genetic engineering products. Silicon Valley brought us “move fast and break things”, and now you can apply it to your children, too!
It's not that eugenics is a magnet for white supremacists, or that rich people might give their children an even more artificially inflated sense of self-worth. No, the risk is that the superbabies might be Khan and kick-start the eugenics wars. Of course, this isn't a reason not to make superbabies; it just means the idea needs some more workshopping via Red Teaming (hacker lingo is applicable to everything).
The commenter makes an extended, tortured analogy to machine learning... in order to say that maybe genes with correlations to IQ won't add to IQ linearly. It's an encapsulation of many lesswrong issues: veneration of machine learning, overgeneralization of comp sci into unrelated fields, a need to use paragraphs to say what a single sentence could, and a failure to actually state firm, direct objections to blatantly stupid ideas.
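To be fair to the buried point (if not the delivery): "won't add linearly" just means that stacking edits to variants that each correlate with IQ doesn't guarantee the effects sum. Here's a toy sketch with entirely made-up numbers, not anyone's actual model, contrasting a naive additive tally with a saturating, diminishing-returns one:

```python
import numpy as np

rng = np.random.default_rng(0)

n_variants = 500                                # hypothetical IQ-associated variants
effects = rng.normal(0.1, 0.05, n_variants)     # made-up per-variant effect sizes

# "Edit" an increasing number of variants and compare two toy models.
for n_edited in (10, 100, 500):
    additive = effects[:n_edited].sum()          # naive linear sum of effects
    diminishing = 15 * np.tanh(additive / 15)    # same inputs through a saturating curve
    print(f"{n_edited:3d} edits: additive gain ~{additive:5.1f}, "
          f"with diminishing returns ~{diminishing:5.1f}")
```

Same variants, wildly different predictions, once you stop assuming the genome is a spreadsheet of independent +0.1 IQ cells. That's the whole objection; it did not need paragraphs of machine learning analogy.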
Working in the field of genetics is a bizarre experience. No one seems to be interested in the most interesting applications of their research. [...] The scientific establishment, however, seems to not have gotten the memo. [...] I remember sitting through three days of talks at a hotel in Boston, watching prominent tenured professors in the field of genetics take turns misrepresenting their own data [...] It is difficult to convey the actual level of insanity if you haven’t seen it yourself.
Like Yudkowsky writing about quantum mechanics, this is cult shit. "The scientists refuse to see the conclusion in front of their faces! We and we alone are sufficiently Rational to embrace the truth! Listen to us, not to scientists!"
Gene editing scales much, much better than embryo selection.
"... Mister Bond."
The graphs look like they were made in Matplotlib, but on another level, they're giving big crayon energy.
Working in the [field] is a bizarre experience. No one seems to be interested in the most interesting applications of their research
depending on the field, it might be crackpottery or straight-up criminal. but if you post shit like this on linkedin, then it's suddenly "inspiring" and "thought-provoking"
Our knowledge has advanced to the point where, if we had a safe and reliable means of modifying genes in embryos, we could literally create superbabies
Am I misunderstanding the data? No, it is all the scientists who are wrong. (He is also ignoring the "scientists" who do agree with him, who all seem to have a special room for WW2 paraphernalia.)
So AGI is 0.5-2 years away, after which the singularity happens and, depending on how AI alignment goes, we are either immortal forever or everybody is diamondoid paperclips.
A normal human takes 18 years to grow to maturity. So, for the sake of argument (yes yes, don't hand it to ISIS), say a supergene baby can do that in 9 years (poor kid). Those timelines seem at odds with each other (and that is assuming the research were even possible now).
I know timelines and science fiction stories are a bit fluid, but come on, at least pretend you believe in it. I'm not saying he is full of shit, but... no wait, I am saying that.
As we know, the critical age for a boy genius is somewhere from 11 (Harry Potter) to 15 (Paul Atreides), so the gene-enhanced baby ought to have a fair shot after a few months or so.
The academic institutions in charge of exploring these ideas are deeply compromised by insane ideologies. And the big commercial entities are too timid to do anything truly novel; once they discovered they had a technology that could potentially make a few tens of billions treating single gene genetic disorders, no one wanted to take any risks; better to take the easy, guaranteed money and spend your life on a lucrative endeavor improving the lives of 0.5% of the population than go for a hail mary project that will result in journalists writing lots of articles calling you a eugenicist.
Superbabies is a backup plan; focus the energy of humanity’s collective genetic endowment into a single generation, and have THAT generation to solve problems like “figure out how to control digital superintelligence”.
Science-fiction solutions for science-fiction problems!
Let's see what the comments say!
Considering current human distributions and a lack of 160+ IQ people having written off sub-100 IQ populations as morally useless [...]
Dude are you aware where you are posting.
Just hope it never happens, like nuke wars?
Yeah, that's what ran the Cold War: hopes and dreams. JFC, I keep forgetting these are kids born long after 1989.
Could you do all the research on a boat in the ocean? Excuse the naive question.
No, please keep asking the naive questions, it's what provides fodder for comments like this.
(regarding humans having a "[F]ixed skull size" and therefore a priori being unable to compete with AI):
Artificial wombs may remove this bottleneck.
This points to another implied SF solution. It's already postulated by these people that humans are not having enough babies, or rather that the right kind of humans aren't (wink wink). If we assume that they don't adhere to the Platonic ideal that women are simply wombs and all traits are inherited from males, then to breed superbabies you need buy-in from the moms. Considering how hard it is for these people to have a normal conversation with the fairer sex, the task of both convincing a partner to have a baby and getting her to let some quack from El Salvador mess with its genes seems insurmountable. Artificial wombs will resolve this nicely. Just do a quick test at around puberty to determine the God-given IQ level of a female, then harvest her eggs and implant them into artificial wombs. The less intelligent ones can provide eggs for the "Beta" and "Gamma" models...
But you don't go from a 160 IQ person with a lot of disagreeability and ambition, who ends up being a big commercial player or whatnot, to 195 IQ and suddenly get someone who just sits in their room for a decade and then speaks gibberish into a youtube livestream and everyone dies, or whatever.
this whole "superbabies will save us from AI" presupposes that the superbabies are immune to the pull of LW ideas. Just as LW are discounting global warming, fascism etc to focus on runaway AI, who says superbabies won't have a similar problem? It's just one step up the metaphorical ladder:
LW: "ugh normies don't understand the x-risk of AI!"
Superbabies: "ugh our LW parents don't understand the x-risk of Evangelion being actually, like, real!"