the [simulated] are a convenient group of people to advocate for
I read this quote today, and it resonated:
“The unborn” are a convenient group of people to advocate for. They never make demands of you; they are morally uncomplicated, unlike the incarcerated, addicted, or the chronically poor; they don’t resent your condescension or complain that you are not politically correct; unlike widows, they don’t ask you to question patriarchy; unlike orphans, they don’t need money, education, or childcare; unlike aliens, they don’t bring all that racial, cultural, and religious baggage that you dislike; they allow you to feel good about yourself without any work at creating or maintaining relationships; and when they are born, you can forget about them, because they cease to be unborn. You can love the unborn and advocate for them without substantially challenging your own wealth, power, or privilege, without re-imagining social structures, apologizing, or making reparations to anyone. They are, in short, the perfect people to love if you want to claim you love Jesus, but actually dislike people who breathe. Prisoners? Immigrants? The sick? The poor? Widows? Orphans? All the groups that are specifically mentioned in the Bible? They all get thrown under the bus for the unborn. - David Barnhart, Methodist pastor
It certainly rings true for white American evangelicals, but it quickly occurred to me it applies pretty well to longtermists too. Centering the well-being of far-future simulated super-humans repulses me, but it seems very compelling to the majority of the EA cult.
Maybe I'm paranoid, but I can't help but feel that the recent spate of "omg, people are having too few children!" on HN is just another way to promote anti-abortion policies to the non-religious.
@saucerwizard Rationalism overlaps with TESCREAL and with stuff like Extropianism and Cosmism. Cosmism was invented by a straight-up Russian Orthodox theologian and philosopher, Nikolai Fyodorov, in the late 19th century, to provide a teleological imperative for space colonization. It borrowed its structural skeleton from Christianity.
The rapture *of* the Nerds, and I got the phrase from Ken MacLeod, who says he got it from someone else (but forgot who it was).
The design patterns of Christian fundamentalism show up strongly in singularitarianism (minus the God'n'Jeezus show). Not surprising given its ancestry lies in Russian Cosmism.
So advocating for "the uploaded" is like advocating for the souls of the elect in heaven, after a Rapture that ain't happened yet.
Perhaps present-day humans are more obviously aided by questioning literally any aspect of hyper-capital. Better to cast out to the far future and insist (without any real basis) that fellating billionaires is the best course.
Perhaps the beneficiaries of the most efficient public health interventions (the previous focus of the movement) are somehow more difficult for them to identify with...
The [un]simulated, with the extra icky purpose of presenting a veneer of ethics to back any and all arguments under the sun, to pour money into the latest fad that tickles a billionaire's fancy.
You can't quite (yet) do that with pro-life advocacy.
I spend a lot of time campaigning for animal rights. These criticisms also apply to it, but I don't consider them a strong argument there. EAs spend an estimated 1.8 million dollars per year (less than 1%, so nowhere near a majority) on "other longterm," which presumably includes simulated humans, but an estimated 55 million dollars per year (or 13%) on farmed animal welfare. (For those who are curious, the largest recipient is global health at 44%, but it's important to note that the more people are into EA, the less they seem to give to that compared to more longtermist causes.) Farmed animals "don't resent your condescension or complain that you are not politically correct, they don't need money, they don't bring cultural baggage..." yet that doesn't mean they aren't a worthy cause. This quote might serve as something members should keep in mind, but I don't think it works as an argument on its own.
A key difference is that animals exist here and now, and I think most humans would viscerally understand animal shouts of pain as requests for help/food/space etc.
The quote is less about the unborn, and more about the real and ignored needs of disenfranchised people.
Help your fellow humans first and foremost (which I would argue is well served by treating animals well, for sanitary and ecosystem reasons, or even for the selfish sake of our mental well-being, by not having our souls marred by brutality).
Actual beings with needs: humans, animals > the unborn >>>>>> unrealistic hypothetical humans.
This quote might serve as something members should keep in mind, but I don’t think it works as an argument on its own.
Putting aside the idea of it being an argument, I think you gotta have a bit more self-esteem in your cause, mate. I’m no animal rights activist, but even I can see that animals are living beings that can be harmed, unlike the unborn or simulated. It’s absolutely a worthy cause.
I'm all for dismantling the meat industry, but there is a lot of political confusion going around in animal welfare circles, and a lot of projecting of ideals onto animals; it's a very important thing to keep in mind. A lot of the movement is straight-up reactionary.
less than 1%...on other long-term...which presumably includes simulated humans.
Oh, it's way more than this. The linked stats are already way out of date, but even in 2019 you can see existential risk rapidly accelerating as a cause, and as you admit, much more so with the hardcore EA set.
As for what simulated humans have to do with existential risk, you have to look at their utility functions: they explicitly weigh the future pleasure of these now-hypothetical simulations as outweighing the suffering of any and all present or future flesh bags.
Do you have a source for this 'majority' claim? I tried searching for more up-to-date data, but this less comprehensive 2020 data is even more skewed towards global development (62%) and animal welfare (27.3%), with 18.2% for longterm and AI charities (which is not equivalent to simulated humans, because it also includes climate change, near-term AI problems, pandemics, etc.).
The utility of existential risk reduction is basically always based on population growth/future generations (i.e., humans), not simulations. 'Digital person' has only 25 posts on the EA forum (by comparison, global health and development has 2097 posts). It seems unlikely to me that this is a majority belief.
Without wishing to be rude, this seems like a comically false equivalence. On one obvious count: farmed animals bring a lot of baggage. Nobody wants to go to a slaughterhouse. The genuine equivalence here would be between dealing with a real, messy, argumentative human being versus just eating the beef with the picture of the friendly cow on the packaging, i.e. advocating for a cost-benefit calculus which favours people who don't exist yet.