What if Humanity forgot how to make CPUs?
Pretty good Solarpunk prompt with some medium-hard sci-fi thrown in.
So much interesting stuff on there, holy moly
@dillekant That's a dystopian scenario, not solarpunk…
One of the reasons a solarpunk way of thinking is so appealing to me is that it challenges me to think about what we could do to subvert a dystopian scenario and build something better. After all, climate change is going to cause tremendous upheaval even if the world collectively stopped making things worse. It's a more humble way of thinking about a problem, because it isn't built on the idea that we can be masters of the world; instead, we need to learn to understand ourselves as intra-acting within ecosystems.
Depends on whether you believe that using biodegradable plastic on your chip packet is dystopia because rustling it around sounds different.
At one point I had a long back-and-forth with my cousin, a post-apocalypse fan, about the credibility of various scenarios, shortages, and technological regressions. My conclusion: if humanity loses the ability and the knowledge to make CPUs, then CPUs are not the first thing you will miss.
It would mean that a generations-long obscurantist crusade had purposefully destroyed that knowledge.
I don't see any natural or human-made disaster that could durably erase all knowledge and industry on a global scale. You would need an intelligence geared specifically toward destroying knowledge.
What if we forgot capitalism and rewarding greedy self absorbed narcissists tho
That's why it's solarpunk.
I would just install Linux on crabs, play Doom.
If CPUs go, things could get dark...
My bad, that's if we lose STCs.
If humanity forgets how to make CPUs, some folks will still remember how to make stateful relays, magnetic-core memory, punch cards, and pneumatic logic valves.
It would take a bit of time to get from there back to a CPU.
I gotta admit my first thought while watching this was older CPUs; I've always had an affinity for the Motorola 68k family (see if you can guess my first computer).
But I also wonder how much we would be using those CPUs to provide control planes and interfaces for industrial gear running 8051 variants. I imagine a lot of those microcontrollers, which are already in use across a frightening amount of industrial equipment, will have enormously longer operational lifespans than the computers used to program them. Plus, a cursory search shows I can buy hundreds of thousands of them without resorting to scavenging and repurposing.
Plus, when suddenly everything in a DIP package is seen as desirable again, my soldering skills will only be seen as deficient, not a heinous abomination.
I can't imagine hobbyists forgetting how to do lithography. But it's a lovely video. The electromigration stuff scares me.
The "forgetting" isn't individuals forgetting, it's about institutional memory. Individually, there might be plenty of folks who can build chips, but they might live too far apart, or there's no money in it, or whatever other mechanism which causes things to be built and the technology to continue. There's a massive bootstrapping issue.
Dies the Fire, by S.M. Stirling.
As a geriatric millennial I'd be ok with this
The only CPU I can make is Cheese Pizza, YOU!
First, this is a great video and a thought-provoking topic. While she touched on it in the last 45 seconds, I think the timeline at +10Y and +15Y would have more vacuum-tube solutions for simple "computing" electronics. I also think the idea of social media would be too hard for modern society to abandon, and we'd revert to prior technologies for electronic social communication like "party line" telephones and ham radio, neither of which requires advanced semiconductors.
Analog broadcast television (also tube-driven) would make a comeback and be well in place by +15Y, after so much of the spectrum was freed up by the lack of digital devices consuming it. Greater use of shortwave radio would come back too.
Lastly, I'm embracing her premise as "the technology to create new semiconductors disappears." Perhaps this is because the world's single source in North Carolina of the quartz crucibles needed for high-end semiconductor manufacturing exhausts its supply, or some other reason. Even so, it doesn't mean progress forward would stop. She touched on part of this in the last 60 seconds of the video, talking about groups trying to rediscover or restart semiconductor manufacturing, but she didn't explore an alternative advanced computing medium: light. Optical computing exists today in laboratory experiments and is far inferior to general-purpose semiconductor-based computing, but if semiconductors are off the table, optical computing starts looking very attractive. Consider, however, that this technology is the basis for a number of quantum computing implementations today, so we wouldn't be starting from scratch technologically.
One note on automobiles too: I think many modern cars would be retrofitted with simpler electronics to keep them on the road. Gone would be the days of advanced ECUs yielding high-performance, fuel-efficient operation, but I'm betting much more simplified ECUs could be made with single-mode operation that would keep the car running for general (and inefficient) use. Mechanical timing would come back into play, fuel injection would be replaced with carburetion again, and coil packs with mechanical distributors.
I think I'd be okay with my Commodore 64 as my primary computer, with a bit more evolved connectivity, and something like FidoNet maintained for global email communication.
Still, minor notes aside, I really enjoyed her idea and I'm glad she did the video.
The way I see it, there would be a massive investment in what was previously e-waste. Even super-low-end hardware can be used as a server if given a lightweight Linux installation. Just yesterday I took an old budget Android phone out of the closet, factory reset it, debloated it, and installed a web server on it just for fun. Obviously I have much faster machines so I don't need to use it, but I could easily forward a port to it and host a website off of it if I needed to.
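For anyone tempted to try the same thing, here's a minimal sketch of the kind of server that runs happily on ancient hardware. I'm assuming a Python interpreter is available on the phone (e.g. via Termux); the port number and directory name are placeholders, not anything specific to my setup.

```python
# Minimal static-file web server using only the Python standard library.
# Runs fine on an old Android phone under Termux; port 8080 and the
# ./site directory are arbitrary choices for this sketch.
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler

handler = partial(SimpleHTTPRequestHandler, directory="site")
server = HTTPServer(("0.0.0.0", 8080), handler)  # listen on all interfaces
print("Serving ./site on port 8080 - forward a router port here to expose it")
server.serve_forever()
```

To reach it from outside your LAN you'd forward a port on your router to the phone's local IP, which is the port-forwarding step mentioned above.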
Forgot? Have you seen what's needed to make CPUs? Clean Room Manufacturing is a fragile thing.
Developing countries often need a lot of help just getting to ISO Class 7, which is what's needed to safely make cough syrup.
Injectable drugs are ISO Class 5. CPU manufacturing is ISO Class 1 and 2. In a post-apocalyptic scenario, depending on the specifics, it would be decades or generations of work to get semiconductor manufacturing back, even with an abandoned factory sitting right there. It would potentially take decades just to get back to making anything safely injectable. And then there are the supply chains for all the specific parts and inputs... shudder
On the other hand, we don't need nanoscale transistors to achieve most of the usefulness of CPUs. Most of that high-tech performance is wasted on things of questionable usefulness for society. The C64's CPU was made on an 8-micrometer process, which likely does not require ISO Class 1 or 2.
...you say as we both use machines that used a planet's worth of supply chains and resources to talk to each other.
No, it wouldn't. It'd take decades to get back to nanoscale modern CPUs, but not CPUs in general. The smaller the features, the more cleanliness matters. If you're measuring by atoms, then yeah, any stray atom could matter. If you're making something you can see with the naked eye, you could probably do it in open air and not have an issue.
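To put rough numbers on that (these are ballpark figures I'm assuming for illustration, not anything from the video): cleanroom classes count particles around 0.5 µm, which is tiny next to an 8 µm C64-era feature but huge next to a modern interconnect pitch of a few tens of nanometres.

```python
# Back-of-envelope comparison: how big is a typical 0.5 um cleanroom dust
# particle relative to various process feature sizes? All figures are rough
# assumptions, not measured values.
PARTICLE_UM = 0.5  # common reference particle size in cleanroom classification

features_um = {
    "C64-era 8 um process": 8.0,
    "1 um (1000 nm) hobby-fab target": 1.0,
    "modern ~0.03 um interconnect pitch": 0.03,
}

for name, feature in features_um.items():
    ratio = PARTICLE_UM / feature
    print(f"{name}: one dust particle spans {ratio:.2f}x a feature")
```

At 8 µm a single particle is a small blemish; at modern pitches the same particle buries dozens of wires, which is roughly why the required cleanroom class gets so much stricter as features shrink.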
It's not just the manufacturing of that one thing that's under consideration. There's an entire supply chain that gets you to the point where you finally have the inputs needed to enter the lab and make the product. There's likewise a whole supply chain needed to get an ISO Class 5 clean room, which is what's needed for general microprocessors, even if you're only talking about a clean work box on a bench top.
Who is mining the cobalt and aluminum and making the glass and plastic tools needed to stock the lab where you're making 1980s-style microprocessors? Who is making the pure silicon ingot you'll slice to get a wafer? What will you use to slice the ingot for the wafer? How will you polish the wafer to microscopic levels of flatness? Who is making the oscilloscopes that test the processors to see if they work? Who is making the glass for the lenses for the high-power microscopy you need to work? Where will you get the bulbs needed for the photolithography stage? Where will you get the tiny tiny tiny wires that connect the pins to the chip? How will you purify and process refined silicon dioxide? Sure, the stuff is everywhere, but think through how you go from a piece of quartz on the ground to a material you need to layer on a wafer (where you gonna get the wafer??) and what machines and processes are needed for that. And on and on and on. One of those things missing means you can't move forward.
And depending on the scenario, each of those things needs to be local to you as well.
This is Carl Sagan "If you want to make an apple pie from scratch, first you need to invent the universe"-level picking apart of the process. Everything is connected, and we don't always appreciate how inextricably things are tied to what we use on a daily basis.
My favorite example: This guy figured it out when thinking through a cheeseburger.
There's also a book from 2016 called "When the Trucks Stop Running" that is fearmongering oil-industry hype about how important oil is to fueling heavy machinery. (Spoiler: it's not as important as they make it out to be.) But the real lesson of the book is how many rarely seen or talked-about corners of the supply chain are fundamental to keeping huge numbers of industries running, and how fragile many advanced technologies are to supply chain interruption.
Actually, using Intel as an example: many CPUs not sold as an i9 are failed i9s. Plenty of i3, i5 and i7 parts are i9 dies with whole sections of the processor disabled because they came out defective. Making CPUs is difficult even in perfect conditions.
It may not be the most competitive fab on the market, but Hacker Fab is already within reach of a well-funded makerspace. I haven't read up on their cleanroom procedures, but I assume all the costs of that are included in their breakdown.
It doesn't necessarily need to be competitive with TSMC sub-3nm. CPUs at 1000nm can still be incredibly useful.
Yeah, but look at how much other highly manufactured stuff goes into that process. It's not just about the end product, it's about the entire supply chain needed to get the stuff to make that one product.