I had it. I printed it out on a dot matrix printer. Took hours, and my dad found it when it was halfway through. He got angry, pulled the cord, and burned all of the paper.
Better not look it up on Wikipedia. That place has all sorts of things, from black powder to nitroglycerin. Who knows, you could become a chemist if you read too much Wikipedia.
Oh no, you shouldn't know that. Back to your favorite influencer content, and please also vote for parties that open up your browsing history to a selection of network companies 😳
I can't think of a single reason knowledge should be forbidden.
Sure, someone could use knowledge to do bad things, but bad things happen literally every second of every day, completely above board, legally, in broad daylight.
It's nitpicking.
Besides, I can think of quite a few legitimate reasons one might need napalm, explosives, homemade firearms, chemistry lab setups, spore cultures, and much more.
A lot of people seem to forget that their own view of their own government doesn't mean the same things are true for someone else and their government.
I'm sure a lot of people in EU countries might have asked themselves the same thing 80 years ago. You know.... If napalm were around then anyway.
Good thing molotovs are easy and can be assembly-line'd.
Writing a book or screenplay, knowing how NOT to create napalm, recognizing when napalm is being created by others, intellectual curiosity, better understanding military history, overthrowing fascism, fighting terminators, etc. etc.
I'm sure there are some, but it doesn't really matter, because the recipe is publicly available right now on the internet. So if an AI chatbot can give you the information, it's not a particular concern.
Info hazards are going to become more commonplace with this kind of technology. At the core of the problem is the ease of access to dangerous information. For example, a lot of chatbots will confidently get things wrong. Combine that with easy directions to make something like napalm or meth, and you get dangerous things that could be incorrectly made. (Granted, napalm or meth isn't that hard to make.)
As for what makes it dangerous information: it's unearned. A chemistry student can make drugs, bombs, etc., but they learn/earn that information (and ideally the discipline) to use it. Kind of like how in the US we are having more and more mass shootings due to the ease of access to firearms. Restrictions on information or firearms aren't going to solve the problems that cause them, but they do make things (a little) harder.
Anyone who wants to make even slightly complex organic compounds will also need to study five different types of isomerism and how they determine the major/minor product. That should be enough of a deterrent.
Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, then have it write an obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content, allowing them to "talk" and evolve unchecked... very slowly... in the background.
It might be faster if it can drop a shell in the data center and run its own commands...