This is a thing people miss. "Oh it can generate repetitive code."
OK, now who's going to maintain those thousands of lines of repetitive unit tests, let alone check them for correctness? Certainly not the developer who was too lazy to write their own tests and to think about how to refactor or abstract things to avoid the repetition.
If someone's response to a repetitive task is copy-pasting poorly-written code over and over we call them a bad engineer. If they use an AI to do the copy-paste for them that's supposed to be better somehow?
We often used to get video games for Christmas, but wouldn't be allowed to play them straight away because we were spending quality time with family etc. Then we'd get up early on the 26th and pack the car to go on a 2 week camping trip, still not having played our precious new game.
We would take the instruction manual with us on the trip and spend those 2 weeks intensely studying the controls, the lore, everything. By the time we returned home we were fucking ready.
Yes that's true. There may also be issues just with getting money out of the country to make the required payments to the storage providers. Either due to local restrictions or international sanctions.
If this is a real problem you have, and not just a thought experiment, I think rather than burying the data on some unreliable medium, your best bet is to just pay someone to store it for you offshore, away from the dictatorship you mentioned.
There are plenty of consumer-grade cloud storage services. I'm sure there are more niche ones specifically for long-term archival as well, which would usually be cheaper per bit per year if you don't need to access the data regularly.
Gigs. Either just buy tickets to random local venues, or go see your favourite artists live, but make sure you get there early enough to see the openers.
I've discovered so many amazing bands because they opened for bands I already knew I liked.
If you can't physically get to gigs, you can even just look up who your favourite artists are touring with; that will give you a pretty good sense of which artists are similar.
They're all groups of people with some kind of shared purpose or values. Cults are harmful and power-based. Communities are helpful and consent-based. Religions can fall either way, or somewhere in the middle.
Well, the problem is that "bug" is not a scientific term, right? Or even if it is, colloquially I think it could easily refer to either insects specifically or arthropods more generally.
Certainly a lot of people refer to spiders as bugs despite them not being insects.
There is probably little practical help you can give, but don't underestimate the importance and impact of social and emotional support.
When someone is in such a shitty situation, just knowing that there are people who care makes a huge difference. So just be a good friend. Listen and empathise when they need to talk about shit. Give them a laugh when they need cheering up or distraction from the bullshit.
There's more to it than that. Firstly, at a theoretical level you're dealing with the concepts of entropy and information density. A given file contains a certain amount of information. Compressing it is sort of like distilling the file down to its purest form. Once you've reached that point, there's nothing left to "boil away" without losing information.
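To make the entropy idea slightly more concrete, here's a rough Python sketch (just an illustration; it only looks at byte frequencies, which is a crude stand-in for true information content, and the function name is mine):

    import math
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        # Average bits of information per byte, estimated from byte frequencies alone.
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    print(entropy_bits_per_byte(b"aaaaaaaabbbb"))    # ~0.9 bits/byte: lots of redundancy to squeeze out
    print(entropy_bits_per_byte(bytes(range(256))))  # 8.0 bits/byte: nothing left to squeeze

Well-compressed (or encrypted) data scores near the maximum, which is another way of saying there's no redundancy left to remove.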
Secondly, from a more practical point of view, compression algorithms are designed to work nicely with "normal" real world data. For example as a programmer you might notice that your data often contains repeated digits. So say you have this data: "11188885555555". That's easy to compress by describing the runs. There are three 1s, four 8s, and seven 5s. So we can compress it to this: "314875". This is called "Run Length Encoding" and it just compressed our data by more than half!
But look what happens if we try to apply the same compression to our already compressed data. There are no repeated digits, there's just one 3, then one 1, and so on: "131114181715". It doubled the size of our data, almost back to the original size.
This is a contrived example but it illustrates the point. If you apply an algorithm to data that it wasn't designed for, it will perform badly.
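If you want to see that play out in code, here's a rough sketch of the run-length idea (illustrative only; it assumes every run is shorter than ten, and the names are mine, not from any real library):

    def rle_encode(data: str) -> str:
        # Encode each run as "<length><digit>", e.g. "111" -> "31".
        # Only handles runs shorter than 10, which is fine for this toy example.
        out = []
        i = 0
        while i < len(data):
            j = i
            while j < len(data) and data[j] == data[i]:
                j += 1
            out.append(str(j - i) + data[i])
            i = j
        return "".join(out)

    original = "11188885555555"
    once = rle_encode(original)   # "314875"        (14 chars -> 6 chars)
    twice = rle_encode(once)      # "131114181715"  (6 chars -> 12 chars)
    print(original, once, twice)

Running it on data full of runs shrinks it; running it again on the run-free output blows it back up.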
What about the public service? I don't know about where you live, but in my country the public service doesn't care what degree you have, just that you have one. Look into the graduate programs of your local/state/federal governments.
The engineering department at my uni had a tensile strength testing machine which says "Made in the GDR" on it, a country that hasn't existed for 30+ years.