Also worth pointing out:
There are other issues too. All of Law and Blair’s tests were done with one kind of 3D printer—a Prusa MK4S. There are hundreds of different devices on the market that all behave differently. Law also pointed out that brass nozzles themselves warp over time and may produce different results after hundreds of prints, and that nozzles made from different materials may behave very differently. Law would also want an error rate study—a formal scientific inquiry into false positives and examiner bias.
Yeah, I don't like this either. There are so many chances for a mistake: being in the wrong dir, a misspelled file, something not cloned correctly, or anything else not set up as you think it might be, and suddenly the package manager does something you don't expect (like trying to install globally rather than in a project, or vice versa).
For a lot of things I would rather have something web based than app based. I hate having to download some random app from some random company just to interact with something one time. Why do all restaurants, car parking places, etc. require apps rather than just having a simple site? Not everything should be native first IMO.
I wouldn't say Debian stable is any better than any other mainstream distro. It is worse in some regards, as it tends to have really old versions of things. If that sounds nice to you then go for it. But it is not the default choice or recommendation for most people.
There is also little reason not to try out different ones to compare if you want to. Nice to see what they are like for yourself if you have the time to.
I am not afraid of some tech journey, but even though Arch seems the coolest, with Wayland, KDE, and Hyprland customization, I am not confident enough to use it for work.
The only way you will gain confidence in it is to try it out. But also, most distros use Wayland these days, and it comes down to the desktop environment you use rather than the distro you use. Hyprland is a Wayland compositor and is in the repos of most if not all major distros, so you should be able to install it on anything really. You can replace the desktop environment, or install multiple ones side by side, on just about any distro. The biggest difference between them is which ones they come with by default.

But really, if you are looking for a highly customized experience then Arch tends to be the way to go, as you have less extra fluff to remove or work around when getting the system exactly as you want it. The hardest part of Arch is installing it the first time. After that it is not any harder to use or maintain. IMO it is easier to maintain, as you have a much better understanding of how your system is set up when you are the one that set it up to start with.
I heard it can completely crash your system if you're a noob.
You can break any distro if you mess with things. The only big difference is Arch encourages/requires more messing around at the start than other distros. And IMO it is easier to fix if you do mess things up: you can always just boot a live USB and reinstall broken packages or reconfigure things without needing a full reinstall. You can basically follow the install guide again for the bits that are broken to fix just about anything, and that is only if you break something critical to booting. In my early days I broke (requiring a full reinstall) way more Ubuntu installs than I have ever broken my Arch ones later on. It is really about how much you want to tinker with things, and how much you know about what you are tinkering with, as to whether they will break, rather than what base distro you use.
And you can always try the install process and play around with different distros in a VM to get a feel for them and learn what they are like. So don't be afraid to try out various different ones and find the one you like the most. Your choice is never set in stone either. Just ensure you have good backups of everything you care about and the worst that will happen is you need to reinstall and restore your backups every once in a while.
but my main needs are not really discussed
...
So in essence i need something stable that is relatively easy to use and has great ue5 and gaming perf.
That is probably the most common set of requirements people ask for. In reality, with a few exceptions, there is really not that much difference between distros given those requirements. UE5 is newer, so the biggest factor there is that distros shipping newer versions of stuff might run it slightly better than distros shipping older software. In practice I think it has been out for long enough that you won't see much difference unless you want to play something new on the day of release (but these days those are all buggy messes anyway... not sure your choice of distro will make as big a difference as waiting a few weeks/months for the initial patches to roll out).
Remember, all distros are essentially based off the same software; the biggest differences are which desktop environment they ship with, what versions of software they ship, and how long they stay on those versions. By far the biggest difference you will see is the desktop environment, and all distros essentially package the same set of desktop environments. Each might come with different ones by default, but they typically carry all the popular ones in their repos.
i need something stable... great gaming perf
In particular these two points. Do you know what you are asking for here? These are the most bland and wishy-washy requirements. Everyone wants something stable and fast; I have never seen anyone ask for something that crashes all the time and is slow. But worse, these tend to be on opposite ends of the spectrum: if you optimize for one you tend to trade off the other.
Even stability has multiple meanings. In terms of crash stability you will find all distros to be about the same. No distro wants to ship buggy, crashy software. But at times they do, and it is really just the luck of the draw as to when this might happen to you, based on what software you use, how you configure your system, and what hardware you have. Some combinations just don't work for some weird reason, and you won't know until you hit it. This is why you hear some people claim one distro is a buggy mess while some other one is rock solid, while someone else argues the exact opposite. All mainstream distros are just as good as any other in this regard, and you are just unlucky if you ever do run into that type of issue. The biggest problems here tend to come when a new major version of something is released, but like with gaming it can be beneficial to wait a few months for any issues to be patched before jumping to the latest big distro version.
The other side of stability is API stability, or the lack of things changing over time as new versions of software get released. There are two main types of distros in this regard. Point release distros freeze major versions of packages between their major releases, so you won't get any new features during that version's release cycle; then you have to deal with all the breaking changes from newer versions of software every so often when a new distro version comes out. Rolling release distros upgrade major versions constantly and so generally track much closer to the latest versions of things. Really, the big trade-off here is not whether you encounter breaking changes.
Any distro will need to deal with them at some point; the choice is how often you deal with them. You can wait years on the same version of a point release distro and only deal with all the breaking changes once every few years, or once every 6 months. Or you can deal with things as they come out on a rolling release distro. While it might sound nice to only deal with it every few years, it also means you need to deal with all the changes at once, which can be much more disruptive when you do decide to. Quite often I find the slower upgrading distros are better off with a full reinstall of the latest version than upgrading from one release to the next. Personally I prefer dealing with small things frequently, as they tend to be easier to fix and less disruptive over longer periods of time. When I was running Kubuntu I used to end up reinstalling it every 6 months as the upgrades never worked for me (though this was a long time ago), but my oldest Arch install lasted probably 5-10 years or so.
At the same time, how frequently you get the latest versions of things determines how soon you get performance optimizations and support for newer hardware or newer games, but also any bugs or regressions. It is a double-edged sword. Which is why stability and performance tend to be a lever you can tune between, rather than two separate things you can achieve independently. Just like overclocking: the more performance you squeeze out of a system, the less stable the system tends to become overall. Everyone wants the most stable and fastest system, but in reality everyone has a different limit on how much, or what types of, stability they are willing to give up to achieve different levels of performance.
But out of the box, you will find most distros to be within a couple of percent of each other, and which is fastest will vary depending on which games you want to play and what hardware you have. They all tend to have quite a bit of headroom to optimize for specific use cases, as they are all optimizing for the general use case, and tuning for one thing typically just trades off performance in another. But again, we are talking about tiny differences overall.
If the package is popular then it is very likely already packaged by your distro. You should always go there first if you care that much. If the package is not popular enough to be packaged by a distro, then how does another centralized approach help? Either it is fully curated like a distro package list and likely also won't contain some random small project, or it is open for anyone to upload scripts to and so becomes vulnerable to malicious scripts. Worse yet, people would be able to upload scripts for projects they don't control, as the developers of said projects likely won't.
Basically, it is not really any safer than separate dev-owned websites if it is open, nor does it offer better package support than distro repos if it is curated.
Maybe the server was hacked and the script was changed?
The same thing can happen to any system though. What happens if the servers for this service are hacked? Being a central point makes you a bigger target, and with more people able to change things (assuming you are not going to be the only one to curate packages) you have a bigger attack surface. And once hacked, it can compromise far more downloads than a single package could.
Your solution does not improve security, it just shuffles it around a bit. It sounds nice on paper, but when you look at it in more detail there are a lot more things you need to consider to create a system that is actually more secure than what we currently have.
Then how would you trust these scripts in a central repo? It seems to add no real value or safety over dev-managed scripts if you are not willing to go down the path of becoming yet another distro packaging system.
There is also no way to verify that the software that is being installed is not going to do anything bad. If you trust the software then why not trust the installation scripts by the same authors? What would a third party location bring to improve security?
And generally what you are describing is a software repo, you know, the one that comes with your distro.
Cannot remember if the study was stupid or if people's interpretations of it were. But when covered up elsewhere you will lose a lot of heat through your head. More so than if just an arm or just a leg was exposed, as with your arms and legs your body will slow down blood flow through them to try to conserve your core temperature. It cannot do that with your head.
once a developer enacts an end of life plan, their legal culpability is removed
What legal culpability? If you are not hosting anything then you won't be liable for anything. It is not like you become liable if you create a painting and someone later defaces it... That would be insane.
Random programming certificates are generally worthless. The course to get one might teach you a lot and be worthwhile, but the certificate at the end is worthless. If it is free then it does not matter too much either way; it might be a good way to test yourself. But I would not rely on it to get you a job at all. For that you need other ways to prove you can do the job, typically the ability to talk about the subject and having written some real-world-like application. Which a course might help you do too.
The only things not linked to cancer are the things that have not yet been studied. Seems like everything at some point has been linked to cancer.
The data showed that people who ate as little as one hot dog a day when it comes to processed meats had an 11% greater risk of type 2 diabetes and a 7% increased risk of colorectal cancer than those who didn’t eat any. And drinking the equivalent of about a 12-ounce soda per day was associated with an 8% increase in type 2 diabetes risk and a 2% increased risk of ischemic heart disease.
Sounds like a correlation... someone who eats one hot dog and drinks one soda per day is probably doing a lot of unhealthy things.
It’s also important to note that the studies included in the analysis were observational, meaning that the data can only show an association between eating habits and disease –– not prove that what people ate caused the disease.
Yup, that is what it is. A correlation. So overall not really worth the effort involved IMO. Not eating any processed meats at all is not likely to make a big difference by itself; what matters is your overall diet and amount of exercise/lifestyle. I would highly suspect that even if you did eat one hot dog per day, but had an otherwise perfect diet for the rest of the day, did plenty of exercise, got good sleep, and all the other things we know are good for you, then these negative effects would likely become negligible. But who the hell is going to do that? That's the problem with these observational studies: you cannot really tease the effect of one thing out of a whole bad lifestyle.
I hate headlines like this, as it makes it sound like you can just do this one simple thing and get massive beneficial effects. You cannot. You need to change a whole bunch of things to see the types of risk reduction they always talk about. Instead they always make it sound like if you have even one hot dog YOU ARE GOING TO DIE.
The indicator being stuck is a recently fixed issue:
Fixed a case where the battery level indicator could become stuck
https://store.steampowered.com/news/app/1675200?emclan=103582791470414830&emgid=529850584204838038
YAML is not a good format for this. But any line-based or streamable format would be good enough for log data like this. It is really easy to parse with any language, or even directly with shell scripts. No need to even know SQL; any text processing would work fine.
CSV would be fine. The big problem with the data as presented is that it is a YAML list, so the whole file needs to be read into memory and decoded before you can get any values out of it. Any line-based encoding would be vastly better and would allow line-based processing. CSV, JSON objects encoded one per line, some other streaming binary format: it does not make much difference overall, as long as it is line based or at least streamable.
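To make that concrete, here is a minimal Python sketch of the kind of one-off conversion I mean, from a YAML list export to JSON lines (the file names and use of PyYAML are just assumptions for illustration):

```python
import json
import yaml  # PyYAML, only needed for the one-off conversion

# The YAML list forces one full load into memory...
with open("export.yaml") as f:
    records = yaml.safe_load(f)

# ...but once re-encoded one JSON object per line, every record
# stands alone and the file can be processed line by line from then on.
with open("export.jsonl", "w") as out:
    for record in records:
        out.write(json.dumps(record) + "\n")
```

From there grep, awk, or a few lines of code in any language can work on the file without ever decoding the whole thing.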
I never said it had to be a text file. There are many binary serialization formats that could be used. But in a lot of situations the overhead you save is not worth the debugging effort of working with binary data. For something like this, which is likely not going to be more than a GB or so (probably much less), it really does not matter that much whether you use a binary or a text format. This is an export format that will likely just have one batch processing layer on top. This type of thing is generally easiest for most people to work with in a plain text format. If you really need efficient querying of the data, then it is trivial and quick to load it into a DB of your choice rather than being stuck with SQLite.
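As a rough sketch of what I mean by trivial (assuming the hypothetical export.jsonl from above; the table and field names are made up):

```python
import json
import sqlite3

conn = sqlite3.connect("tracking.db")
conn.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, data TEXT)")

# Stream the line-based export straight into the table, one row per line,
# keeping the raw record around as JSON for anything not worth a column.
with open("export.jsonl") as f:
    conn.executemany(
        "INSERT INTO events VALUES (?, ?)",
        ((rec.get("ts"), json.dumps(rec)) for rec in map(json.loads, f)),
    )
conn.commit()
conn.close()
```

Swap sqlite3 for whatever DB driver you prefer; the point is the loading step is a handful of lines, not a reason to pick the storage format around it.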
export tracking data to analyze later on
That is essentially log data, or equivalent to it. Log data does not have to be human readable; it is just a series of events that happen over time. Most log data, even what you would think of as traditional messages from a program, is not parsed by humans manually but analyzed by code later on. It is really not that hard or slow to process log data line by line. I have done this with TBs of data before, which does require a lot more effort. A simple file like this would take seconds to process at most, even if you were not very efficient about it. I also never said it needed to be stored as text, just that a simple file is enough; no need for a full database. That file could be binary if you really need it to be, but text serialization would also be good enough. Most of the web world is processed via text serialization.
The biggest problem with YAML like in the OP is the need to decode the whole file at once, since it is a single list. Line-by-line processing would be a lot easier to work with. But even then, if it is only a few hundred MBs, loading it all into memory once and analyzing it there would not take long at all; it just does not scale very well.
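For example, a minimal Python sketch of line-by-line processing in constant memory (the field name is illustrative, not from the actual export):

```python
import json

count = 0
total = 0.0
with open("export.jsonl") as f:
    for line in f:  # one record at a time, memory use stays flat
        event = json.loads(line)
        count += 1
        total += event.get("value", 0.0)

if count:
    print(f"{count} events, mean value {total / count:.3f}")
```

The same loop works whether the file is 10 MB or 100 GB, which is exactly what a single YAML list cannot give you.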