Translation: Traitor of the fatherland.
Aceticon @lemmy.dbzer0.com · Posts 1 · Comments 2,310 · Joined 1 yr. ago
Shit peddler wants people to stop calling what he peddles "shit".
Oh yeah, but the fact that it's on an official NS information board in a train station - the kind of place where you usually just get serious announcements in somewhat formal language - and is actually informing travellers of real problems with the train they're waiting for, gives it quite a lot more punch than it merely being overboard baby talk.
Think about it this way: it's like a normally very serious customer support person suddenly decides to inform customers of a problem with the product and does so using the most ridiculous baby talk.
The language is funny and then the context elevates it to hilarious.
"Moving goalposts" is asking you to back your claims versus the point I was making in the posts you replied to (which include the whole "in production" part)?!
Sure, mate, salve your ego after having blindly dived into deeper waters than you expected.
Can you give me an example of a microcontroller that can run Linux in production (so, not just one that can be made to run Linux experimentally, but one which is actually used in applications where it's worth it to run Linux on it) AND which costs less than $2 in bulk?
Because the only ones I can think of, the lower end of the ESP32 range, can run Linux but it's just not worth it for the kind of applications they end up in.
This is pretty much an "all Tech companies have to jump on the AI hype train" pressure on publicly traded companies and those who need lots of investor money, with little if any customer pressure.
All investors want their money to be in the same place as those who invested in Google before it made it big, and the AI hype promises exactly that to the "winners" of the AI race.
Customer needs and demands are well below secondary to investor pressure, especially for companies which have dominant market positions (so general customers have no decent alternatives) and startups whose entire business model is AI.
God forbid people want the compute they are paying for to actually do what they want, and not work at cross purposes for the company and its various data sales clients.
I think that way of thinking is still pretty niche.
Hope it's becoming more widespread, but in my experience most people don't actually concern themselves with "my device does some stuff in the background that goes beyond what I want it for" - in their ignorance of Technology, they just assume it's something that's necessary.
I think where people have problems is mainly at the level of "this device is slower at doing what I want it to do than the older one" (for example, because AI makes it slower), "this device costs more than the other one without doing what I want it to do any better" (for example, they're unwilling to pay more for the AI functionality) or "this device does what I want it to do worse than before/that-one" (for example, AI is forced on users, actually making the experience of using that device worse, such as with Windows 11).
In my experience, pretty much everything does indeed have a microcontroller nowadays, because software is so much easier to make, test and change than electronics, plus a lot of what took several digital components in the old days is just a few lines of code in a microcontroller, so putting a microcontroller there is cheaper in hardware terms too. However, most things use tiny ones like the one whose info I linked in my previous post (if you get your hands on a broken mouse, open it up and google the part codes of the integrated circuits you find inside), maybe with a bit more memory, since that specific ATTiny is very much on the lower end.
The reason why it's something like that rather than something bigger is that something like that costs $0.38 each in bulk, whilst something big enough to run Linux costs around $10+. Plus something like that has a smaller physical footprint, needs all of 1 capacitor and 1 resistor externally to work, and can run directly from some pretty shit power sources, further reducing overall costs, whilst microprocessors demand a lot more supporting electronics and proper power regulation to work.
When the cost of putting a big one there is nothing next to the overall device cost, manufacturers can easily just upgrade it to "smart" enough to justify Linux (which goes beyond merely digital controllers and displays or even WiFi connectivity and into Android TV levels of complexity) with little risk, but when we are talking about much cheaper stuff, they'll only make a "smart" version if there's explicit demand for such "smarts" that's willing to pay a lot more. Notice how, when it comes to dildos, if you look at the Lovense, which is probably the most complex one around, as somebody else pointed out it still uses a microcontroller that's not running Linux and is way too small for Linux to be worth it even if possible. In fact, per its datasheet it doesn't even support WiFi, only Bluetooth LE, which means it doesn't even have an IP stack: Bluetooth is basically Serial-over-radio, so it's actually the software on the other side of that Bluetooth link that does the heavy lifting.
I think your idea of "everything" in Electronics is pretty much just big ticket Consumer Electronics, yet most Electronics out there by number of units is actually the small stuff that costs a few dollars in parts to make, and almost all dildos fall into the second category, not the first.
In all fairness that message is seriously taking the piss and is hilarious.
Code made up of several parts with inconsistent styles of coding and design is going to FUCK YOU UP in the medium and long term, unless you never again have to touch that code.
It's only faster if you're doing projects small enough that an LLM can generate the whole thing in one go (so, almost certainly, not working as a professional at a level beyond junior) and it's something you will never have to maintain (i.e. prototyping).
Using an LLM is like giving the work to a large group of junior developers where, each time you hand out a task, it's a random one of them that picks it up, and you can't actually teach them. Even when it works, what you get is riddled with bad practices and design errors that are not even consistently the same between tasks, so when you piece the software together it is, from the very start, the kind of spaghetti mess you see in a project with many years in production that has been maintained by lots of different people who didn't even try to follow each other's coding style. Plus, since you can't teach them things like coding standards or designing for extensibility, it will always be just as fucked up as on day one.
It's doubly funny because it's pretty much the opposite of what was meant.
Generally the more money that depends on their systems being functional without errors or interruptions, the more an industry is willing to pay for devs.
However, in addition to that there is also the supply-demand effect: in-demand specialists for which there are few available experts get paid more than people doing the kind of work for which there are a lot more experienced professionals around.
3D graphics programmers would benefit from the second effect but generally not from the first.
As a comparison, for example Quants (who program complex mathematical models used in asset valuation software for complex assets such as derivatives) in Investment Banking in London - who thus gain from both effects - about a decade ago had salaries of around £300k per year, as they're both working on critical software elements in systems used for managing billions of dollars of assets and have a very rare expertise (they're usually people with Mathematics or Physics Masters or Doctorates who are also developers and who also have quite a lot of specific knowledge of the business of investment banking, which all adds up to a very rare combination of skillsets).
It's not by chance that, for example, the Investment Banking industry pays a lot more money to developers than the wider IT industry - a system breaking down for an hour or two there can cost millions because, for example, traders can't actually trade certain assets.
If your income comes mainly from your work, you're Working Class (even if you own your own business); if your income comes mainly from the money made by the money you have (in assets or even "investments"), you're Owner Class.
Certainly, modern politics only ever divides people into those two classes, with mainstream parties generally only working for the good of the Owner Class, which is how you end up with falling salaries in real terms and growing asset valuations in the form of bubbles on all kinds of assets, most notably stocks and real estate (notice how most mainstream politicians see the rise of both stock markets and house prices - though of late they don't say it about the latter quite as openly - as being good things).
The single greatest scam of modern Neoliberal Capitalism was making people who own their means of production - sometimes only partially, or not really, because they're in debt for it - but still have to work for a living think they're not Working Class, and hence that Neoliberal Capitalism is actually working for their benefit.
If there is one thing that around a decade working in the Finance Industry has taught me, it is that almost all government policies are directed at helping those who make money from having money make even more money, which is why, for example, plenty of countries have lower taxes on income from "investments" than on income from "work", when the fair thing would be the other way around, since the former is parasitical, so lower taxes on it just induce more economic actors to engage in non-productive, extractive economic activities.
That shit is now fully at the Gestapo level.
The only experience I have with those (I'm an EE but don't practice it professionally, so I've only done embedded circuits for fun) is with things like the ESP8266 and ESP32, and those come with an RTOS, though at least the latter should be powerful enough to fit Linux. You can actually program those in C on top of Arduino or on top of the manufacturer libraries, which are a bit more high level than for simpler microcontrollers without WiFi support, as they come with an IP stack rather than only exposing low-level hardware functionality.
(It is actually a lot of fun making a proper web application on one of those to allow remote control of some hardware from an Android app - if you make it a REST interface - or even from a browser - if you make it serve web pages. I believe most "controllable from your mobile phone" lights out there use one of these microcontrollers or similar on the lamp side).
The Nordic nRF51822 somebody else mentioned doesn't even have WiFi support, only Bluetooth LE, and has less memory.
Yeah, Linux is still amazingly modular and it's a fun advanced hobbyist project to just get it to run on the smallest thing possible ("Linux in a potato").
It's not however logical to increase hardware requirements (and hence costs) in the design of a production device just to have Linux there when all you're running is a single task in a single thread to control a motor, something you can otherwise probably fit in something like this.
You can force it to happen (because the Linux kernel is quite modular and you can make it way smaller by switching off a lot of things) but as I said, what's the point when all you need is to literally run a single application that only toggles I/O ports or does a bit of comms with an external integrated circuit via I2C, Serial or SPI?
In a production design for something as simple as a dildo - so basically a motor controller - there really is no point in paying for a more powerful and more expensive microcontroller, adding cost to the final product, just to stuff a Linux kernel there to run a single application that doesn't need things like filesystem or networking support.
(For example, here is an example of about the most low end microcontroller you can use. Notice the 128 bytes of RAM and 2KB Flash storage. You can stuff enough code in there to activate a motor in one of several specific simple cycles depending on the position of a switch but not much more. Of course that kind of stuff is programmed in C either directly on top of the hardware - by literally changing microcontroller registers - or most likely on top of a manufacturer specific low level library)
Further if you do need functionalities of an OS such as multiple task/application support, there are alternatives that tend to be better for those kinds of use, such as various RTOSes.
Now, as many pointed out, there's plenty of reason to do it for the challenge or the fun of it - the ethos of any good hacker in the traditional sense of the word - just not for a production design of a device for which tens/hundreds of thousands of units will be made where adding Linux raises the needed capabilities of the microcontroller and hence the hardware cost while not actually helping to deliver the needed functionality.
Not even the more expensive microcontrollers run Linux.
The kind of thing you would see in such a simple device as a dildo is the cheaper, smaller ones, with RAM and Flash memory sizes in the KB range and costs of less than $1 in bulk.
Mind you, you can squeeze a Linux kernel into a really small amount of memory, but why for a production device pay more for a larger than needed microcontroller and then use most of its storage for a Linux kernel leaving little space for the actual functional code when you don't need support for things like filesystems or networking?
Microcontrollers are a whole different world from microprocessors.
I don't just mean that person, I mean anybody who would believe that "I'm a God fearing person" is actually the right thing to say.
I mean, if one thinks about it, it's curious that anybody would openly state that their relation to their God is "Fear" rather than, say, "Love", even if they are being entirely truthful about it.
"God loving" sounds a lot more respectful of a Deity, IMHO.
Further, even for that specific person my point still fits in a twisted way: that person chose that specific expression because deep down they believe people only restrain their bad actions because of fearing retribution.
It's a variant of Psychological Projection and thus still betrays that person's moral makeup: even as non-believers they use the "Fear of God" mantra because they themselves think that claiming one's behavior is under constant oversight from a deity which will punish bad behavior is the best way to convince others that one behaves in a good way; and they think so because they themselves restrain their negative behavior towards others through fear of external reprisal rather than through internal mechanisms such as guilt or personal morals.
Even when people lie, you can still tell a lot about somebody by which lies they think will work best at convincing others.
Look, mate, it's simple: if you break it you won't be able to properly emulate the spaghetti kiss scene from The Lady And The Tramp because it will be too short.
So keep it in mind if you're a dog and you want to romance a bitch.