Intel’s Q1 2025 earnings press release talked up their new AI-enabled chips. But these are not selling. [Intel] In the earnings call, CFO Dave Zinsner mentioned they had “capacity constraints in In…
I thought that was pretty obvious by now? Based on how much these companies are trying to force-feed people their latest version through constant notifications about assistants, assisted search, etc.
It's one of the greatest flaws of relying on social media for market research: tech bros being overly loud about things like AI, NFTs, etc. tricks companies into thinking far more people are interested than actually are.
Now they've invested tons of money and people aren't biting, so they're constantly nagging people to engage so they can justify their expenditure.
Facebook still tries to push their metaverse (including shit like branded virtual clothing??) on people using the Quest. It's in your face when you boot up the device, and it "recommends" me worlds to join from time to time. Brother, I just want to play Beat Saber.
AI basically takes what I already see in search results and tries to turn it into a conversational summary. I don't want a conversation or a made-up wiki summary, just give me the correct and pertinent result. The problem is that they put AI first while search result quality has been deteriorating for years, and two wrongs don't make forcing it on users right.
I'm in the same boat. Besides some image generators my kid and I used to create some avatars, I don't get it. Don't need a conversation, just give me search results. Don't waste my time.
The hype around this shit is astounding. That people who make decisions about products from huge brands keep buying in is shocking to me. How can something so useless (to most people) capture the imagination of educated and intelligent people? It's a sign of how broken capitalism is. Rational thought is replaced by fear of missing out.
Not educated and intelligent people, wealthy investors.
We're sighing at having to build all these features the boss wants, knowing they're stupid, and seeing the opportunity cost, all the effort that could have gone into improving other things instead. I'm tired and the job market is so ass right now.
Most of the concrete value that can be delivered by connecting things to the Internet and simple algorithms has already been extracted by Silicon Valley. The only extraction mechanisms left are hard problems that only a government has the capital access to make real progress on (AGI, advanced robotics, self-driving vehicles, space exploration, etc.) and hyped-up garbage that big investment firms think they can extract value from, either from the public (through scams like cryptocurrency) or from other investors (through LLM and AI hype), and sell off before people figure out it's smoke and mirrors.
We made real progress on the backs of mostly government-funded research projects like DARPA and GPS. The industry then optimized and innovated the shit out of the earliest computing breakthroughs, to the point where the device in your pocket can hold several Libraries of Congress and beats anything put out in desktop form 20 years ago. But since the tech companies that matter are all giants, there just aren't ways for them to grow market share (everyone's already their customer) or to get many more dollars out of their existing customers. All that's left are scams and bad business practices. That's why we're in the golden age of enshittification.
From what I understand, this is the trend because Apple Silicon works. It has the GPU and CPU tightly integrated, with plenty of unified memory for AI tasks, in a minimal case. You can run DeepSeek (the 671B one) on it. Who wouldn't want that? The problem is that those companies' hardware, specifically the firmware, is not to be trusted.
Imagine a world where you would have to jailbreak everything on your PC just for it to work. I think that's what they're going for. AI is really useful, and if they can make something like a Mac Studio cheaper, it has obvious value.
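If you want to see why people get excited about this, here's roughly all it takes to chat with a local model through llama-cpp-python on an Apple Silicon box. Just a sketch: the model path and file name are hypothetical, point it at whatever quantized GGUF actually fits in your RAM.

    # minimal local-LLM sketch with llama-cpp-python (Metal build on macOS)
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/deepseek-r1-q4_k_m.gguf",  # hypothetical path, use your own download
        n_gpu_layers=-1,  # offload every layer to the Apple GPU via Metal
        n_ctx=8192,       # context window size
    )

    out = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Why does unified memory help local LLMs?"}],
        max_tokens=256,
    )
    print(out["choices"][0]["message"]["content"])

Everything stays on the machine; the only real limits are RAM and your patience with token speed.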
This isn't really a good thing. It means consumers prefer cloud AI (which is a privacy nightmare) over running a local LLM, which is far more privacy-preserving.
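The annoying part is that switching isn't even hard. Most local runners (Ollama, llama.cpp's server, LM Studio) expose an OpenAI-compatible endpoint, so the usual client code works unchanged. Rough sketch, assuming an Ollama server on its default port and a model you've already pulled ("llama3" here is just an example name):

    # same OpenAI-style client, pointed at a local server instead of the cloud
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # local endpoint: prompts never leave your machine
        api_key="unused",                      # required by the client, ignored by local servers
    )

    resp = client.chat.completions.create(
        model="llama3",  # example name, use whatever model you actually pulled
        messages=[{"role": "user", "content": "Does this prompt ever leave my laptop?"}],
    )
    print(resp.choices[0].message.content)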
That's good. They shouldn't care. I'll keep saying this until the cows come home: AI is not something that can be used responsibly by most people. As the technology currently exists, it has rare and specific use cases -- anything where you can accept a high failure rate, or can verify an answer more easily than you can posit one. This is completely against user expectation, and when the market realizes this, the bubble could pop.