The Wii was extremely popular. For years, it outsold every other console on the market by a wide margin.
Netbooks absolutely were overhyped, and the market for them died really quickly. They were barely usable, and by 2010 when tablets really started hitting the market, there wasn't a space for them anymore.
HDTVs weren't overhyped, they were just expensive, and in 2008 there wasn't that much content to take advantage of them. I had a 32" 720p TV that I paid nearly $700 for in 2007. Now, you can get a 40-something inch 4K TV for a little over $200, and there's plenty of content to make it worthwhile (though the real-world benefit of 4K on such a small set is debatable).
The first iPhone was so incredibly polarizing at the time. The hype machine leading up to that announcement was unlike any other product launch I can recall, so it was never going to live up to that kind of hype. And while it was limited in features for its time, it was clear more was on the horizon. And given how it not only revolutionized the phone market, but also the web as a whole, we know how it all ended up.
The Wii was overhyped though. Most players never bought any game other than Wii Sports. I had an unlocked Wii and played all the good titles, and there are not more than ~10 of them. Most Wii games (looking at you, NFS) felt like half-baked mobile ports.
And the Wii U sales showed that. Yeah, the Wii sold to tons of casuals, but hardly any of them upgraded, even though the Wii U was a much more capable system.
The most frequent question I hear to this day when talking with former Wii owners is "What's the benefit of the Wii U and why would I need to upgrade?" That's a question I have never heard in relation to any other game console. Or have you ever heard the sentence "What's so special about a PS3 if I already have a PS2? Why would I need to upgrade?"
And this set up the Wii U to be such a huge commercial flop that Nintendo effectively cancelled its dedicated home console line.
I would say it was seriously overhyped, similarly to the netbooks. It was a fad, it was cool, boatloads of non-techy people bought one, and hardly any of them bought the successor, so it all died quickly.
The Wii had a ton of great games outside of the Nintendo-specific ones. The Conduit 1 and 2, GoldenEye 007, tons of fighting games, and it gave us No More Heroes. The Force Unleashed arguably had its best version on the Wii (this is mostly subjective, but there's a strong consensus that the Wii's version held up). Its main appeal compared to other consoles, I think, was how diverse the games could try to be: silly games like Boom Blox and de Blob, and niche ones like Endless Ocean for all the marine biologist kids.
Granted, I grew up with some of these games and I'm not trying to say that the Wii's extensive library is all stellar. But there are many gems among it. The Wii's popularity drew a lot of attention to games that would just be scrolled past as shovelware on other online stores (Xbox Live mostly). Few of those made it beyond Xbox Live Arcade, or whatever it was called, but on the Wii they would get digital and sometimes even physical editions. Also, because of how wide its demographic was, it had a few surprisingly decent Barbie-esque and horse care games. I mean, it had so many games made for it that it only stopped getting new releases in 2020.
The Wii U was an attempt to bridge the gap between the success of their portable line, the DS, and the Wii. Growing up, all any kid ever wanted was to get their consoles connected. But then when the Wii U finally came out and was marketed, its main selling point was that you could play your game on the tablet while someone else was using the family TV. I mean really, it was exactly what every 10-14 year old into Nintendo was talking about right up until Nintendo actually made it.
Part of it was marketing: I remember a lot of people being surprised that the GamePad wasn't the product being sold on its own, but came with a whole new console.
It's crazy that it failed, honestly, but at the same time it's totally understandable. You can't try to be both a home console and a "portable" one when the portable part has to stay tethered to the console. It was the genetic blueprint for everything the Switch became.
In 2008 there was still new stuff being shot in a 4:3 aspect ratio in some places, never mind HD. I remember when Top Gear did the polar challenge the year before; it was a real showcase for HDTV.
I think the netbook concept lives on in Chromebooks: cheap, low-power laptops that make sense in scenarios where higher-cost laptops don't fit. Schools, kids, etc.
Some fraction of it was probably eaten by Raspberry Pis as well. A 12V barrel plug was like the USB-C of 2008. For pennies, you got an integrate-anywhere Linux machine that could augment a lot of hackery.
Just to clarify their point, they weren't pointing out the Mastodon instance, they were pointing out the .lgbt TLD (top-level domain). It's interesting, and not widely known, that anyone can register whateverwebsite.lgbt now. You could own the africangrey.lgbt domain for $11.99/yr if you wanted. Nothing to do with Mastodon.
To be fair, a lot of these are accurate, or at least were at the time.
Multi-GPU just never caught on. There's a reason you don't see even the most hardcore gaming machines running SLI today.
The Wii's novelty wore off fairly quickly (about the time Kinect happened), and it didn't have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.
Spore is largely forgotten, despite the enormous hype it had before release. It's kind of the Avatar of video games.
It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn't notice or care: basically no apps average people cared about were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program.
Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.
Downloading movies over the internet ultimately fell between the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-Ray disc rather than download from iTunes (can you even still do that with modern shows?)
I definitely know people who didn't get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they're still outselling Blu-Rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.
The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.
The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier... the iPhone family did make a huge impact in the long run, but it wasn't until the 3GS that it was a true competitor to something like a Symbian device.
The only entry on this list that's really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck's guts and has proudly never had a Facebook account.
Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple's "UltraFusion" and AMD's "Infinity Fabric" to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.
As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback... at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
It's kinda sad how SLI and Crossfire faded away. I used to think multiple GPUs were amazing and wanted my own SLI setup! But technology went a different way. It's probably better for my wallet though!
Since I love playing devil's advocate, here's a couple of points in their defense:
Multi-GPU videocards: Pretty much dead, it's just not efficient.
64-bit computing: At the time it was indeed slightly overhyped, because while your OS was 64-bit, most software was still 32-bit, games in particular. So games couldn't really use more than 4 GB of memory (there's a quick sketch of the math after this list). And that stayed standard for multiple years after this article: this was 2008, 64-bit Windows had been out for ages, and yet 3 years later the original Skyrim release was still 32-bit. Games having 64-bit binaries included was a huge thing at the time. Now most software is 64-bit and yes, NOW it's standard.
High definition: Depends, did they mean HD or Full-HD? Because the former certainly didn't last long for most people. Full HD replaced it real quick and stayed around for a while. Of course, if they meant Full-HD then hell no, they were hella wrong, it's been mainstream for a while and only now is being replaced by 1440p and 4K UHD.
iPhone: The FIRST one as a singular product really didn't live up to the hype. It was missing features that old dumbphones had. Of course the overall concept very much did revolutionize the phone market.
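To put a rough number on that 4 GB ceiling from the 64-bit point above: a 32-bit binary's pointers can only span 2^32 bytes of virtual address space, no matter how much RAM is in the box (and 32-bit Windows handed each process only about 2 GB of that by default). Here's a minimal C sketch, purely as an illustration rather than anything from the article, that prints the limit for whichever way you compile it:

    #include <stdio.h>

    /* Minimal illustration of the 32-bit vs 64-bit address-space ceiling.
     * A 32-bit build has 32-bit pointers, so one process can address at
     * most 2^32 bytes = 4 GiB of virtual memory, regardless of installed RAM. */
    int main(void) {
        unsigned bits = (unsigned)sizeof(void *) * 8;  /* 32 on a 32-bit build, 64 on a 64-bit one */

        printf("Pointer size: %u bits\n", bits);
        if (bits < 64) {
            /* 2^bits bytes expressed in GiB (one GiB is 2^30 bytes) */
            printf("Addressable virtual memory: %llu GiB\n",
                   (unsigned long long)1 << (bits - 30));
        } else {
            /* 2^64 bytes is 16 EiB; real CPUs expose less, but it's far beyond any installed RAM */
            printf("Addressable virtual memory: 16 EiB\n");
        }
        return 0;
    }

Compile it with gcc -m32 test.c versus plain gcc test.c (assuming you have the 32-bit libraries installed) and you can see the 4 GiB ceiling that kept 32-bit games from using more memory, no matter how much was in the machine.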
Well, to be fair, changes like switching to 64-bit are always very slow (especially if they're not being forced by completely blocking 32-bit). But I don't think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games/apps we have now.
Well by 2008 we'd had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008 and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might think that it's not worth the effort in the personal computer space.
I feel like it finally reached a turning point in 2009 and became useful in the early to mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption, and in the 2010s we started getting 64-bit software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).
It was different for Linux and servers in particular, of course, where a lot of open source stuff had official 64-bit builds in the early 00s already (2003 for Apache, for example).
I had an HP Mini 311: Atom CPU, 11.6" screen, 3GB of RAM, wifi/bluetooth, a dedicated NVIDIA GPU, and an HDMI port. It was impressive at the time, and it could play 1080p using the GPU alone. I loved the form factor. Since then I favour 13" or 14" laptops; I don't want/need a 17.3" laptop with a DVD drive and all!
They're probably referring to streaming. I don't know when this article came out, but considering they're talking about literally the first iPhone I guess we can assume it's 2007 or 2008, and Netflix started streaming back in 2007.
Apart from that, perhaps they're referring to when Amazon started offering movies to buy or rent online, but I don't know when they started doing that.
I was going to say that I agree iPhones and smartphones are of no benefit, they don't do anything (I'm being sarcastic), but looking closer at the list, how did they get it all so very wrong?
Don't need 64 bit for more than 4GB? Every new computer should have 32GB and 64GB is not unreasonable.
Don't need full HD? How does 8K resolution sound with 16K being developed?
I question their basic knowledge of and experience with technological advancements: higher demands, more complicated workloads, and security protections like 64-bit address space layout randomization, which is far stronger than anything possible in a 32-bit address space.
Haha, I mean I'm using up 8 gigs of RAM with 10 Chrome tabs open, so 16GB is definitely starting to look a little more lightweight than I'd like at the moment. I built my PC in 2020 with 16GB, and around the end of 2021 I ended up upgrading with two more sticks, just to keep things running smoothly!
To be fair, Spore was overhyped - it was fun enough, but not the total gamechanger that it was forecast to be. Will Wright had two amazing hits with SimCity and then The Sims, followed by a whole pile of very middle-of-the-road simulation games, so it wasn't that hard to foresee.
And Eee PCs occupied the uncomfortable niche where they didn't do a lot that your phone couldn't, while being extremely limited compared to a £300 'proper' cheapo laptop. That's not really a business model.
So yeah, that's two things that anyone could have seen coming, versus eight where they're so massively, completely wrong they couldn't have failed harder if they tried. They'd have done better calling this list 'things which are not massively overhyped'.
While I don't think the iPhone is over-hyped, I've started fantasizing about going back to a dumb phone. The LG enV2 is still the best phone I ever owned.
To be fair, the only things that iPhone had going for it were the form factor and the screen. Releasing without 3G or picture messaging totally put me off of it when it came out. Obviously it got better though lol
Yeah, Apple invested a lot in getting the thing going, but it is certainly still a modern powerhouse. Even if iOS has a smaller worldwide market share than Android, Apple is huge in Japan, the US, and the UK.