Decades ago, the TV took five minutes to warm the tubes up before one could watch the news.
Today, the TV takes five minutes to boot, install updates, and mangle some configuration before one can (eventually) watch the news - if the TV has not lost its list of stations again.
I was once gifted a TV by a nice elderly guy. The TV had been cutting-edge technology when it was built: it had a wireless remote! Although the remote worked with ultrasound instead of infrared...
This beast took several minutes before it actually showed a picture.
I tell my laptop to put the video in the VGA port. It does. That’s it. There’s nothing plugged in, but it’s there.
I plug a VGA cable in. There’s video in there now. With enough paperclips, I could get it out the other end. My laptop does not care. It wiggles the electrons regardless.
I plug the other end of the cable in. The shielding was eaten by mice and two pins are dead. But lo and behold, purple though it may be - the video comes out and is displayed.
Meanwhile, HDMI protocol negotiation wants to know if you’d like to set your screen as the default sound device. Not that Teams would use it anyway. Actually, never mind, the receiving end doesn’t support the correct copyright protection suite. Get fucked, no video for you.
Feels like everything is much more of a faff to set up, then one bit updates & something or other is no longer compatible.
Don't even want to think about the waste it must generate, both of devices & of the hours spent trying to get things to work, whether at the development end or in the home.
At this point I don't understand why people bother with TVs rather than just hooking up an actual normal computer to a big screen and watching YouTube or torrenting media.
Even old flat screens are ridiculously heavy compared to new ones. I replaced an old Sony 720p screen that weighed probably 20 pounds with a 1080p smart TV of the same size that I could lift one-handed. And the new one cost less than $200.
I grew up with CRTs and VCRs, hard pass. There's a certain nostalgia to it all: the bum-DOOON sound as the electron gun warmed up, the smell of ozone and the tingly sensation that got exponentially stronger the closer you were, crusty visuals... But they were objectively orders of magnitude worse than what we have now, if only because the new ones don't weigh 150 pounds or make you wonder whether watching Rugrats in Paris for the 30th time on that monster is giving you cancer.

Maybe it's because I'm a techie, but I've never really had much issue with "smart" TVs. Sure, apps will slow down or crash because of memory leaks and it's not as customizable as I'd like, but I might be content just knowing that if push comes to shove I can plug in a spare computer and use it like a monitor for a media system.
I'm rooting it if it starts serving me out-of-band ads, though.
I feel like I've missed something. I don't dispute any of the horrible experiences people have had, but I've had nothing but good luck. The only thing about our current television that bothers me is the promotional wallpaper that gets applied every-fucking-time a new Disney property needs advertising. We buy relatively modestly priced units in the $300-$500 range, so maybe we just have different expectations than someone buying a much more high-end unit. It is also possible that it has been pure luck and I'll reply to this message one day soon to recant everything.
Some people don't base their entire personality around hating the existence of ads and jumping through an outrageous number of steps to avoid them.
So yeah, count me in for a TV that always works how I want it to but has a background ad that I can completely ignore and has no actual bearing on my life.
There are so many more important things for me to spend my time and energy worrying about.
This is less an issue of "smartness" and more that analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place. HDMI hits kind of a weird spot because it's a digital protocol built around analog-style scanlines; if the signal gets disrupted for 0.02 ms, it might only affect the upper half of the picture and maybe shift the bits for the lower half. The digital data is more contextual, but it resynchronizes at least every frame, so this kind of degradation is also unstable.
> analog signals degrade gracefully whereas digital signals are all or nothing unless specific mitigations are put in place
Not really. Digital signals come over analog media, and it's up to the receiver to decide how much degradation is too much. Mitigations like error correction are intended to reduce the final error count to zero, but it's up to the device to decide whether it shows/plays something with some errors, and how many of them, or whether it switches to a "signal lost" mode.
For example, compressed digital video degrades relatively gracefully: full frames come every Nth frame and are further subdivided into blocks, and each block can fail to decode on its own without impacting the rest. Intermediate frames only encode block changes, so as long as the decoder manages to locate the header of a key frame, it can show a partial image that gets progressively more garbled until the next key frame. Even if it misses a key frame, it can freeze the output until it manages to locate another one.
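To make that behaviour concrete, here's a minimal sketch in Python of such a decoding loop. It's not any real codec's API; the frame/block structures and the names (Block, decode_stream) are invented purely to illustrate the key-frame/delta-block idea described above.

```python
from dataclasses import dataclass

@dataclass
class Block:
    data: bytes   # decoded pixel data for this block
    ok: bool      # did the block survive transmission / pass its checksum?

def decode_stream(frames, blocks_per_frame):
    """frames: iterable of (kind, {block_index: Block}), kind being 'key' or 'delta'.
    Yields one displayable frame per input frame, degrading instead of failing."""
    canvas = [None] * blocks_per_frame   # last known-good contents of each block
    synced = False                       # have we located a key frame header yet?

    for kind, blocks in frames:
        if kind == "key":
            synced = True                # (re)lock onto the stream
        if synced:
            for i, blk in blocks.items():
                if blk.ok:               # a corrupt block simply keeps its old contents
                    canvas[i] = blk.data
        # If we never synced (key frame missed), the canvas isn't touched at all:
        # the output freezes until the next key frame shows up.
        yield list(canvas)
```

Corrupt delta blocks just leave stale pixels on screen, which is exactly the "progressively more garbled until the next key frame" effect.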
Digital audio is more sensitive to uncorrected errors, which can cause high-frequency, high-volume screeches. Those need more mitigations, like filtering to a normalized volume and frequency distribution based on the preceding blocks, but they still allow a level of graceful degradation.
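As a crude illustration of the audio side: the comment describes filtering toward a normalized volume and spectrum, but a much simpler, related mitigation is repeat-and-fade concealment, where a bad block is replaced by an attenuated copy of the last good one rather than played as-is. A hypothetical sketch (assuming numpy; the block format is invented for the example):

```python
import numpy as np

def conceal(blocks, fade=0.5):
    """blocks: list of (samples, ok) pairs; samples is a 1-D array of floats,
    ok says whether the block decoded cleanly. Bad blocks are replaced by a
    faded repeat of the last good block instead of a full-volume screech."""
    out, last_good = [], None
    for samples, ok in blocks:
        if ok:
            last_good = np.asarray(samples, dtype=float)
            out.append(last_good)
        elif last_good is None:
            out.append(np.zeros(len(samples)))   # nothing to repeat yet: silence
        else:
            last_good = last_good * fade         # repeat-and-fade the previous block
            out.append(last_good)
    return np.concatenate(out) if out else np.array([])
```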
I hate smart stuff so much. It's fucking impossible to find a good TV that doesn't have all of this shit thrown in now. I just want a nice display. And that's it. But what's worse is when it not only comes with awful software but they also take "the Apple route" for their features/services. So you have an issue like this and you can't do anything about it.
What am I talking about, you say? "The Apple way", but I also like to call it the "fucking magic" syndrome. The "fucking magic" syndrome is when something is supposed to be "magic", to "just work", BUT WHEN IT DOESN'T... you're shit out of luck. :)
Because you see, it's supposed to just work. It's absolutely inconceivable for it to not just work. So the people who made it, who never even for a second considered that it might fail, never took the time to implement some kind of failsafe in the UI to let you actually force the thing to do its thing on the off chance that the rabbit just refuses to come out of its hat. So when the magic fails, you're stuck.
Anyone who's had to update their AirPods knows exactly what I'm talking about. They're supposed to update themselves, without you doing anything. But every now and then... THEY FUCKING DON'T AND THERE IS ABSOLUTELY ZERO WAY TO FORCE THEM TO DO SO! You just have to wait with your AirPods in their case with the lid open, for the moon to be in the correct position or something.