Widescreen has been the movie industry standard for how many decades now? IMAX is its own beast, but most movies aren't filmed in true IMAX resolution anyway, and now there's digital IMAX, which is basically 1.90:1, close to the 16:9 of most TVs...
Movies used to be all 4:3 before TV; it's called the Academy ratio. Movies now do 1.85:1 and even 2.39:1. A few even do anamorphic 2.76:1. Anything but the dominant home format.
Major movie studios have mostly used widescreen since the 1950s and all the different ratios you mentioned except 4:3 are better watched on a widescreen TV than a 4:3 TV.
It's not as much about resolution as it was about exploiting the quirks of CRTs. Artists usually "squished" sprites horizontally (because CRT screens would stretch them) and used the now-famous "half dot" technique to get more subtle shading than was actually possible at the pixel level. So if you just display the original sprites with no stretch and no "bleed" between pixels, it doesn't look as good as it should.
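To make the "squish" concrete: several consoles output non-square pixels (the NES is usually cited as roughly 8:7), so an emulator has to stretch each sprite row horizontally to recover the intended proportions. A minimal sketch, with the 8:7 ratio and nearest-neighbor sampling as illustrative assumptions:

```python
# Sketch: why "squished" sprites look wrong on square pixels.
# Artists drew art slightly narrow, expecting the display chain to
# stretch it by the pixel aspect ratio (PAR). 8:7 here is just an
# illustrative figure; real hardware varied.

def stretch_row(row, par_num=8, par_den=7):
    """Nearest-neighbor horizontal stretch of one sprite row by par_num/par_den."""
    out_width = len(row) * par_num // par_den
    return [row[x * par_den // par_num] for x in range(out_width)]

# A 7-pixel source row becomes 8 displayed pixels at 8:7.
row = [0, 1, 2, 3, 4, 5, 6]
print(stretch_row(row))
```

Real CRTs did this stretch continuously in the analog domain, which is part of why integer nearest-neighbor scaling on an LCD never quite matches.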
That’s because the graphics were tailored to CRT resolution - which is to say, low/outright bad resolution.
No, it's because the graphics were tailored to the analog characteristics of CRTs: things like having scanlines instead of pixels and bleed between phosphors. If they were only tailored to low resolution they'd look good on a low resolution LCD, but they don't.
I admit I'm quibbling, but the whole thread is that, so...
CRTs don't have pixels so the resolution of the signal isn't that important. It's about the inherent softness you get from the technology. It's better than any anti-aliasing we have today.
I grew up on a 2600 on a TV in the '70s. Computer graphics on CRTs were incredibly jagged. If you looked at a pixel with a magnifying glass, it was a blurred, misconverged spot, because the beam didn't hit the shadow mask exactly on target.
The jaggedness of the 2600 wasn't because the TV itself was jagged; it was because the 2600 was so low-resolution (160x192, maximum) that it had to be upscaled -- naively, with no antialiasing! -- even just to get to NTSC (480 scanlines, give or take).
So yeah, when each "pixel" is three scanlines tall, of course it's going to look jagged even after the CRT blurs it!
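The "three scanlines tall" figure is in the right ballpark: 480 visible lines divided by 192 source lines is 2.5, so a naive nearest-neighbor upscale gives each source line either 2 or 3 scanlines, and that uneven alternation is itself a source of jaggedness. A quick sketch using the numbers from the comment:

```python
# Sketch: naively mapping 192 source lines onto ~480 NTSC scanlines
# (numbers taken from the comment above; real 2600 output timing was
# messier than this).

def scanline_runs(src_lines=192, out_lines=480):
    """For each source line, count how many output scanlines it occupies."""
    runs = [0] * src_lines
    for y in range(out_lines):
        runs[y * src_lines // out_lines] += 1
    return runs

runs = scanline_runs()
print(set(runs))  # → {2, 3}: source lines alternate between 2 and 3 scanlines
```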
A high-resolution LCD with anti-aliasing will do a better job than a low-resolution CRT. CRT shadow masks defined the limits of pixels, and it wasn't good even on computers that could output higher than 2600 resolution.
CRTs do have pixels. If they didn't, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
I certainly saw aliasing problems on CRTs, though usually on computer monitors that had higher resolution and better connection standards. The image being inherently "soft" is related to limited resolution and shitty connections. SCART with RGB connections will bring out all the jagginess. The exact same display running on composite will soften it and make it go away, but at the cost of a lot of other things looking like shit.
CRTs do have pixels. If they didn’t, you could run an SVGA signal (800x600 at 60 Hz) directly into any CRT. If you tried this, it would likely damage the tube beyond repair.
Would it, though? I'm skeptical.
If it did, it wouldn't be because they have "pixels," though; it would be because overdriving the deflection yoke with higher-frequency signals would generate too much heat for the TV to handle.
Otherwise (if it didn't overheat), it should "work." The result might look weird if the modulation of the signal didn't line up with the apertures in the shadow mask right, but I don't see any reason why sweeping the beam across faster would damage the phosphors. (Also, I'm not convinced a black & white TV would have any problem at all.)
It will tend to turn the beam on when it's off to the side, outside the normal range of the screen. X Windows users in the mid-'90s had to put in their exact scanline timings or else the screen could blow up. That went away with a combination of multiscan monitors and monitors being able to communicate their preferred settings, but those came pretty late in the CRT era.
Edit: in any case, color screens need to have at least bands of red/green/blue phosphor. At a minimum, there will be breaks along either the horizontal or vertical lines, if not both.
When you say "blow up" do you mean the tube would literally explode, it would burn through phosphors, a circuit board would let the magic smoke out, or something else?
I remember configuring mode lines in X. Luckily, I never found out the hard way what happened if you got it wrong.
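For anyone who never had the pleasure: a modeline spelled out the pixel clock (in MHz) and the horizontal/vertical sync timings by hand. This is the standard VESA 800x600@60 entry, nothing exotic:

```
# clock  hdisp hsync-start hsync-end htotal  vdisp vsync-start vsync-end vtotal
Modeline "800x600"  40.00  800 840 968 1056  600 601 605 628
```

Get the horizontal or vertical totals wrong and the monitor would try to sync to a scan rate outside its range, which on a fixed-frequency tube is exactly the "blow up" scenario described above.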
Maybe ones before the '80s didn't, but almost all of them do after that. Exactly how this worked varied between manufacturers and display types, but they tended to have some kind of mask that pushes them toward a preferred resolution.
CRT filters exist now, and with HDR output (or just sending an HDR-enable signal to get TVs to use the full brightness range) and 4K displays, it's honestly just as good at this point. Or better, because the only good CRTs you can get now are pretty small PVMs/BVMs, and my TV is much bigger than those.
There are plenty of upscalers with minimal latency that fix that.
There also isn't just "CRT" in this space. Professional video monitors give a very different picture than a consumer TV with only the RF converter input.
If one more under 25 retro fan tells me that RF tuners are the "true experience", I'm going to drink myself to death with Malort.
Edit: please don't tell me you believe CRTs have zero latency. Because that's wrong, too.
And most people were lucky to have a TV. You were lucky to have a HOUSE! We used to live in one room, all hundred and twenty-six of us, no furniture. Half the floor was missing; we were all huddled together in one corner for fear of FALLING!
Sometimes I think about how some technologies could have evolved if they hadn't fallen out of fashion. I always thought it's a bit unfair to compare products made decades ago with new ones and treat that as a judgment of the whole technology.
In the case of CRTs, it would be totally possible to make them with modern aspect ratios and resolutions. The greatest challenges would probably be size, weight, and power consumption.
For TVs, that's just because they didn't need any more resolution because the signal they were displaying was 480i (or even worse, in the case of things like really old computers/video game consoles).
My circa-2000 19" CRT computer monitor, on the other hand, could do a resolution that's still higher than what most similarly-sized desktop flat screen monitors can manage (it was either QXGA [2048x1536] or QSXGA [2560x2048], I forget which).
And then, of course, there were specialized CRT displays like oscilloscopes and vector displays that actually drew with the electron beam and therefore had infinite "resolution."
Point is, the low resolution was not an inherent limitation of CRT technology.
They did break, you know? My father fixed those things; it's just that they were actually fixable back then, and it was cool.
Or maybe it was just Russian tech that broke; we lived in one of those USSR satellite countries.
Stupid false nostalgia, just like the old C10 pickup trucks. They are rare now because they were SHIT, and nearly all of them were scrapped like they deserved.
My '96, quarter-million-mile Ford fuckin' Ranger is still running. I love it partly because it's shit. It's incredibly cheap, it hauls stuff, and I don't have to care about it. Similarly, anybody coveting a C10 knows exactly what they're getting into.
Also, I've still got a CRT TV in my back room and a couple of CRT monitors stored in the basement. I'm well aware that they're not as good as my LCD TVs and monitors in every single way, except that they're good for accurate retrogaming, so I keep them around for that purpose and that purpose only. (I'm also under no delusion of them lasting 50 years, contrary to the meme.)
Mine also has a manual transmission and lever-operated 4x4 transfer case, but is a regular-cab 2.3L.
I picked it on purpose because I wanted the most efficient 4x4 truck I could find, but now (with kids and with towing/hauling more than commuting) I'd be better off with one like yours.
Okay, so technically CRTs implode, but the result of the implosion can be an explosion. What happens with a CRT implosion is that the glass gets sucked into the back of the tube with so much force it'll bounce off the back and come out the front. So they kinda implode and explode. Combine that with the glass being leaded, and there's a reason you really shouldn't go out smashing CRTs.