HDMI is a proprietary, royalty-encumbered standard, added to devices by the consortium members who own it. DisplayPort is the open, royalty-free equivalent that the educated consumer goes looking for.
Thank you, I did find myself thinking there's a reason I use DP cables for my PC monitors, which don't seem to have any issue running high resolutions... But then I'm not running 8K on anything, so I wasn't really sure about that.
That's just a commercial display. Most commercial displays don't have an OS and require a separate device for showing video like an Nvidia Shield, PC, etc.
Is DVI completely out of the picture? I hate the connector, but I've had a lot of issues with DP, mainly around Linux support and multi-monitor setups.
I was kinda hoping USB-C/4/Thunderbolt would step into this space and normalize chaining and small connectors, but all of those monitors are stupidly expensive.
I also find USB to be limiting when it comes to range. I can go about 50 feet with a nice thick copper HDMI cable, but going much past 20 feet on USB necessitates fiber optics. Not an issue for everyone, but something I have been running into.
DVI isn't capable of the bandwidth needed for higher resolutions. Even dual-link maxes out around 8 Gbps, which is good for roughly 2560x1600 @ 60 Hz. This new HDMI spec is 96 Gbps, for reference.
Ironically, though, HDMI is pin-compatible with DVI, and you can output HDMI to a DVI monitor with just a simple passive HDMI-to-DVI cable, or vice versa. I know a lot of people who like DP, but in order to convert you need active circuitry, and that can impact quality if you don't have native DP on both ends.
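To put rough numbers on that bandwidth gap, here's a quick back-of-the-envelope sketch (pixel payload only; real link rates are higher once you add blanking intervals and encoding overhead like TMDS 8b/10b, so treat these as lower bounds):

    def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
        """Raw pixel payload in Gbit/s, ignoring blanking and line-code overhead."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    # A few common modes (8-bit RGB = 24 bpp unless noted)
    modes = [
        ("2560x1600 @ 60 Hz (dual-link DVI territory)", 2560, 1600, 60, 24),
        ("3840x2160 @ 60 Hz (4K)", 3840, 2160, 60, 24),
        ("7680x4320 @ 60 Hz (8K, 10-bit)", 7680, 4320, 60, 30),
        ("7680x4320 @ 120 Hz (8K, 10-bit)", 7680, 4320, 120, 30),
    ]

    for name, w, h, hz, bpp in modes:
        print(f"{name}: ~{pixel_data_rate_gbps(w, h, hz, bpp):.1f} Gbps")

That comes out to roughly 5.9, 11.9, 59.7 and 119 Gbps respectively, which is why dual-link DVI's ~8 Gbps tops out around 2560x1600, and why even a 96 Gbps link leans on compression like DSC for the most extreme 8K modes.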
There are a few broadcast 8K channels in Japan and South Korea. There are some 8K YouTube videos, and 16K is being worked on. 8K is pretty awesome, though I really just want 8K screens for large PC monitors. I currently use a 4K 43" and 8K would be even better at that distance. Both Samsung and Sony have 8K screens for sale right now, and they're not really that crazy expensive for cutting edge (75" Samsung 8K QLED for $3k).
I haven't even gotten on the 4k bandwagon yet. I fully expected to by now, but then again, my eyes aren't getting any better and 1080p content still looks... fine.
I have to filter out all the 4K feeds I get on Kodi because I can't play them. I sure haven't seen a shortage of them. Now whether they play at an actual 4K would be the question, but they've been there for years.
Higher refresh rates for movies are meh at best; VRR, OTOH, is a godsend, since 24 fps just won't fit evenly into 60 Hz. Gaming, too, is much nicer when you have VRR; figures that delayed frames are quite a bit less noticeable than dropped frames.
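For anyone wondering why 24 Hz "won't fit": 60 / 24 = 2.5, so without VRR each film frame has to be held for an uneven number of refreshes, the classic 3:2 pulldown. A quick sketch of the cadence:

    # 3:2 pulldown: hold each 24 fps film frame for 3, then 2, of the display's
    # 60 Hz refreshes; 24 frames * (3+2)/2 = 60 refreshes per second of film.
    def pulldown_cadence(num_film_frames):
        schedule = []
        for frame in range(num_film_frames):
            holds = 3 if frame % 2 == 0 else 2
            schedule.extend([frame] * holds)
        return schedule

    print(pulldown_cadence(4))        # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
    print(len(pulldown_cadence(24)))  # 60 refreshes for 1 second of film

With VRR (or a display that can actually refresh at 24/48/120 Hz) every frame gets equal screen time, which is why the judder goes away.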
A few weeks ago I watched Ladyhawk on a 13" TV with a built-in VHS player. I realized that my brain didn't care about the quality as soon as I started paying attention to the content. I still like my 1080p, but there's definitely massively diminishing returns after that.
A buddy of mine worked in a theatre and told me that the films were all 1080p. I called bullshit; those screens were huge, they were clearly 4K. He showed me the reel, and yup, he was right.
If theatres don't even bother with 4K, your TV doesn't need 8K.
Actual film doesn't work like that (35mm or 70mm IMAX for example), but you are correct that most cinemas these days are digital and they use "1080p" (more accurately DCI 2K which is 2048×1080 when the aspect ratio is 1.90:1). There are a few that do 4K, but overall not that many.
The main reason that's enough for cinema, though, is that those "1080p" films are something like 500 GB with very little compression, displayed through a DLP projector, so they look a heck of a lot better than a Blu-ray shown on a massive TV with palm-sized pixels.
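Back-of-the-envelope on the bitrate gap (the ~500 GB size and the 2-hour runtime here are just illustrative assumptions):

    def avg_bitrate_mbps(size_gb, runtime_minutes):
        """Average bitrate in Mbit/s for a given file size (decimal GB) and runtime."""
        return size_gb * 8 * 1000 / (runtime_minutes * 60)

    # Hypothetical 2-hour feature delivered as a ~500 GB DCP
    print(f"~{avg_bitrate_mbps(500, 120):.0f} Mbps")   # ~556 Mbps

Somewhere around 550 Mbps on average, versus the ~40 Mbps ceiling for video on a standard Blu-ray, is an order of magnitude more bits per frame, which is a big part of why the "2K" image still holds up on a huge screen.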
The HDMI standard needs to declare cable bankruptcy and start over with a new port design. We all have way too many HDMI cables supporting 23 years of standards, and there is nothing in the specification that clearly labels, across brands, which HDMI version a given cable or port supports.
Also, the DRM baked into the specification is such bullshit.
That's the one thing they have absolutely no interest in getting rid of. They'll change everything about the spec, including the connector, but that part's staying in.
They need to switch to fiber optic or a simple coaxial cable (like SDI) with BNC connectors. That would end this madness of long cables being outrageously expensive and the connectors being flimsy.
They also need to separate audio back out into its own thing.
I believe 4K is already basically there. I have a 50" 4K (2160p) screen that I sit 9 feet away from, and based on the Nvidia PPD calculator that works out to about 168 ppd; according to that page, 150 ppd is around the upper limit of human vision. Apple's "retina" displays target around 50-60 ppd (it varies with the assumed viewing distance), which is what most people seem to consider average visual acuity. Imo 4K / 150 ppd is more than enough.
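For anyone who wants to check their own setup, the math behind those ppd numbers is simple; this is a sketch of what I assume calculators like Nvidia's do (horizontal pixels divided by horizontal field of view, flat 16:9 panel):

    import math

    def pixels_per_degree(diag_inches, h_pixels, v_pixels, distance_inches):
        """Horizontal pixels per degree of visual angle for a flat panel."""
        aspect = h_pixels / v_pixels
        width = diag_inches * aspect / math.hypot(aspect, 1)   # panel width in inches
        h_fov_deg = 2 * math.degrees(math.atan(width / (2 * distance_inches)))
        return h_pixels / h_fov_deg

    # 50" 4K viewed from 9 feet (108 inches)
    print(round(pixels_per_degree(50, 3840, 2160, 108)))   # ~168

Plug in your own diagonal and distance; for instance, a 65" 4K works out to around 100 ppd from roughly 7 feet.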
According to this calculator, my 65" 4k setup is around 100ppd.
I find that anything with a higher density than that (e.g. sitting further away, or replacing it with an 8k screen of the same size) requires scaling up text and wasting a lot of pixels when rendering other things.
So yeah, I think 8k is a total waste if you're not targeting a much higher fov, at which point a curved screen would probably be better.
Maybe some applications like these could need a high density just for the size of it, but then again you're not likely to be looking from a living room distance either. Or things like VR where you're looking from very close up.
My biggest screen is a 55" 4K and I just don't get why you would need much more unless it was a full on theater setup.