What I'd really like is a TV with DisplayPort. How is this not a thing? IIRC you cannot buy a display with DP that's larger than 45 inches, give or take - they just don't exist. I think this is really weird. Like, I'd pay an extra $100 for that port, but I'm just not allowed to have it.
I absolutely love my Aorus 48" OLED-type display (w/ DisplayPort).
I tried a 48" TFT-type television (attempting use as a computer display) and the refresh rate just wasn't there, along with typical backlight splotching (but it cost a fifth as much, so...).
My only caution is that OLED can experience burn-in (unlike the smaller Aorus 45", which uses a VA-type panel), but it is otherwise a much better experience.
The other limitation is lower brightness than miniLED monitors, around 30-60% of the nits in SDR. Whether that matters obviously depends on the ambient light or reflective surfaces near you.
For me, because I'm next to a big window and already squinting at my 400-nit IPS monitor, a sub-300-nit OLED is a non-starter, but a 600-nit-in-SDR IPS miniLED is ideal.
This limitation should be temporary, however: some high-nit OLED TVs are coming to market in 2025, so bright 27-43" OLED monitors will likely follow.
330 nits in SDR is good relative to other OLED monitors and good enough for most indoor environments, but not good enough for mine: the windows are too big and untinted, so there's just too much ambient light for anything below 500 nits.
As far as I am aware, after having done exhaustive research on this, it's licensing costs and popularity. DisplayPort simply isn't popular enough. The vast majority of TV manufacturers (not brands, mind you; many brands white-label their manufacturing) also make monitors, and HDMI adoption across both TVs and monitors was not only much higher, it was cheaper overall, since the same components could be shared across product lines. That in turn was driven by cheaper licensing costs for accessory manufacturers (like Blu-ray players).
It's also easier to implement, if I recall correctly.
This is the essential core of it, as I have come to understand it anyway.
New Hisense TVs have USB-C DisplayPort support. Pretty cool, but realistically I don't see how it's different from HDMI from a usefulness standpoint.
Edit: It is cool I can plug my phone or laptop into the TV with one cable, no adapters, and get some power as well. For some reason it didn't work with my Steam Deck which was strange.
This. I was reading about some of the ugly hacks Valve has had to resort to to get HDMI 2.1 working on the Steam Machine. They (the HDMI consortium, whatever it's called) won't let you implement 2.1 if your video drivers are FOSS. Since the SM has open drivers for the AMD card, it's leading to suboptimal video output at certain resolution/framerate combos (4K@120fps? Something like that), and they can't legally advertise HDMI 2.1 support.
There absolutely are ways to do this: some motherboards have a DP-In connector that is routed to the USB4 ports. One example is the ProArt X670E.
It’s nice but OLED contrast is very hard to beat, and if you’re one of those folks who insist that ‘a white wall is good enough’ then it’s not even the same ballpark of image quality.
I have a NEC P462 display with DP, among other things. It's about the size you said, so maybe you're right, but my first thought is that there have gotta be bigger displays for digital signage, and why wouldn't they have DP if this one does? NEC and Samsung both make these types of displays, IIRC, not sure who else.
As opposed to the DisplayPort cable, DisplayPort standard, or DisplayPort encoding that's sent over the wire, yes. This isn't a PIN number situation despite the stutter.
I tried to buy a good 32 inch TV. This is also hard. I ended up going a little smaller, and even then the utterly trash built-in speakers frustrate the hell out of me.
A 32" 4k 240hz OLED computer monitor + smart TV HDMI dongle + external speakers should work fine. Only point I would check is if the remote that comes with the dongle can turn on the monitor.
Why would you want such a thing? HDMI 2.1 does HDR 4k @ 120hz without compression. The entire TV ecosystem uses HDMI. If you want to connect a PC to a TV they always have at least 1 HDMI out, and some have a couple.
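The bandwidth claim checks out on a back-of-the-envelope basis. A rough sketch, in Python; the link capacities below are approximate effective (post-encoding) data rates, not exact spec figures, and blanking overhead is ignored:

```python
# Back-of-the-envelope link-bandwidth check for 4K @ 120 Hz, 10-bit RGB.
# Figures are approximate; real timings add blanking-interval overhead.

def raw_bandwidth_gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Uncompressed pixel-data rate in Gbps, ignoring blanking."""
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

needed = raw_bandwidth_gbps(3840, 2160, 120, 10)  # ~29.9 Gbps before blanking

# Approximate effective data rates of common links:
links = {
    "HDMI 2.0 (TMDS)": 14.4,
    "DP 1.4 (HBR3)": 25.92,
    "HDMI 2.1 (FRL)": 42.67,
    "DP 2.0 (UHBR20)": 77.37,
}
for name, capacity in links.items():
    verdict = "OK uncompressed" if capacity >= needed else "needs DSC"
    print(f"{name}: {verdict}")
```

Which is why 4K 120 Hz 10-bit is comfortable over HDMI 2.1 but needs DSC (or chroma subsampling) over DP 1.4.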
Because HDMI 2.1 uses a proprietary protocol that's not implemented in any free OS[0]. If you want to use HDMI 2.1 features right now, your only option is a non-free OS like Windows or macOS.
From a purely technical point of view, I do wish HDMI 2.1 were able to gain traction. On a couple of things I own that actually use it, it's a noticeable improvement, and I feel it does a better job than DisplayPort.
Granted, I suspect quite strongly the next wave of consolidation is going to continue the trend of being around USB-C, since the spec should have the bandwidth to handle any video / audio protocols for quite some time. Matter of time until that happens IMO.
It also lets you have a single cord that could theoretically be your power cord and your A/V cord.
From a purely technical standpoint, DisplayPort is the better standard. HDMI couldn't get their shit together to do anything with USB-C, and thus all USB-C to HDMI converter cables run DisplayPort internally.
DisplayPort already allows multiple video streams, audio streams... why do we need a closed standard to also do this?!
Not really. That same link talks about how Intel and nvidia drivers can provide HDMI 2.1 on Linux but it is via their non-free firmware blob.
AMD doesn't (can't? won't?) do the same but there is a workaround: a DisplayPort to HDMI adapter using a particular chip running hacked firmware. That'll get you 4K 120 Hz with working FreeSync VRR.
I don't remember where, but somebody explained that the adapters also have some kind of limitation. I can't remember what, but they went into deep detail, and the whole thing is revolting. Governments should protect open source.