I thought audio might be the reason, but as far as I can tell, DisplayPort supports that too.
It took a long time to move from the old component input over to HDMI. The main thing that drove it was the SD-to-HD change. You needed HDMI to do 1080p (I believe; I don't know that component ever supported a resolution that high).
Moving from HDMI to DisplayPort is going to run into the same issue. People already have all their favorite HDMI devices plugged in and set up for their TVs.
You need a feature that people want, and that HDMI doesn't or can't provide, in order to incentivize a switch.
For example, perhaps DisplayPort could offer something like power delivery. That could allow things like media sticks to be powered solely by the TV, eliminating some cable management.
It already does: DisplayPort guarantees a minimum of 1.65W at 3.3V. Until very recently, HDMI only guaranteed something like 0.25W at 5V.
5W is about the minimum I'd say you need to do something useful; 25W would actually be usable by a large swath of devices. The Raspberry Pi 4, for example, needs about 10W, and Amazon's Fire Stick needs roughly 5W.
Sure. But it's ~6.6x more than what HDMI has historically guaranteed. It's pretty obvious to anyone with two neurons to spark together that the problem here isn't "the amount of power you can suck out of the DisplayPort connector". If it were, DP would have swept HDMI away ages ago.
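To put the numbers in this thread side by side, here's a rough back-of-the-envelope in Python. The wattages are the ones quoted above; the current figures are just inferred from those wattages and voltages, not taken from either spec.

    # Back-of-the-envelope comparison of guaranteed connector power vs. device needs.
    # Wattages are the ones quoted in this thread; currents are inferred from them.
    DP_POWER_W   = 3.3 * 0.50   # DisplayPort: 3.3 V at ~500 mA guaranteed -> 1.65 W
    HDMI_POWER_W = 5.0 * 0.05   # classic HDMI: 5 V at ~50 mA guaranteed   -> ~0.25 W

    devices_w = {
        "Amazon Fire Stick (approx.)": 5.0,
        "Raspberry Pi 4 (recommended)": 10.0,
    }

    print(f"DP guarantees ~{DP_POWER_W / HDMI_POWER_W:.1f}x the power of classic HDMI")
    for name, need in devices_w.items():
        verdict = "fits" if need <= DP_POWER_W else "still nowhere near enough"
        print(f"{name}: needs ~{need} W -> {verdict} (budget {DP_POWER_W} W)")

So yes, 6.6x more, and still a fraction of what even the lightest media stick draws.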
For the same sorts of reasons that, for decades, made nearly every prebuilt PC ship with an Intel CPU and Windows preinstalled: dirty backroom dealings. In this case, though, it's the consortium that controls HDMI doing the dealing, rather than Intel and Microsoft.
"But Displayport doesn't implement the TV-control protocols that I use!", you say. That's totally correct, but DisplayPort has the out-of-band control channel needed to implement that stuff. If there had been any real chance of getting DisplayPort on mainstream TVs, then you'd see those protocols in the DisplayPort standard, too. As it stands now, why bother supporting something that will never, ever get used?
Also, active DP -> HDMI adapters exist. HDR reportedly works through them all the time, and VRR often works too, but that depends on the specifics of the display.
In my case I have an HTPC running Linux with a Radeon 6600, connected via HDMI to a 4K @ 120Hz capable TV, and honestly, at that sitting distance/TV size and with 2x DPI scaling, you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try 1x DPI scaling.
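For anyone wondering why the subsampling shows up at all at 4K120 over HDMI, here's a quick back-of-the-envelope in Python. It counts active pixels only and ignores blanking and protocol overhead, so real link requirements are somewhat higher than these figures.

    # Very rough video bandwidth estimate: active pixels only, blanking and protocol
    # overhead ignored, so real link requirements are somewhat higher than this.
    def data_rate_gbps(width, height, hz, bits_per_channel, subsampling):
        # effective bits per pixel for the common chroma subsampling modes
        bpp = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling] * bits_per_channel
        return width * height * hz * bpp / 1e9

    for mode in ("4:4:4", "4:2:2", "4:2:0"):
        rate = data_rate_gbps(3840, 2160, 120, 10, mode)
        print(f"4K @ 120 Hz, 10-bit, {mode}: ~{rate:.0f} Gbit/s")

    # Prints roughly 30, 20, and 15 Gbit/s. An HDMI 2.0-class link tops out around
    # 14-18 Gbit/s, hence the fallback to 4:2:0 subsampling at 4K120.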
What you will lose, however, are the newer forms of VRR, and the link may be unstable, with lots of dropouts.
(I assume VRR = Variable Refresh Rate)
Most TVs will not let you set the refresh rate to 100Hz. Even if my computer could run a game at 100fps, without VRR my choices are either lots of judder or dropping down to 60Hz. That's a wide range of possible refresh rates you're missing out on.
V-Sync and console games do this too at 60Hz: if you can't hold 60fps, you cap the game at 30fps to prevent the judder you'd get from anything in between 31 and 59. The Steam Deck actually does not support VRR; instead, its display can be set to any refresh rate from 40 to 60Hz.
This is also sometimes an issue with movies filmed at 24fps on 60Hz displays: https://www.rtings.com/tv/tests/motion/24p
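The 24p case is just the mismatch between frame rate and refresh rate: 60/24 isn't a whole number, so frames get held for an uneven number of refreshes (the 3:2 pulldown cadence that rtings page measures). A tiny Python sketch of the cadence, with made-up helper names:

    # 24 fps content on a 60 Hz panel: 60/24 = 2.5 refreshes per frame, which isn't a
    # whole number, so frames are held for 2 and 3 refreshes alternately (pulldown judder).
    def cadence(fps, hz, frames=8):
        """How many display refreshes each of the first few frames gets held for."""
        ticks = [(i * hz) // fps for i in range(frames + 1)]  # refresh index where each frame starts
        return [b - a for a, b in zip(ticks, ticks[1:])]

    print(cadence(24, 60))    # [2, 3, 2, 3, 2, 3, 2, 3]  -> uneven, visible judder
    print(cadence(24, 120))   # [5, 5, 5, 5, 5, 5, 5, 5]  -> even cadence, no judder
    print(cadence(30, 60))    # [2, 2, 2, 2, 2, 2, 2, 2]  -> even cadence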
Most single-player games (Spider-Man, God of War, Assassin's Creed, etc.) offer a balanced graphics/performance mode that runs at 40fps within a 120Hz refresh.
(Some games do support 120fps, but 120Hz output is also used to present a 40fps image in a 120Hz container, which improves input latency for games that can't hit 60fps at high graphics quality.)
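Same arithmetic as the 24p case above: a fixed-refresh display paces a frame rate evenly only when the refresh rate is an integer multiple of it, which is why 40fps sits cleanly inside 120Hz (3 refreshes per frame) but has no clean home at 60Hz. A quick check, again just an illustrative sketch:

    # A frame rate paces evenly on a fixed-refresh display only if it divides the
    # refresh rate; each frame is then held for the same whole number of refreshes.
    def paces_evenly(fps, hz):
        return hz % fps == 0

    for hz in (60, 120):
        ok = [fps for fps in (24, 30, 40, 60, 120) if paces_evenly(fps, hz)]
        print(f"{hz} Hz display paces evenly at: {ok} fps")
    # 60 Hz display paces evenly at: [30, 60] fps
    # 120 Hz display paces evenly at: [24, 30, 40, 60, 120] fps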