It's always, "Oh, well, you can no longer run two or three monitors, but your primary display is higher resolution now!" Except DPI scaling makes the extra resolution irrelevant, and now my (i)GPU has a higher minimum load.
Or, "Oh, well, we only give you 2 ports now, but they're all <port>!" Great, but those larger bandwidth ports don't offset the fact that I can't plug in as much any more, and USB hubs are not a solution, they're a hack, wildly variable in operation, and some devices are not compatible with them.
I prefer the old X11 approach over the replacement that modern desktop environments (and Wayland) use. I've been exclusively using high-DPI displays for much longer than macOS or Windows have supported them, and the old approach was much better.
There's some argument that you need to blur everything badly (instead of setting a session-wide DPI) when the user drives two displays with wildly different DPIs at once. That user is going to have a bad experience no matter what, so I've never understood that argument.
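(For context, the old approach really is one knob. A minimal sketch, assuming Xft-aware toolkits; 144 DPI is just an example value, pick whatever matches your display:

    # ~/.Xresources -- one session-wide DPI, honored by Xft-based toolkits
    Xft.dpi: 144

    # Reload the resource database, and tell the X server the same thing:
    xrdb -merge ~/.Xresources
    xrandr --dpi 144

Everything then renders crisply at that DPI; nothing gets rasterized at one scale and stretched to another.)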
I switched to Devuan, and things are much better, for now. It's unclear how long new software will keep working reliably under X11 without systemd.
Anyway, as a sighted user, my experience almost exactly matches the article's, toned down about 10x.
(Concretely, on the systemd side: I hit the same issues with PulseAudio, and the new session stack regularly perma-blanked my screen until I rebooted. I can't reliably share machines with family members because elogind is so bad.)
https://news.itsfoss.com/gtk-drops-x11/
https://arcan-fe.com/
From little things all the way to kernel lockdown breaking hibernate on a fully encrypted system, apparently because I should be happy to either have my laptop's battery killed by s2idle or disable Secure Boot. Yay, security.
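(For anyone hitting the same thing, both states are visible in sysfs. A quick sketch, assuming a recent kernel; the bracketed active values will differ per machine:

    # Lockdown state: Secure Boot distros commonly enable "integrity" mode,
    # which blocks hibernation because the resume image isn't authenticated.
    cat /sys/kernel/security/lockdown
    # -> none [integrity] confidentiality

    # Which suspend variant the kernel uses: if only s2idle is listed,
    # "suspend" drains the battery overnight instead of entering deep sleep.
    cat /sys/power/mem_sleep
    # -> s2idle [deep]

So with lockdown active, hibernate is off the table, and on machines whose firmware only offers s2idle there's no good option left.)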
I can only imagine the pain of all the accessibility issues on top of what I experience.