play_ac
Joined 36 karma

  1. No, crypto exchanges are only profitable as a result of massive wash trading and scamming. If they had to actually compete the margins would be hilariously low. Probably even lower than a typical bank because the product is just worse.
  2. For starters, any of the money laundering crimes that CZ just pleaded guilty to. That's what any of these cryptocurrencies mean when they say transactions can't be tracked.
  3. >Uh... don't expose your X.org server to the internet naked.

    This is not something the X maintainers can say. They can encourage people not to do it, but if they stop maintaining that feature then the complaints start to roll in because someone somewhere was using it. If you think this situation is awful then yes, you're starting to get it: X is in a bad spot where these broken insecure features are holding everything else back and will continue to do so as long as people depend on it. At best they can disable it by default and make it hard to accidentally re-enable it, which is what they've already done.

    >That's not something a normal desktop install does.

    Yes, most normal desktop installs don't use X11 in any capacity. They use Microsoft Windows.

    >It's not actually a problem that my applications are powerful and can do what I want them to do.

    I notice you didn't actually respond to my comment about stopping using passwords and private keys and running everything as root. Because I'd bet even you draw a line somewhere, in a place where you think it's a risk to give an application too much power.

    >It is a problem that other locked down OSes like Macs and smartphone systems are not in the user's control and programs cannot do many things by design.

    This has absolutely nothing to do with Linux, and it's not actually a problem on those systems either. If you have root on the system then you are in control and can do whatever you want anyway. The purpose of setting security boundaries and not running everything as root is that not everything needs to access everything else all the time. The security model you're suggesting became obsolete by the mid 1990s.

    And let me say this again so it's perfectly clear. When you use X11 there is effectively no security boundary between any X11 clients. So if you start up your root terminal or you use sudo or anything else like that, then any other X11 client on the system also gets root. This is unacceptable and I can't believe I still have to continually point this out to long-time Linux users who should be technical enough to understand. It doesn't even matter if you personally think it's fine to run everything as root: maybe you do. But as a user you should have enough understanding of the system to know that this absolutely is not ok for lots of other users and it's simply not appropriate to be shipped as the default in the year 2023.

    These are not fantasy issues, these are actual issues that the underlying system was purposely designed to fix. X11 pokes a huge gaping hole in it.

    >sharing keyboard/mouse with synergy/barrier/etc is secure.

    No. On a typical X11 install it's not, because it relies on insecure APIs.

  4. That won't happen and would actually be much worse for Monero because it means everything becomes a giant target for thieves and scammers, even more than it already is. The reason it's failed is because the idea of cryptocurrency is fundamentally bad. Monero isn't even trying to hide it. The developers openly say that criminals should use it to commit crimes.
  5. "Usable" is a massive stretch. The only way most people will ever be able to use it is through a custodial wallet, so it's right back to bank accounts and centralized exchanges.

    But the whole thing is a distraction anyway. The majority of transactions happening off-chain means that Bitcoin is an utter failure at everything it ever set out to accomplish.

  6. The efficient ones are still outright scams if not blatantly illegal. All cryptocurrencies should die. After more than a decade it's clear by now that blockchains are a useless technology and the investors are getting more and more desperate to pass the bag.
  7. >Why do you think people cannot stand writing with VSCode for example?

    Which people? Every recent study I've seen shows VSCode as the most popular code editor by a large margin. Maybe latency isn't as important as you think?

    >Are you saying that latency in the order of 250ms when editing text is unnoticeable?

    No. Sorry for the info dump here but I'm going to make it absolutely clear so there's no confusion. The latency of the entire system is the latency of the human operator plus the latency of the computer. My statement is that, assuming you have a magical computer that computes frames and displays pixels faster than the speed of light, the absolute minimum bound of this system for the average person is 250ms. You only see lower response time averages in extreme situations like with pro athletes: so basically, not computer programmers, who spend much more time thinking about problems and going to meetings than they do typing.

    Now let's go back to reality: with a standard 60Hz monitor, the theoretical latency added by display synchronization is a maximum of about 16.67ms. That's the theoretical MAXIMUM assuming the software is fully optimized and performs rendering as fast as possible, and your OS has realtime guarantees so it doesn't preempt the rendering thread, and the display hardware doesn't add any latency. So at most, you could reduce the total system latency by about 6% just by optimizing the software. You can't save any more than that.

    However, none of those things are true in practice. Making the renderer use damage tracking everywhere significantly complicates the code and may not even be usable in some situations like syntax highlighting where the entire document state may need to be recomputed after typing a single character. All PC operating systems may have significant unpredictable lag caused by the driver. All display hardware using a scanline-based protocol also still has significant vblank periods. Adding these up you may be able to sometimes get a measurement of around 1ms of savings by doing things this way, in exchange for massively complicating your renderer, and with a high standard deviation. Meaning that you likely will perceive the total latency as being HIGHER because of all the stuttering. This is less than 1% of the total latency in the system and it's not even going to be consistent or perceptible.

    Now instead consider you've got a 360Hz monitor. The theoretical maximum you can save here is about 2.78ms. This can give you a CONSISTENT 5% latency reduction against the old monitor as long as the software can keep up with it. Optimizing your software for this improves it in every other situation too, versus the other solution which could make it worse. If it doesn't make it worse, it could only save another theoretical 1% and ONLY in a barely perceptible way. It just doesn't make sense to optimize for this less than 1% when it's mostly just caused by the hardware limitations and nobody actually cares about it and they're happy to use VSCode anyway without all this.
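    The arithmetic above is easy to check. This is just an illustration of the numbers in this thread (a ~250ms average reaction time against 60Hz and 360Hz refresh intervals), not a measurement of any real system:

```python
# Sketch of the frame-time arithmetic: how much of a ~250 ms human
# reaction time display synchronization could account for, at most.
REACTION_MS = 250.0  # rough average human reaction time, as stated above

def max_sync_latency_ms(refresh_hz: float) -> float:
    """Worst-case latency added by waiting for the next refresh."""
    return 1000.0 / refresh_hz

for hz in (60, 360):
    frame = max_sync_latency_ms(hz)
    share = frame / REACTION_MS * 100
    print(f"{hz:>3} Hz: up to {frame:.2f} ms of sync latency "
          f"(~{share:.1f}% of total reaction time)")
```

    This reproduces the figures in the comment: about 16.67ms (~6%) at 60Hz, about 2.78ms (~1%) at 360Hz, so swapping the monitor saves roughly 5% of the total.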

    So again, you can avoid these accusations of "utter nonsense" when it's clear you're arguing against something that I never said.

    >The perceived latency when editing text is between pressing a key and your brain telling you "my eyes have detected a change on the screen."

    Your brain needs to actually process what was typed. Prediction isn't helping you type at all, if it did then the latency wouldn't matter anyway. If you're not just writing boilerplate code then you may have to stop to think many many times while you're coding too.

  8. >I've never heard of anyone having an X11 security problem in the last 20 years.

    Here's 6 CVEs just from last month. Check the mailing lists and you'll see many of these going back for years and years.

    https://lists.x.org/archives/xorg/2023-October/061506.html

    https://lists.x.org/archives/xorg/2023-October/061514.html

    And before you say this is not what you meant, the X server and X client libraries do very little anymore besides parsing untrusted input and passing it somewhere else. That's their main purpose and they're bad at it. And because it's X, this input can also come from over the network, so every normal memory bug can also be an RCE. This is probably the single biggest attack vector on a desktop system aside from the toolkit. It's the exact wrong thing for anyone to grant access to every input on the system.

    This is not just my personal opinion or me giving anecdotes either, this is paraphrasing what I've heard X developers say after many years of patching these bugs. But that's not even the whole problem as I'll explain shortly.

    >But for actual computers you control it just isn't (a problem). Wayland for "security" is cargo culting smartphone user problems. It's not actually a real issue.

    Yes it is a problem and no it's not cargo culting. Practically speaking the X11 security model means every X client gets access to everything including all your passwords (and the root password) as you type them, and subsequently lets every X client spawn arbitrary root processes and get access to your whole filesystem including your private keys and insert kernel modules or do whatever. If you actually think this "isn't a real issue" then you should just stop using passwords, stop protecting your private keys, run every program as root, and disable memory protection: because that's what this actually means in practice. No I'm not exaggerating. The security model of X11 has no idea about client boundaries at all. This is completely unacceptable on any other OS but for some reason it's become a meme to say that only smartphones need to care about this. Really? Come on.
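    The boundary problem above can be shown with a toy model. This is not real X11 or Wayland protocol code; the class and names are made up purely for illustration. The difference is simply whether input is broadcast to every connected client (the X11-style model) or delivered only to the focused client:

```python
# Toy model (NOT real protocol code) of the two input-routing models
# described above: broadcast-to-all vs. deliver-to-focused-only.

class ToyDisplayServer:
    def __init__(self, broadcast_input: bool):
        self.broadcast_input = broadcast_input  # True ~ X11-style, False ~ Wayland-style
        self.clients = []   # (name, received-events log) pairs
        self.focused = None

    def connect(self, name):
        log = []
        self.clients.append((name, log))
        return log

    def key_event(self, key):
        for name, log in self.clients:
            if self.broadcast_input or name == self.focused:
                log.append(key)

# Broadcast model: an unfocused client still sees everything typed.
x11ish = ToyDisplayServer(broadcast_input=True)
terminal = x11ish.connect("root-terminal")
snooper = x11ish.connect("random-app")
x11ish.focused = "root-terminal"
for ch in "hunter2":
    x11ish.key_event(ch)
print("".join(snooper))  # the unfocused client captured the password: hunter2
```

    In the focus-based model the second client's log stays empty; that's the client boundary Wayland enforces and X11 doesn't.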

    >I use the keyboard/mouse sharing in X11 (via synergy) and I have for 20 years. It is vitally important to my workflow. It works on dozens of different OSes including linux. But not the waylands linuxes. Any graphical environment that can't do this is useless to me.

    X11 can't do it securely so I would say that's as useless as not implementing the feature, if you have to compromise your security in order to get it.

    The feature will be implemented in Wayland eventually when the design for a secure API is finished. There are people working on it now. In comparison, X11 is probably never going to gain a secure way to do that.

  9. Can you tell me which part of this you're referring to?
  10. No, really. Those APIs are too low level to be useful for normal applications. Nothing in them is useful for games at all. I don't know why you think it's appropriate to put in these insults either. Cut it out.
  11. >Now that's an ahistorical conspiracy theory

    No? Where exactly do you think I've theorized about the existence of a conspiracy? Because I've actually said the exact opposite: there isn't a conspiracy and no one is cooperating at all. There's no evil group of developers secretly planning to sabotage everything. It's just the usual bad communication and planning that happens with a distributed team.

    >Those diverse desktop environments contributed hugely to GTK, GNOME just didn't use their work

    Can you name what any of these contributions were? Because I've never seen them. I've seen contributions here and there, lots of minor bug fixes, but nothing major.

    >Nobody is going to fully "kiss the ring" unless they get something out of it

    Avoid this rhetoric please. These open source projects are a volunteer collaboration. No one's kissing any rings or trying to get something out of the maintainers, other than the usual: everyone helps each other write and maintain the code.

    >but they could have done a lot better than fighting third-parties tooth-and-nail. GNOME should be a proud project that leads the GNU movement

    I really don't know what you're talking about here, but disagreeing about technical things isn't "fighting tooth-and-nail". That's a normal part of any project.

    Personally I don't think anyone should care about leading the GNU movement, that's been plagued by petty infighting and drama since the very beginning.

  12. Your comment has nothing to do with the conversation. The reason to have low latency when typing text is so you can correct mistakes. That requires the full response time. There's no moving or evolving shapes. Maybe proofread your own comments before throwing around accusations of "utter nonsense".
  13. The Subsurface developer did that 10 years ago and it was only because he personally preferred Qt. Take a step back for a moment and consider that in 10 years that's the only major example that anyone ever brings out. GTK is still very welcoming for contributions to maintain the GDK backends. Developers like that have to actually step up and do it and have patience, instead of outright quitting and running off to Qt which has a whole company to maintain those ports.
  14. This comment is pretty wrong. Vulkan still uses the GLSL compiler.
  15. You shouldn't use raw X11 or raw Wayland unless you're writing a low-level toolkit. If you're working on games, SDL should handle all that stuff for you.
  16. Keyboard/mouse sharing is completely unrelated to the Wayland protocol. Wayland is only concerned with sending input events to client windows. Generating and capturing global events is out-of-scope and it's an entirely different API. The way this works in X11 is a giant hack that requires multiple extensions and the end result is it compromises all security of those devices. It's even more delusional to pretend this was ever production-ready or that Wayland needs to be ready for anything here. The X11 implementation just shouldn't have been shipped at all.
  17. I realize it's probably a waste to say this to someone with your username, but getting angry at the situation is futile. You shouldn't use Linux if you're not used to random stuff changing and breaking by now and you're not comfortable adapting to those changes. Doubly so for a rolling release distro like Arch. X was obsolete and a security disaster last decade, holding onto it for another decade is just masochism. If all this is too much trouble for you to run a Unix-like desktop and keep it updated, there's always macOS. They never even made the initial mistake of using X.
  18. GIMP has about 3-4 part-time developers and no designers. They have no resources to redesign the user interface even though it's been wanted for a long time. It's taken them an extremely long time just to get GIMP 3 out the door and that's just a port without any major UI changes. But I agree otherwise, the horrible name is completely on them.
  19. No, that's an outlandish conspiracy theory and completely ahistorical. GTK was always developed on Linux first, and before it was used by GNOME it had a lot of GIMP-specific functionality that didn't extend well to other apps. Want to know why? Because GIMP and GNOME developers were the only ones contributing. Those "diverse desktop environments" almost always took from GNOME and contributed very little back. That's fine to do, but they need to accept that they don't call the shots when they do that. They don't get to pull their funding and then complain someone else is being a bad custodian, it doesn't work like that.
  20. Key word here being "might". What actually gets displayed is highly dependent on the performance of the program itself and will manifest as wild stuttering depending on small variations in the scene.

    I've seen no game consoles that allow you to turn vsync off, because it would be awful. No idea why this placebo persists in PC gaming.
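    The stutter mechanism is easy to sketch. This is an illustrative model only, assuming a 60Hz refresh: with vsync, a frame appears at the first refresh boundary after rendering finishes, so render times hovering around the frame budget flip the presentation time between one and two refresh intervals:

```python
import math

# Illustrative sketch of vsync quantization at 60 Hz: presentation is
# delayed to the next refresh boundary after the frame is ready, so small
# render-time variations around the budget double the displayed interval.
REFRESH_MS = 1000.0 / 60  # ~16.67 ms per refresh at 60 Hz

def present_time_ms(render_done_ms: float) -> float:
    """Time the frame actually appears: the next vblank after it's ready."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (15.0, 17.0, 16.0, 18.0):
    print(f"rendered in {render_ms:4.1f} ms -> shown at "
          f"{present_time_ms(render_ms):5.2f} ms")
```

    A frame finished in 15ms and one finished in 17ms are shown a full refresh interval apart even though their render times differ by 2ms; that jump back and forth is the stutter.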

  21. The average human reaction time is 250ms. The amount of latency you'd save from that on average is unnoticeable, and in exchange you get the appearance of stuttering and corruption from the tearing.
  22. Yes you can, what you would do is clip against the main framebuffer and leave the window area transparent by inverting the mask. Then you can have the other plane underneath the main one. This is only if your hardware supports that though.
  23. The server doesn't have enough information to do it because it doesn't know anything about window management. At best it could use a heuristic to guess, but one of the reasons Wayland combines the server and the compositor is so that we don't need to pile heuristics into the X server and drivers any more.

    Nothing in Xorg is a minor change because it has to be tested with every window manager and compositor before you can even think about merging.

  24. If you're writing a video player or a game or something else that wants direct scan-out, then you can disable the round corners in your CSS.
  25. >There is, or we would not be having subsurfaces on Wayland or this entire discussion in the first place.

    No. The subsurfaces in Wayland are only designed for two things:

    1. Direct scan-out, as in TFA. (Because a subsurface can be directly translated to a dmabuf)

    2. Embedding content from one toolkit/library into another. (Because without it, lots of glue code would be needed)

    It's discouraged to use them otherwise, they would complicate things for no benefit.

    >Are you seriously arguing that the only reason to using windows in Xorg is to have composition?

    If you mean sub-windows, yes, that and consequently because of the way that XRDB worked. I don't see why you would ever use them just for input events within the same toolkit, they don't do anything special there.

  26. >I am using through Xorg a program which uses a frigging colormap

    That doesn't change what I said. The colormap is mapping indexes to RGB palette entries and technically doesn't even support any other color spaces. Nothing about that is "non-RGB". The other visuals are just various other ways to do RGB. If this isn't making sense to you, think about how this would be implemented in the driver.
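    As a sketch of that driver-side view (illustrative only, not real server code): a PseudoColor-style colormap is just a table mapping pixel indices to RGB triples, so what comes out the other end is still RGB:

```python
# Illustrative sketch of a PseudoColor-style colormap: pixel values are
# indices into a palette of RGB triples, so the resolved output is RGB.
palette = {
    0: (0, 0, 0),        # black
    1: (255, 255, 255),  # white
    2: (255, 0, 0),      # red
}

def resolve(pixels):
    """What the driver conceptually does at scan-out: index -> RGB."""
    return [palette[p] for p in pixels]

print(resolve([2, 0, 1]))  # [(255, 0, 0), (0, 0, 0), (255, 255, 255)]
```

    There's no non-RGB color space anywhere in that path; the indirection through the palette is the only difference from a direct-color visual.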

    >None of this is about the GPU, but about about directly presenting images for _hardware_ composition using direct scan-out

    I'm sorry? What do you suppose is doing the direct scan-out to the monitor if not the GPU?

    >As if that prevented any of the extensions done to X11 in the last three decades, including Xv.

    That's not relevant, XV actually does support running over the network as long as you don't use the SHM functions. But regardless, yes, it actually did. The main example being indirect GLX which hasn't been updated in decades because it's not feasible to run it over the network anymore.

  27. 1. The hardware overlay support is implemented on the GPU. It's not "instead of the GPU".

    2. The provided code in Haiku doesn't appear to support overlays.

    3. Yes, it's upsetting that a real API wasn't available for this until recently. But, it was of limited practical use without the entire display pipeline being moved to the GPU and without Wayland being established (X11 never had an API quite like this, classic XV is too limited to do what this is doing)

  28. >The only video filter I have sometimes used is deinterlacing

    I don't know about anything else, but ffmpeg has some deinterlacing filters that run on the GPU. So your one example is a bad one.

    >Anyway, discussing about this is besides the point, and forgive me from the rant above.

    Next time can you please not post the rant? It's not interesting to parse through all that just to get to the point. It's also extremely uninteresting to have this conversation like "well I haven't personally used that so it must not be important". VLC and ffmpeg are used by millions (billions?) of people, so can we at least agree that neither of our own very particular and personal use cases are that important?

    >But the entire point of TFA is to (dynamically) go back to a model where the GPU is _not_ in the middle.

    No? Overlays are entirely driven by the GPU. The entire reason these are performant is because it's zero-copy from a GPU buffer to an overlay.

    >And that model -- sans GPU -- happens to match what Xv was doing and is actually faster and less power consuming than to always blindly use the GPU which is where we are now post-Xv.

    I have no idea what you're talking about. XV (with a driver that supports it) uses the GPU to do the overlays. If you aren't using any filters, this should have the same power consumption as XV.

  29. >colorspace conversion, scaling

    There's a lot more than that. Please consider installing the latest version of VLC or something like that and checking all the available post-processing effects and filters. These aren't "fancy animations" and they're not rotating 3D cubes, they're basic features that a video player is required to support now. If you want to support arbitrary filters then you need to use the GPU. All these players stopped using XV ages ago, on X11 you'll get the GPU rendering pipeline too because of this.

    I don't really see the point of these condescending remarks suggesting that everyone is stupid and only interested in making wobbly windows and spinning cubes. Those have never been an actual feature of anything besides Compiz, which is a dead project.
