From there, each application can draw its own GUI and respond to events in its panes (e.g. a mouse-button-down event while the cursor is at some coordinates) using event capabilities. What an event or the contents of a pane mean to the application is of no concern to the OS; the application has full control over its resources and its execution environment, with the one restriction that it may not do anything that could harm any part of the system outside its own process abstraction. That's my rationale for why the display system and input events should work that way.

Keeping all of that in the kernel also helps latency, especially since we're doing all the rendering on the CPU and are therefore bottlenecked by the CPU's memory bus, which has far lower throughput than a discrete GPU's. But that's the way it has to be: as far as I know, there are basically no GPUs with full, publicly available hardware documentation, and believe me, I've looked far and wide and asked around. Eventually I'll want to port Mesa, because redoing all the work to develop something that complex and huge just isn't pragmatic.