Wayland also doesn't even remotely resemble anything that would be fit to talk to modern graphics hardware. DMA buffers and DRM (Direct Rendering Manager, not digital restrictions management) are an afterthought, a separate protocol that isn't even fully stable yet. Vulkan usually doesn't work. Latency got worse with Wayland and won't get better, because "frames have to be perfect". Tons of unnecessary blitting, and latency be damned. All while tearing in Wayland is actually as bad as in X11.
Fixing X would require a protocol that is mostly X, but of course incompatible, because you have to rip out some protocol bugs. But Wayland isn't X minus the bugs. Wayland started as a little bit of broken bitmap-pushing and a whole lot of hot air. And even with tons of extension and auxiliary protocol development, multiplied by tons of unnecessary reimplementations across tons of compositors, it isn't even where X11 was when Wayland started. Wayland has fixed nothing yet, broken a lot, fragmented the community, and brought pain and misery.
> Wayland also doesn't even remotely resemble anything that would be fit to talk to modern graphics hardware. DMA buffers and DRM (Direct Rendering Manager, not digital restrictions management) are an afterthought
You don't use Wayland to talk to graphics hardware, you use Wayland to communicate with the display server.
The Wayland protocol lets apps negotiate an area to write their output to; how it gets written there is completely up to the application, whether it involves the GPU or not: OpenGL, Vulkan, etc.
This is in contrast to X, where the app uses X APIs to draw textures, which are then pulled by the compositor (a copy, so there goes latency/performance) and then sent back to the X server to display.
That is complete BS. The application doesn't talk to the graphics hardware alone and then just copy a bitmap into a Wayland buffer. You don't just magically talk to the GPU. There are these little problems called 'security', 'multiprocessing', and 'multiuser' in between.
> That is complete BS. The application doesn't talk to the graphics hardware alone and then just copy a bitmap into a Wayland buffer. You don't just magically talk to the GPU. There are these little problems called 'security', 'multiprocessing', and 'multiuser' in between.
This is literally what DRM/DRI is for...
which is not Wayland.
If you think the display server should handle applications using the GPU, then even Xorg dropped this approach.
DRM is literally the 3D acceleration driver framework for Linux. It has been around for decades and is the same set of drivers used for any sort of accelerated graphics in X.
If you want to get rid of DRM, you have to start over and rewrite all graphics drivers for Linux from scratch.
DMA is part of the basic architecture of modern computers. It is how you can have things like fast USB or network devices, because it allows device hardware to bypass the CPU and write directly to memory.
> DRM is literally the 3D acceleration driver framework for Linux. It has been around for decades and is the same set of drivers used for any sort of accelerated graphics in X.
Yes, exactly. And Wayland ignored it for years, from the start, and only later slowly adopted it as an extension.
> DMA is part of the basic architecture of modern computers.
Yes, I know.
DMA is even older and not limited to Linux or graphics. Back in the old VESA, SGI, and Windows 3.0 days, DMA was a cool new feature (though actually old even then). By the time Wayland was conceived, it was boring and old. Yet Wayland didn't originally include DMA buffers; it only added them later, as an extension, when it became obvious that they had just gotten rid of a 40-year-old feature that was really, really necessary for modern graphics...
> Yes, exactly. And Wayland ignored it for years, from the start, and only later slowly adopted it as an extension.
DRM is not part of Wayland, and Wayland does not use DRM. Wayland is the protocol between the display server and application, DRM is a functionality provided by the kernel to allow user space applications to use and share graphics hardware.
The display server can use DRM, as will applications wanting to use OpenGL/Vulkan, but these are not "wayland".
Did you even read the use case for this, or did you just Google 'Wayland DRM' and post the first link?
When VR headsets are exposed to user space, they appear as displays, and consequently the display server will control them (which isn't useful). This is just a protocol that lets clients (like games, or SteamVR) have that control transferred so they can drive the VR headset themselves, because the Linux kernel doesn't allow multiple applications to control the same display at the same time.
It does not make DRM/DRI part of Wayland. Again, it goes back to my original comment of "you use Wayland to communicate with the display server"
IIRC GNOME originally wanted to do this over Dbus, but there was opposition.