No surprise here, given the extent to which HLSL is already the de facto shading language for Vulkan.

Khronos has already mentioned at a couple of conferences that there will be no further work on improving GLSL, and given DirectX's weight in the industry, HLSL has kind of taken over.

Additionally, for the NVidia fans, it might be that Slang also gets a place in the Vulkan ecosystem; discussions are ongoing, as revealed in SIGGRAPH sessions.


My understanding was that dxc lacked support for compiling various HLSL features to SPIR-V (hence SM7 now), so there are still a bunch of Vulkan-focused projects like Godot which only support GLSL.

But yes, the games industry has been almost entirely HLSL since forever, and this is going to help remove the final obstacles.

Yep, DXC's HLSL-to-SPIR-V path especially was a big issue when it came to supporting new features from Vulkan.

Though I would still like to see if Slang can succeed, and I am always a bit afraid of Microsoft just dropping the ball somewhere.

What about WGSL though, the shader language of WebGPU? WebGPU is kind of Vulkan lite, but unlike with Vulkan, Apple is on board, and is actually the reason why WGSL exists as yet another shading language.

What about it? Nobody wanted WGSL; it's just an artifact of having to appease Apple during WebGPU's development, as you say. I don't see why it would be adopted for anything else.

The old WebGPU meeting notes have some choice quotes from (IIRC) Unity and Adobe engineers literally begging the committee not to invent a new shader language.

>The old WebGPU meeting notes have some choice quotes from (IIRC) Unity and Adobe engineers literally begging the committee not to invent a new shader language.

This was an interesting tidbit, so I tried to find the source for it. While I did not find it, I did find the December 2019 minutes[0] which has a related point:

>Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

I found this link via lobste.rs[1], which I found after reading this blog post:[2]

>Vulkan used a bytecode, called SPIR-V, so you could target it from any shader language you wanted. WebGPU was going to use SPIR-V, but then Apple said no

The lobsters thread also links to a relevant HN post:[3]

>I know, I was there. I also think that objection to SPIR-V wasn't completely unfounded. SPIR-V is a nice binary representation of shaders, but it has problems in the context of WebGPU adoption: It's so low level [...] It has a lot of instructions [...] Friction in the features we need, vs features Khronos needs. [...] there is no single well specified and tested textual shading language. HLSL doesn't have a spec.
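To give a sense of the "so low level / a lot of instructions" point: even a trivial expression expands into several SSA-style instructions in SPIR-V. A rough sketch of what `x = a * b + c` on 32-bit floats might look like in spirv-dis-style textual disassembly (the `%names` are illustrative, not real tool output, which would mostly use numeric IDs):

```
%float = OpTypeFloat 32            ; declare the f32 type
%a_val = OpLoad %float %a          ; load each operand explicitly
%b_val = OpLoad %float %b
%mul   = OpFMul %float %a_val %b_val
%c_val = OpLoad %float %c
%sum   = OpFAdd %float %mul %c_val
         OpStore %x %sum           ; write the result back
```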

The linked blog post from lobsters was also discussed on HN, which you also commented in.[4]

It would be great if you could find that Unity/Adobe discussion as I would be interested to read it.

[0] https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

[1] https://lobste.rs/s/q4ment/i_want_talk_about_webgpu

[2] https://cohost.org/mcc/post/1406157-i-want-to-talk-about-web...

[3] https://www.hackerneue.com/item?id=23089745

[4] https://www.hackerneue.com/item?id=35800988

> It would be great if you could find that Unity/Adobe discussion as I would be interested to read it.

https://github.com/gpuweb/gpuweb/wiki/Minutes-2019-09-24

Corentin: Web Shading Language — A high-level shading language made by Apple for WebGPU.

<room in general grimaces>

[...]

Jesse B (Unity): We care about HLSL

Eric B (Adobe): Creating a new high level language is a cardinal sin. Don’t. Do. That. Don’t want to rewrite all my shaders AGAIN.

Jesse B: If we can transcode to HLSL to whatever you need, great. If we can’t, we may not support your platform at all.

Eric B: Would really not like even to write another transcoder. If there’s an existing tool to get to an intermediate representation, that’s good. Would suggest SPIRV is an EXCELLENT existing intermediate representation.

Note the WSL language made by Apple which sparked that discussion is unrelated to the WGSL language they ended up shipping, but the sentiment that the ISV representatives just wanted them to use HLSL or SPIR-V stands.

>WSL

Ah, that explains part of why I couldn't find it. I was searching mainly for WGSL, with something like 'WEBGPU minutes "Unity" "HLSL" "WGSL"'. There was also WHLSL, also from Apple, at one point, but it was later dropped in favor of WSL.[0][1]

>A few months ago we discussed a proposal for a new shading language called Web High-Level Shading Language, and began implementation as a proof of concept. Since then, we’ve shifted our approach to this new language, which I will discuss a little later in this post.

>[...]

>Because of community feedback, our approach toward designing the language has evolved. Previously, we designed the language to be source-compatible with HLSL, but we realized this compatibility was not what the community was asking for. Many wanted the shading language to be as simple and as low-level as possible. That would make it an easier compile target for whichever language their development shop uses.

>[...]

>So, we decided to make the language more simple, low-level, and fast to compile, and renamed the language to Web Shading Language to match this pursuit.

The "we designed the language to be source-compatible with HLSL, but we realized this compatibility was not what the community was asking for" comment is funny because Unity's "We care about HLSL" comment seems to be directly against this.

In any case, this is really a disappointing move from Apple. Just another example of them ignoring developers – even large developers like Adobe and Unity – over completely petty disputes and severe NIH.

The craziest line in the post is probably "[WSL] would make it an easier compile target for whichever language their development shop uses." It's like they knew people wanted SPIR-V, but they wouldn't do it due to some petty legal drama that Apple invented, and then chose literally the worst of all worlds by making yet another compile target, instead of at least choosing the next best thing, which would be something compatible with HLSL.

[0] https://github.com/w3c/strategy/issues/153

[1] https://webkit.org/blog/9528/webgpu-and-wsl-in-safari/

> it's just an artifact of having to appease Apple during WebGPUs development

To appease Google, most likely. WebGPU is based on original work by Apple and Mozilla, who based it on Metal.

I doubt Apple would be against whatever Metal uses for its shader language.

The choice was between using or adapting SPIR-V, which is what basically everyone doing multi-platform development wanted, or using anything else and pissing everyone off by making them support another shader language. Apple stonewalled using SPIR-V or any other Khronos IP on unspecified legal grounds, so they effectively forced the decision to roll a new shader language. Post-hoc rationalizations were given (e.g. human-readable formats being more in the spirit of the web, despite WebAssembly already existing at that point), but the technical merits were irrelevant when one of the biggest stakeholders was never ever going to accept the alternative for non-technical reasons.

https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

> Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

I don't understand why people say things that are kind of trivial to disprove, but here's the document with the notes where Apple refuses to use SPIR-V.

https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

> MS: Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

Reading between the lines, it seems like Apple mainly doesn't want to implement SPIR-V because engaging with the "Khronos IP framework" would prevent them from suing other Khronos members over patent disputes.

WebGPU, like WebGL, is a decade behind the native APIs it is based on.

No one asked for a new Rust-like shading language that they have to rewrite their shaders in.

Also, contrary to FOSS circles, most studios don't really care about Web 3D, which is why streaming is such a thing for them.

There have been HLSL-to-SPIR-V compilers for several years now; this is Microsoft's own official compiler getting a SPIR-V backend as well.

Because WebGL, just like WebAssembly (with its hacky thread support and compilation issues), is a giant kludge.

WebGL still has the fundamental issue of not supporting anything resembling a modern OpenGL feature set (with "modern" meaning 2010s-era stuff like compute shaders and multi-draw indirect) even in theory. And in practice, macOS doesn't support WebGL2, meaning no multiple render targets (which are necessary for deferred rendering), so it's almost impossible to make a modernish game that runs well in a browser.

Imo the problem isn't that WebGPU/Wasm is a decade/X years behind, but that we cannot reliably expect a feature set that existed on typical mid-2000s PCs to work in the browser across all platforms (which is the whole point of the web).

It's almost like some fruit-based company is sabotaging the efforts to keep its walled garden.

Despite all the bending over backwards to keep the fruit company on board with WebGPU, they still haven't actually shipped their Metal backend in Safari, over a year after Chrome managed to ship DirectX, Metal and Vulkan backends simultaneously. Mozilla hasn't shipped WebGPU either, but their resources can hardly be compared to Apple's.

Honestly, Google is probably almost as guilty: Native Client was a great idea and sidestepped basically all the issues we are having now, but they killed it in favour of 'standard' APIs like Wasm that barely work for their intended purposes.

> macOS doesn't support WebGL2

WebGL2 has been fully supported in Safari for quite a while now. In fact it's using the same rendering backend as Chrome and Firefox (ANGLE), and AFAIK Google and Apple engineers worked together to create (or at least improve?) the ANGLE Metal backend and integrate ANGLE into Safari.

Safari supports WebGL2 since version 15 - unless you meant something else by macOS lacking support?

(I agree with your general point though.)

The native WebGPU libraries accept SPIR-V as input, and they offer libraries to convert WGSL to SPIR-V and back. I.e. WGSL is only needed when running WebGPU in browsers, but even there it can be code-generated from other shading languages by going through SPIR-V (but tbh, I actually like WGSL; it's simple and straightforward).
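For anyone who hasn't seen it, here's roughly what the syntax looks like; a minimal pass-through fragment shader (illustrative sketch only, names are made up):

```wgsl
// Pass-through fragment shader in WGSL (illustrative).
@fragment
fn fs_main(@location(0) color: vec4<f32>) -> @location(0) vec4<f32> {
    return color;
}
```
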
Except that the conversion to WGSL is a complete waste of compute resources, engineering effort and the time of everyone involved. WebGPU is a _web_ API after all, even if people realized the runtimes could be used outside the browser.

Converting your SPIR-V to WGSL just to convert it back to SPIR-V to feed it into a Vulkan driver, or running an entire language frontend just to emit DXIL or Metal IR. We learned 15 years ago that textual shader languages at the GPU API interface are a mistake, but we're forced to relearn the same mistakes because Apple wouldn't play ball. What a joke.

WGSL was a mistake, and hopefully they get rid of it. It negatively impacts WebGPU's adoption; at least it did for me. The syntax is one of the worst ever created, just horrible.

WGSL could be good for Khronos. It's a modern language with an actual specification. It's gaining users every day.

> Khronos already mentioned in a couple of conferences that there will be no further work improving GLSL

Unfortunately, HLSL isn’t an open standard like GLSL. Is it Khronos's intention to focus solely on SPIR-V moving forward, leaving the choice of higher-level shader languages up to application developers?

There's likely to be very little funding for GLSL moving forward, and I would expect no major spec updates ever again, but vendors will probably keep publishing extensions for new GPU features and fixing things up. GLSL still has a fairly large user base. Whether SPIR-V is going to be the only Khronos shading language (or whatever you want to call it) moving forward, that's hard to say. Nvidia is pushing for Slang as a Khronos standard at the moment. Not sure if anyone's biting.

Yes, they officially stated at Vulkanised and SIGGRAPH, among other places, that there is no budget for GLSL improvements, and also that they aren't programming language experts anyway.

It is up to the community to come up with an alternative, and the game development community is mostly HLSL.

Will this help games be more compatible with the Proton layer on Linux, or is this not related?

In theory, if DirectX games start passing shaders to the driver in SPIR-V, the same format Vulkan uses, then yes, it should make Proton's job easier. Translating the current DXIL format to SPIR-V is apparently non-trivial, to say the least:

https://themaister.net/blog/2021/09/05/my-personal-hell-of-t...

https://themaister.net/blog/2021/10/03/my-personal-hell-of-t...

https://themaister.net/blog/2021/11/07/my-personal-hell-of-t...

https://themaister.net/blog/2022/04/11/my-personal-hell-of-t...

https://themaister.net/blog/2022/04/24/my-personal-hell-of-t...

Maybe. Maybe not; it could well be an incompatible flavour of SPIR-V.

It's unlikely to diverge from the same general flavor as Vulkan. The worst parts of the DXIL-to-SPIR-V conversion I remember from that chain of blog posts are rebuilding structured control flow and how it interacts with atomics and wave convergence.

That's a problem that goes away irrespective of any DX extensions to SPIR-V for supporting the binding model DX uses.

I haven't used either in a while; what is missing from GLSL?

It's C-based, has no support for modular programming (everything needs to be a giant include), and no one is adding features to it since Khronos hasn't assigned any budget to it.

HLSL has evolved to be C++-like, including lightweight templates, mesh shaders and work graphs; it has module support via libraries and is continuously being improved with each DirectX release.
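For example, the templates added in HLSL 2021 allow generic helper functions that work across scalar and vector types; a small sketch (the function name and usage are illustrative, not from any particular codebase):

```hlsl
// Generic remap helper using HLSL 2021 templates (illustrative sketch).
// Arithmetic is element-wise, so one definition covers float, float2, etc.
template<typename T>
T Remap(T value, T inMin, T inMax, T outMin, T outMax)
{
    return outMin + (value - inMin) / (inMax - inMin) * (outMax - outMin);
}

// float  a = Remap(x,   0.0,       1.0,       -1.0,        1.0);
// float3 b = Remap(rgb, (float3)0, (float3)1, (float3)0.2, (float3)0.8);
```

In GLSL the equivalent would need a separate overload per type, typically stamped out with includes or macros.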

I'm not a fan of GLSL either, but adding C++-like baggage to shading languages, as HLSL and especially MSL (which is C++) do, is a massive mistake IMHO. I'd prefer WGSL over that sort of pointless language complexity any day.

Long term, shading languages will be a transitional phase, and most GPUs will turn into general-purpose compute devices, where we can write code like in the old days of software rendering, except it will be hardware accelerated anyway.

We already see this with rendering engines that use CUDA instead, as shown at Vulkanised sessions.

I do agree that, given how much C++ has grown, and the security issues, something else would be preferable; maybe NVidia has some luck with their Slang adoption proposal.

At some point you have to stop working in assembly and graduate to a high-level language and beyond.

Modern GPU stuff is getting too complex to be practical without higher language features.

From the POV of assembly, C and any other high-level language are basically the same. That doesn't mean that climbing even higher up the abstraction ladder is a good thing though (especially for performance).
