Blend Factor Specification in Shaders #102366


Open · wants to merge 1 commit into master

Conversation

LunaticInAHat
Contributor

@LunaticInAHat LunaticInAHat commented Feb 3, 2025

This PR augments shaders to allow explicit specification of which blend factors should be used for the material, and adds a corresponding directive (blend_factors) to the shading language. The blend factors are expressed in terms already familiar from RenderingDevice, but lowercased to fit in with other identifiers in the shading language (e.g., src_alpha, one_minus_src_alpha, dst_color, etc.)
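
For illustration, the directive might be used like this in a material shader (a sketch based on the description above; the factor order — source color, destination color, source alpha, destination alpha — matches the examples later in this thread, and the surrounding shader is hypothetical, not taken from the PR):

```glsl
shader_type spatial;
render_mode unshaded;   // the blend_* render modes still choose the blend operation

// Equivalent to blend_mix: out_rgb = src_rgb * src_a + dst_rgb * (1 - src_a),
// with alpha blended as      out_a = src_a * 1 + dst_a * (1 - src_a).
blend_factors src_alpha, one_minus_src_alpha, one, one_minus_src_alpha;

void fragment() {
    ALBEDO = vec3(1.0, 0.5, 0.25);
    ALPHA = 0.5;
}
```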

The current paradigm for specifying blend modes (render_mode) provides only the handful of most commonly-used blend modes, but the underlying graphics APIs support far more: with 4 factor slots and 19 possible values in each position, there are 19^4 ≈ 130,000 combinations. Some applications want blend modes that are not currently accessible to shaders (e.g., emulation/recreation of old games).

Additionally, direct support for specifying blend modes will provide more flexibility when using premultiplied alpha. It would also allow users to force material opacity, for shaders which don't currently support blend_disabled.

Finally, this would bring our blending flexibility much closer to what is supported by other engines, which would make it easier for developers using those engines to transition their materials to Godot. This Unigine documentation of the feature gives some nice examples of effects that can be achieved by combining various blend factors.

The underlying blend factors are already supported in RenderingDevice; this PR just allows shaders to explicitly specify the quartet that they want.

This does not seek to replace or remove the existing blend mode specifications in render_mode for two reasons:

  • For basic usage, the existing blend_* specifications are perfectly sufficient and much more user-friendly
  • Users still need a way to express which blend equation / blend operation they want (ADD, REVERSE_SUBTRACT, MIN, MAX, etc.), and the blend_* specifications seem perfect for that

This feature is very straightforward to add to the RenderingDevice-based renderers, with minimal intrusion. Adding it to the Compatibility renderer is a bit more intrusive, but these blend modes have been supported since ancient, fixed-function OpenGL days, so there was no reason not to add them to the GLES3 renderer.

@milesturin

This PR fixes an important issue in my game that has no viable workaround. I'd love to see this in 4.5!

Member

@Calinou Calinou left a comment


Tested locally (rebased on top of master 6b5b84c), it works as expected.

Testing project: test_pr_102366.zip

advanced_blend_modes.mp4 (video)
[Screenshots: Forward+ / Mobile / Compatibility]

Some of the materials in the testing project look noticeably different when using Compatibility. I'm not sure if this is a bug or a limitation of what's available in Compatibility, so please check this.

Some feedback:

  • There's no autocompletion for blend_factors options, which impacts usability.
  • If you specify fewer than 4 arguments (or more than 4), you'll get a cryptic error message:

[screenshot: shader compiler error message]

Instead, a dedicated error should be shown to mention that blend_factors accepts exactly 4 arguments separated by commas.

@LunaticInAHat LunaticInAHat force-pushed the shader_blend_modes_upstream branch from ea74d67 to 41cec58 Compare March 25, 2025 13:13
@LunaticInAHat
Contributor Author

I have added a dedicated error message for syntax errors in blend_factors, and have fixed autocompletion.

As near as I can tell, the differences in visual results between the Compatibility renderer and the other renderers stem from differences in how the renderers handle the scene. In every case where there is a baseline to compare against, explicitly specifying blend factors results in the expected behavior. That is, explicitly specifying the factors for blend_mul gives identical visual results to using blend_mul, and so on for the other render modes.

To describe the differences that I believe I have identified:

First, there appear to be differences in how the renderers handle lighting & postprocessing on the scene. Setting all the shaders to render_mode unshaded and disabling tonemapping fixes a couple of the materials that looked different between renderers. It's difficult for me to trace exactly why the lighting gives such different results between the renderers, but it may simply be an artifact of the GLES3 backend not supporting HDR (as weird blend modes can often produce color components outside the 0-1 range).

Second, there appears to be some postprocessing step present in the Compatibility renderer (after the scene is drawn), which appears to alpha-blend the rendered image over something else. This can be observed by creating a material with blend_factors one, zero, zero, zero and setting ALPHA = 1.0. That should result in the object's color being rendered into the colorbuffer. The alpha channel will have a 0 in it, but RGB should be right. And that is indeed what the Forward+ and Mobile renderers give. The Compatibility renderer produces a flat gray object:
[screenshot: 102366_001]
I haven't been able to identify where the gray comes from; it's not the clear color. Change the blend factors to one, zero, one, zero and suddenly the object appears how it should:
[screenshot: 102366_003]
So there is something later on in the Compatibility pipeline that cares about the alpha channel value that ends up in the framebuffer. Maybe a blit that's copying out of the render target, into the visible framebuffer? Changing to blend_factors one, one, one, zero and allowing the texture's alpha channel to come through also suggests that some kind of alpha-blended postprocessing step is going on:
[screenshot: 102366_002]
Notice how, even though we are adding src color and dst color together, portions of the mesh which ended up with a low alpha value are darker than the destination color was. That can't be coming out of the blend step; something else has to be doing that later.
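
The arithmetic behind the first two observations can be sketched in a few lines (a simplified model of fixed-function blending, assuming the standard out = src·f_src + dst·f_dst equation and ignoring clamping and any later compositing; the helper names are mine, not engine API):

```python
# A simplified model of fixed-function blending: out = src * f_src + dst * f_dst,
# applied separately to RGB and alpha. Factor helpers take (src, dst) so that
# data-dependent factors such as src_alpha can be modeled the same way.
ONE = lambda s, d: 1.0
ZERO = lambda s, d: 0.0
SRC_ALPHA = lambda s, d: s[3]

def blend(src, dst, fs_c, fd_c, fs_a, fd_a):
    rgb = tuple(s * fs_c(src, dst) + d * fd_c(src, dst)
                for s, d in zip(src[:3], dst[:3]))
    a = src[3] * fs_a(src, dst) + dst[3] * fd_a(src, dst)
    return rgb + (a,)

src = (0.8, 0.4, 0.2, 1.0)   # fragment color, with ALPHA = 1.0 as in the test above
dst = (0.1, 0.1, 0.1, 1.0)   # framebuffer contents (Compatibility clears alpha to 1.0)

# blend_factors one, zero, zero, zero: RGB is the object's color, but framebuffer
# alpha ends up 0, so an alpha-aware blit later in the pipeline discards it.
print(blend(src, dst, ONE, ZERO, ZERO, ZERO))   # (0.8, 0.4, 0.2, 0.0)

# blend_factors one, zero, one, zero: identical RGB, but alpha survives as 1.0.
print(blend(src, dst, ONE, ZERO, ONE, ZERO))    # (0.8, 0.4, 0.2, 1.0)
```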

Finally (and I am not completely confident on this one yet) when clearing the screen / drawing the sky, the compatibility renderer initializes the alpha channel of the framebuffer to 1.0, while the other renderers appear to set it to 0. As a result, any blend factor that uses dst_color, dst_alpha, or src_alpha_saturate experiences different results between the renderers. one_minus_dst_alpha results in 0, because the destination alpha was already 1, etc.

@Calinou
Member

Calinou commented Mar 25, 2025

The Compatibility renderer produces a flat gray object:

It's rendering the editor background color (#313743, the same color as you see just above the 3D viewport). This is likely because the viewport is allowed to be transparent in Compatibility, but not in the other rendering methods.

@LunaticInAHat
Contributor Author

Ahh, good call. Then that explains why the final alpha value of the framebuffer is so important, and why it's initially set to 1.0 when it gets cleared.

That being the case, I would say that the PR is working as expected, as far as I can tell. The blending operation is producing the output color & alpha values it is directed to produce.

@LunaticInAHat
Contributor Author

Just checking in; are there outstanding issues here that I need to be resolving? The PR appears to be producing the right images in the framebuffer; there are just differences in how the framebuffer's alpha channel goes on to be treated, by the different renderers.

@LunaticInAHat LunaticInAHat force-pushed the shader_blend_modes_upstream branch from 41cec58 to 31cf95e Compare April 17, 2025 21:46
@LunaticInAHat
Contributor Author

Is there anything else that I need to address, with this PR?

@Calinou
Member

Calinou commented May 11, 2025

Is there anything else that I need to address, with this PR?

It's good on my end, but it needs a review from another rendering maintainer before it can be merged.

@beicause
Contributor

@LunaticInAHat
Contributor Author

I saw this supports BlendFactor; does it support BlendOperation?

No, this PR only deals with BlendFactor. Adding support for more BlendOperation modes is being at least partially addressed by other PRs (e.g., PR #48654).

@beicause
Contributor

#48654 just adds two render mode presets, not custom blend operations.

I wonder if it will work if we add blend_operations directly like the blend_factors.

@LunaticInAHat
Contributor Author

#48654 just adds two render mode presets, not custom blend operation.

Those "render modes" primarily just end up setting the blend equation. Once this PR and that one are merged, I think the only missing flexibility would be having different blend equations for RGB vs. alpha.

I wonder if it will work if we add blend_operations directly like the blend_factors.

Sure, it would work. I'm not going to try to cram that into this PR, but sure, a follow-on could allow direct specification of blend operations. That said, I think the motivation for doing so might be somewhat harder to demonstrate; I can't think of a time when I've ever seen a technique that relied on having different blend equations for RGB than Alpha. I'm sure they exist, but I couldn't point one out to justify why we should go changing the renderer.

@LunaticInAHat LunaticInAHat force-pushed the shader_blend_modes_upstream branch from 31cf95e to 0d8c58c Compare June 7, 2025 13:03
@LunaticInAHat
Contributor Author

Rebased to resolve merge conflicts

@LunaticInAHat LunaticInAHat force-pushed the shader_blend_modes_upstream branch from 0d8c58c to f25991f Compare June 18, 2025 21:11
@milesturin

Anything we can do to get this reviewed by a second rendering maintainer?

@clayjohn
Member

To be blunt, I don't see any path towards this being merged. It looks well done and LunaticInAHat has clearly done excellent work.

But exposing every blend operation runs totally counter to Godot's philosophy of keeping things simple. It exposes the entire range of blend modes from the underlying graphics APIs directly to users. There are thousands of possible blend modes, and fewer than a dozen are actually useful.

What we should be doing is exposing the useful blend modes directly that people need (like what we did recently with the premultiplied alpha blend mode). This keeps Godot shaders simple and manageable for users.

Users who know what they are doing and want to play around with the full range of options exposed by the graphics API can use the RenderingDevice API directly. But GDShaders should remain simple and easy to understand for the majority of users who don't need that complexity.

Our best practices make this quite clear as well
https://docs.godotengine.org/en/latest/contributing/development/best_practices_for_engine_contributors.html#cater-to-common-use-cases-leave-the-door-open-for-the-rare-ones

@LunaticInAHat
Contributor Author

To be blunt, I don't see any path towards this being merged. It looks well done and LunaticInAHat has clearly done excellent work.

Thank you, and thank you for being clear about your perspective. I'm not entirely surprised to hear this, but I have a few counterpoints I'd like to present:

But exposing every blend operation totally goes counter to Godot's philosophy of keeping things simple. It exposes the entire range of blend modes from the underlying graphics APIs directly to users. There are thousands of possible blend modes and fewer than a dozen are actually useful.

What we should be doing is exposing the useful blend modes directly that people need (like what we did recently with the premultiplied alpha blend mode). This keeps Godot shaders simple and manageable for users.

First, I disagree to a certain degree about how narrow the set of blend modes that "people need" is. When you are authoring new content and aren't trying to reproduce any particular aesthetic from the past, it's probably true that there are only a handful of blend modes that matter. But some users (including myself) use Godot to try to recreate old games, and there is an era of gaming in which artists used lots of weird blend modes. I am pretty confident that I (and those other users) would never succeed in convincing you to add specific support for those oddball modes, so I wanted to pursue this feature, because the alternative is not being able to faithfully recreate the look of those old games at all.

Second, we just added support for stencil-buffer operations to GDShaders (#80710). When we did, we didn't just implement support for a couple of usecases that we thought were most common. We exposed the stencil buffer -- operation, compare value, write mask. This PR is just doing the same thing for blend modes. I haven't removed any of the "canned" blend modes, and I'm not pushing in that direction; as I said in my PR description, I think the canned modes should serve the vast majority of needs. This PR adds a new capability, but if people can't grok it, they can just ignore it, just like the stencil stuff.

Users that know what they are doing and want to play around with the full range of options exposed by the graphics API can use the RenderingDevice API directly. But GDShaders should remain simple easy to understand for the majority of users who don't need that complexity.

Our best practices make this quite clear as well https://docs.godotengine.org/en/latest/contributing/development/best_practices_for_engine_contributors.html#cater-to-common-use-cases-leave-the-door-open-for-the-rare-ones

As far as I was able to tell when researching before implementing this PR, the door to rare use cases is actually pretty much shut in this area. Maybe you can point me at something I missed, but I was entirely unable to find a way to hook anything material- or shader-esque into the existing renderers in any way that would let me put pipeline attachments on it to control the blend modes. It looked very much like my only option for getting control over blending was to effectively reimplement the renderer, and that sets the bar way too high for something minor like blend modes, in my opinion. Maybe I misunderstood something, or maybe the docs aren't clear, but I tried going down the RenderingDevice avenue before I ever even considered writing this PR, and I felt like I just ran into a brick wall.

Finally, I'd like to make the point that I think this PR provides a useful and extensible framework for implementing additional blend modes in the future. Even if you disagree with my above points, and want me to drop the blend_factors keyword from the shading language, I think you should still consider merging the rest of the PR, because it gives you an easy, fairly-unified interface for communicating your precanned blend modes into the renderer.

That is, rather than having a render-time (in the case of the GLES3 renderer, at least) switch in the core of the renderer, which has to know about all of the various high-level blend modes, you can have all of the "this is the set of blend modes we support" stuff happening in the shader compiler, which then just plops blend factors into the materials, and the renderers just draw the materials.

Then, when you have to add another new blend mode in the future, rather than having to do work inside of each of the renderers, the work is limited to the shader compiler, expanding the set of blend factors that the user is capable of expressing to it (which it then places into the materials).

If you're interested in moving in that direction, I'd be happy to pull the blend_factor keyword out of this PR, and adjust the existing blend modes to use the "blend factor" interface for communicating to the renderer. Just getting the in-engine capability for shaders to freely control blending will reduce the size of the patch that I (and other users) would have to maintain, in order to support free user-specified blend modes, and I think it will reduce future maintenance overhead for the rest of the project.

@QbieShay
Contributor

QbieShay commented Jul 6, 2025

I want to first of all thank you for spending time supporting this feature. As one of the people who pushed premultiplied alpha forward, I understand the need for more of what Godot exposes.

However, this is not really the process we follow to include features in the engine.
We normally go through a proposal process, so that we can discuss the API and take time to talk about the needs that aren't currently met by the engine. In particular, we really need to spend time considering the consequences of exposing certain things to users, and how they interact with the broader engine.
Once an API is exposed, it cannot be taken back.

To expand on what clay said, we don't do speculative features, not even for the sake of getting feature parity with other engines.
While I understand this is a philosophy people may not share, this is what has kept Godot relatively small and somewhat maintainable and it's a core principle by which we develop.

Please @milesturin and/or @LunaticInAHat take time to detail what artistic approach you're currently unable to achieve due to lack of exposed blend functions, that's what may move this forward.

Second, we just added support for stencil-buffer operations to GDShaders (#80710). When we did, we didn't just implement support for a couple of usecases that we thought were most common. We exposed the stencil buffer -- operation, compare value, write mask.

This is not how stencil has been exposed. We didn't ship masks or incr/decr operations yet precisely because of this broader discussion.

If you're interested in moving in that direction, I'd be happy to pull the blend_factor keyword out of this PR, and adjust the existing blend modes to use the "blend factor" interface for communicating to the renderer. Just getting the in-engine capability for shaders to freely control blending will reduce the size of the patch that I (and other users) would have to maintain, in order to support free user-specified blend modes, and I think it will reduce future maintenance overhead for the rest of the project.

@clayjohn want to pitch in on this specifically?

@LunaticInAHat
Contributor Author

LunaticInAHat commented Jul 9, 2025

However, this is not really the process that we do to include features in the engine. We normally go through a proposal process so that we may discuss API and take time to talk about the needs that aren't currently met by the engine. In particular, we really need to spend time considering what are the consequences of exposing certain things to the users and how they interact with the broader engine. Once API is exposed, it cannot be taken back.

This PR is an implementation of Proposal 7058, which I had linked in the PR description, but perhaps I didn't make the relationship clear enough. The implementation differs in a couple of details from that proposal (choice of keyword, separate color vs. alpha factors), but that proposal was the inspiration for my work. Perhaps I should have commented on that proposal, documenting my reasons for believing these alterations were necessary. No one from the rendering team had weighed in on that proposal, so it didn't look like it was being paid attention to, and I figured it would stand a better shot at acceptance if there was real code backing it up. If your process is to carry on the rest of the discussion about the right API for the feature in that proposal, that's fine by me.

To expand on what clay said, we don't do speculative features, not even for the sake of getting feature parity with other engines. While I understand this is a philosophy people may not share, this is what has kept Godot relatively small and somewhat maintainable and it's a core principle by which we develop.

I would say that there isn't too much "speculation" about the need for more blend modes. It's been a capability of graphics cards that game engines have been exposing since the 90s, and there are plenty of users who want to use additional blend modes beyond what is currently available (like this proposal, plus the ones I already linked in my PR description). The usecases and exact blend modes that they want are often different, but the common theme of "we need to specify different blend modes" remains.

The API we use to expose this is plenty open to debate, sure. Is it a keyword in the shader? Is it a function we call on the Shader object? Is it specified on the Material? I'm happy to discuss all those possibilities; I'm married to the feature, not this specific implementation of it.

It can't be specified at the MeshInstance3D level or above, because there may be multiple surfaces with different blend modes, additional passes with different blend modes, etc.

My motivation for specifying it within the shader program text was roughly this:

  • It was what I saw in that proposal
  • Pixel blending interacts pretty intimately with the shader's output, and is reliant upon the render_mode for setting the blend equation / operator, so it makes sense to specify it in the same place

My motivation for allowing direct user specification of the four blend factors was largely because I was daunted by the prospect of trying to come up with descriptive, meaningful names that would fit in a render_mode declaration. See the list of blend modes I need to support, below, and I think you will understand.

Please @milesturin and/or @LunaticInAHat take time to detail what artistic approach you're currently unable to achieve due to lack of exposed blend functions, that's what may move this forward.

Sure. Understand that I am not an artist: I am a programmer, seeking to re-engineer classic games. At a high level, what I am currently trying to achieve (and am unable to, without finer control of blend factors), is to be able to render assets from World of Warcraft, and have them look right. WoW uses (at least) the following set of blend modes:

src_alpha, one_minus_src_alpha, one, zero                 // Similar to blend_mix, but note different blending of alphas
src_alpha, one_minus_src_alpha, one, one_minus_src_alpha  // blend_mix
src_alpha, one, zero, one                                 // Similar to blend_add, but note different blending of alphas
dst_color, zero, dst_alpha, zero                          // blend_mul
dst_color, src_color, dst_alpha, src_alpha
dst_color, one, dst_alpha, one                            // Similar to blend_mul, but note different blending of alphas
one_minus_src_alpha, one, one_minus_src_alpha, one
one_minus_src_alpha, zero, one_minus_src_alpha, zero
src_alpha, zero, src_alpha, zero
one, one, zero, one                                       // Similar to blend_add if material doesn't go down transparent path, but note different blending of alphas
constant_alpha, one_minus_constant_alpha, constant_alpha, one_minus_constant_alpha
one_minus_dst_color, one, one, zero
one, one_minus_src_alpha, one, one_minus_src_alpha

I can't justify or defend most of those modes. I don't know why they chose those specific modes. Quite a few of them are similar to existing modes, but differ in how they treat the alpha channel, so mis-blending only shows up when multiple translucent objects are overlapping, which makes it difficult to find and showcase examples for you.

But, to try to help you see where I'm coming from, here is an example of one of the more obvious modes in action: dst_color, src_color, dst_alpha, src_alpha is known as "Modulate2x", and it effectively doubles the result of a multiply blend. Here is an example asset which uses this blend mode (on the saddle, rump, and mane):
[screenshot: tpbm027]

Here is what it looks like, with a regular blend_mul:
[screenshot: tpbm028]

Notice how much darker the affected surfaces are.
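
The doubling follows directly from the factor arithmetic; a quick scalar check (a simplified model, ignoring clamping; example channel values are mine):

```python
# Modulate2x (dst_color, src_color): out = src * dst + dst * src = 2 * src * dst,
# versus plain multiply blend (blend_mul: dst_color, zero): out = src * dst.
src, dst = 0.5, 0.6   # arbitrary example values for one color channel

modulate2x = src * dst + dst * src   # dst_color factor on src, src_color factor on dst
multiply = src * dst                 # dst_color factor on src, zero factor on dst

print(modulate2x)   # 0.6 -- twice as bright...
print(multiply)     # 0.3 -- ...as the blend_mul result
```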

Second, we just added support for stencil-buffer operations to GDShaders (#80710). When we did, we didn't just implement support for a couple of usecases that we thought were most common. We exposed the stencil buffer -- operation, compare value, write mask.

This is not how stencil has been exposed. We didn't ship masks or incr/decr operations yet precisely because of this broader discussion.

A fair point. I would say that, despite the holdout of those features (at least for now), users are given significantly more nuanced control of the stencil buffer than they are of blend modes.


Successfully merging this pull request may close these issues.

Expose more OpenGL blend modes as shader render_mode
7 participants