I have read that GPU hardware has support for sRGB textures. That is, I can have an sRGB image in a texture, and the GPU will convert it to linear RGB when sampling. This should also handle linear interpolation between pixels, and mipmapping. However, I didn't know how to make this work, so I started this page to experiment.
Notice the banding! Even when the gradients look the same, the green/purple shows there’s banding. The raw+raw and srgb+srgb work best. In srgb+srgb my debug green/purple visualization shows a hint of banding but the output grays are exactly correct, so there may be some bug in my debug visualization.
1 Background#
For proper lighting and blending, including antialiasing, we need to perform calculations in linear rgb. This means that rgb=0.5 is half the brightness of rgb=1.0. But images and screens don’t use linear rgb. They typically use sRGB. Lots of software—including mine—incorrectly does calculations on the sRGB values and then ends up with the wrong results.
The ELI5 is that the rgb color values are roughly sqrt(brightness). If you want to blend A and B, ((sqrt(A) + sqrt(B)) / 2)^2 is not the same as (A + B) / 2! The blended brightness will be too low.
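To make this concrete, here's a tiny standalone sketch blending a white pixel (A = 1.0) with a black pixel (B = 0.0) using the rough sqrt approximation:

```python
import math

A, B = 1.0, 0.0   # brightness of a white pixel and a black pixel

# Wrong: average the (roughly sqrt-encoded) color values, then decode
wrong = ((math.sqrt(A) + math.sqrt(B)) / 2) ** 2   # 0.25: too dark
# Right: average the linear brightnesses
right = (A + B) / 2                                # 0.5
print(wrong, right)
```

A 50/50 blend of white and black should be half brightness, but the naive blend comes out at a quarter of the brightness instead.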
To fix this, our shaders should work in linear rgb space. We need to convert srgb to linear when reading images and then convert linear to srgb when writing to the screen.
(Note: if you convert sRGB to linear in the shader, texture interpolation will be wrong.)
2 Input#
When reading a texture we need to convert it from sRGB to linear rgb. There’s a “transfer function” formula on wikipedia[1], and it can be expressed in (Python) code:
```python
X: float = 0.04045
Φ: float = 12.92
A: float = 0.055
Γ: float = 2.4

def srgb_to_linear(value: float) -> float:
    if value < X:
        return value / Φ
    else:
        return pow((value + A) / (1 + A), Γ)

C = srgb_to_linear(X)

def linear_to_srgb(value: float) -> float:
    if value < C:
        return value * Φ
    else:
        return (1 + A) * pow(value, 1.0 / Γ) - A

# Example: convert one sRGB color (172, 57, 57) to linear rgb
print(list(map(srgb_to_linear, [172/255.0, 57/255.0, 57/255.0])))
```
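As a quick sanity check of these formulas (a standalone sketch, with the constants restated in ASCII so it runs on its own): sRGB mid-gray 0.5 is only about 21% of full linear brightness, and half brightness encodes to a fairly light sRGB value.

```python
# Standalone restatement of the transfer functions above
X, PHI, A, GAMMA = 0.04045, 12.92, 0.055, 2.4

def srgb_to_linear(value):
    return value / PHI if value < X else ((value + A) / (1 + A)) ** GAMMA

def linear_to_srgb(value):
    C = srgb_to_linear(X)
    return value * PHI if value < C else (1 + A) * value ** (1.0 / GAMMA) - A

print(srgb_to_linear(0.5))   # ≈ 0.214 — sRGB mid-gray is much darker than half brightness
print(linear_to_srgb(0.5))   # ≈ 0.735 — half brightness looks light in sRGB
```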
However, for around 20 years now, GPUs have supported this directly in hardware, and DirectX standardized it in 2006. WebGL 1 supports it too, as the EXT_sRGB[2] extension, and WebGL 2 supports it without an extension. So I’d rather use the built-in hardware if I can.
When setting the texture data using gl.texImage2D(), the internal format is usually gl.RGBA. Set it to ext.SRGB_ALPHA_EXT to have the r, g, b channels go through the sRGB to linear transfer function.
On this page I tested loading the gradient texture with gl.RGBA (rows 1, 3) and ext.SRGB_ALPHA_EXT (rows 2, 4).
3 Output#
The output needs to be sRGB. We can do that by using the linear_to_srgb() function I posted above. However, GPUs have supported this in hardware too.
I was unable to find a way to access this in WebGL 1. And it looks like Three.js concluded the same[3].
In WebGL 1 I can write to a framebuffer backed by an SRGB texture. However, when I try to draw from that framebuffer, it converts back to linear. For this test I ended up calling linear_to_srgb() in the shader.
In WebGL 2 I might be able to use gl.blitFramebuffer() to copy the SRGB texture to the screen without converting back to linear. I need to test this. I'd have to create another framebuffer with the READ framebuffer set to the texture and the DRAW framebuffer set to null.
4 Precision#
The next problem is that any intermediate steps that save color data in a framebuffer should be using linear rgb in those framebuffers. But if we store linear rgb values in the framebuffer, we lose precision[4]. So we don’t want to actually store linear rgb in intermediate textures. Let’s test this with 0–255 inputs:
```python
from index import srgb_to_linear, linear_to_srgb

def test_roundtrip(input):
    intermediate = round(255.0 * srgb_to_linear(input / 255.0))
    output = round(255.0 * linear_to_srgb(intermediate / 255.0))
    if output != input:
        return f"<b>{output:3}</b>"
    else:
        return f"<i>{output:3}</i>"

print('<div class="roundtrip-results">'
      + ''.join([test_roundtrip(input) for input in range(256)])
      + '</div>')
```
“When you do things right, people won’t be sure you’ve done anything at all.” —Bender on Futurama
We can see here that if we start with 0–255 sRGB inputs and store them in linear format, rounded to 0–255, the low values lose a lot of precision. This corresponds to row 2 in the graphical test at the top of the page.
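To quantify that low-end loss, here's a small standalone sketch (the transfer function is restated so it runs on its own) counting how many distinct 8-bit linear values the 32 darkest sRGB codes map to:

```python
X, PHI, A, GAMMA = 0.04045, 12.92, 0.055, 2.4

def srgb_to_linear(value):
    return value / PHI if value < X else ((value + A) / (1 + A)) ** GAMMA

# The 32 darkest sRGB codes collapse into just a few 8-bit linear codes,
# so the differences between most of them are lost.
linear_codes = {round(255.0 * srgb_to_linear(code / 255.0)) for code in range(32)}
print(sorted(linear_codes))   # only 4 distinct values: [0, 1, 2, 3]
```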
I also wanted to test the other way around:
```python
from index import srgb_to_linear, linear_to_srgb

def test_roundtrip(input):
    intermediate = round(255.0 * linear_to_srgb(input / 255.0))
    output = round(255.0 * srgb_to_linear(intermediate / 255.0))
    if output != input:
        return f"<b>{output:3}</b>"
    else:
        return f"<i>{output:3}</i>"

print('<div class="roundtrip-results">'
      + ''.join([test_roundtrip(input) for input in range(256)])
      + '</div>')
```
Here, we lose lots of precision at high values. This corresponds to row 3 in the graphical test at the top of the page.
One solution is to store intermediate values with 16 bits per channel instead of 8, maybe with the gl.RGBA16F format from EXT_color_buffer_float?
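A rough way to see why more bits help: simulate the round trip through a linear buffer quantized to a given bit depth. (This is only a sketch: gl.RGBA16F is floating point, not the fixed-point quantization modeled here, so the real format keeps even more precision near zero. The transfer functions are restated so the sketch runs on its own.)

```python
X, PHI, A, GAMMA = 0.04045, 12.92, 0.055, 2.4

def srgb_to_linear(value):
    return value / PHI if value < X else ((value + A) / (1 + A)) ** GAMMA

def linear_to_srgb(value):
    C = srgb_to_linear(X)
    return value * PHI if value < C else (1 + A) * value ** (1.0 / GAMMA) - A

def survivors(bits):
    # How many of the 256 sRGB codes survive a round trip through a
    # linear buffer quantized to the given fixed-point bit depth?
    levels = (1 << bits) - 1
    count = 0
    for code in range(256):
        stored = round(levels * srgb_to_linear(code / 255.0)) / levels
        if round(255.0 * linear_to_srgb(stored)) == code:
            count += 1
    return count

print(survivors(8))    # many codes lost, mostly at the dark end
print(survivors(16))   # all 256 codes survive
```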
Another solution is to store sRGB in the intermediate textures even though the shaders are outputting linear rgb values. This corresponds to row 4 in the graphical test. That's what happens if we make our intermediate framebuffers use sRGB textures.
In the test at the top of the page, rows 1, 2 use gl.RGBA framebuffer textures, and rows 3, 4 use ext.SRGB_ALPHA_EXT textures. The results match what I was hoping for: using no conversion (row 1) or both conversions (row 4) preserves precision. Using one conversion but not the other (rows 2, 3) loses precision.
5 WebGL 2#
I was trying to get the sRGB framebuffer to output to the screen without it getting converted back to linear. I tried using WebGL 2's gl.blitFramebuffer() (top half) and gl.copyTexImage2D() (bottom half), but no luck with either. Very possibly my code has a bug. I wasn't able to find much information about this online.
6 Appendix: monitor calibration#
While trying to understand srgb, I ran across this image on wikipedia, and it did not look like it was supposed to:
https://en.wikipedia.org/wiki/SRGB#/media/File:Srgbnonlinearity.png
In each of the three columns, the sides and the center should look like the same brightness if you unfocus your eyes. Mine did not. I realized that I have my monitor set to a scaling mode, which is probably affecting all of my font rendering experiments. Even integer scaling modes may show this problem. Oops.
7 Conclusion#
Images are in sRGB mode. For textures that represent images, set the texture format to ext.SRGB_ALPHA_EXT to have the GPU translate them into linear rgb. Then run the shader pipeline in linear rgb, using ext.SRGB_ALPHA_EXT for intermediate framebuffers. Then convert linear rgb back into srgb at the very end, using a shader function (still looking for a way to do this in hardware though).
- WebGL 1 code: srgb-webgl-1.js
- WebGL 2 code: srgb-webgl-2.js
More reading:
https://entropymine.com/imageworsener/srgbformula/[5] - srgb standard rounded off some numbers
https://en.wikipedia.org/wiki/Transfer_functions_in_imaging[6] - TVs and monitors use different gamma curves(!)