
glamor: Don't forbid GLES contexts #1853

Open
stefan11111 wants to merge 1 commit into X11Libre:master from stefan11111:glamor-on-gles

Conversation

stefan11111 (Contributor) commented Jan 24, 2026

If this ever causes problems, we can fix this code then.

The "error case" that this code is trying to prevent happens even when using GL contexts.

From my testing, all this code does is forbid GLES contexts, which work just fine without this filter.

See: https://gitlab.freedesktop.org/xorg/xserver/-/commit/8702c938b33b9ec180d64754eb922515c7c4a98b for the commit that introduced this code.
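
For context, a condensed sketch of the check being removed, from glamor_add_format() in glamor/glamor.c (simplified here, not the verbatim code):

/* On GLES only, glamor probes the implementation's preferred readback
 * format for a 1x1 test texture/FBO and compares it against the format
 * it intends to use; any mismatch marks the depth texture-only, i.e.
 * software fallback for readback. */
if (glamor_priv->is_gles) {
    GLint read_format, read_type;

    /* ... set up a 1x1 test texture and framebuffer for this format ... */
    glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_FORMAT, &read_format);
    glGetIntegerv(GL_IMPLEMENTATION_COLOR_READ_TYPE, &read_type);

    if (format != (GLenum) read_format || type != (GLenum) read_type) {
        ErrorF("glamor: Implementation returned 0x%x/0x%x read format/type "
               "for depth %d, expected 0x%x/0x%x.  "
               "Falling back to software.\n",
               read_format, read_type, depth, format, type);
        texture_only = TRUE;
    }
}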

stefan11111 (Contributor, Author)

@notbabaisyou ping

If this ever causes problems, we can fix this code then.

The "error case" that this code is trying to prevent happens
even when using GL contexts.

From my testing, all this code does is forbid GLES contexts,
which work just fine without this filter.

See: https://gitlab.freedesktop.org/xorg/xserver/-/commit/8702c938b33b9ec180d64754eb922515c7c4a98b
for the commit that introduced this code.

Signed-off-by: stefan11111 <stefan11111@shitposting.expert>
notbabaisyou (Contributor)

Which error case was being triggered on a GLES context?

And what hardware was this on?

stefan11111 (Contributor, Author)

> Which error case was being triggered on a GLES context?
>
> And what hardware was this on?

There is hardly any info about this in the commit that added this check.
Apparently it was considered good to enforce an exact match between the requested and implementation read formats on GLES.
I don't see any reasoning given for why this should be done on GLES only, as the same format mismatch that is forbidden on GLES is allowed, and does happen, with GL contexts.

notbabaisyou (Contributor)

I was rather asking which error you were getting printed in that code block...

stefan11111 (Contributor, Author) commented Jan 25, 2026

> I was rather asking which error you were getting printed in that code block...

This is with the modesetting driver, with a GLES context forced:

$ xinit


XLibre X Server 1.25.1
X Protocol Version 11, Revision 0
Current Operating System: Linux GentooPC 6.16.0-gentoo #7 SMP Sat Sep 20 14:26:24 EEST 2025 x86_64
Kernel command line: BOOT_IMAGE=/vmlinuz-6.16.0-gentoo root=PARTUUID=bf4f2899-01af-f748-ba8d-72b44b1ed0eb ro nvidia_drm.modeset=1
Current version of pixman: 0.46.4
Markers: (--) probed, (**) from config file, (==) default setting,
        (++) from command line, (!!) notice, (II) informational,
        (WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(==) Log file: "/var/log/Xorg.0.log", Time: Sun Jan 25 21:22:12 2026
(==) Using config file: "/etc/X11/xorg.conf"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
GBM Loader: dlopen(/usr/lib64/gbm/nvidia-drm_gbm.so)
glamor: Implementation returned 0x1908/0x1401 read format/type for depth 24, expected 0x80e1/0x1401.  Falling back to software.
glamor: Implementation returned 0x1908/0x1401 read format/type for depth 32, expected 0x80e1/0x1401.  Falling back to software.
(EE) glamor0: GL error: GL_INVALID_OPERATION error generated.

GLES contexts work fine with this PR, and regular GL contexts also produce such mismatch errors if I patch the check so that regular GL contexts are checked too.

Btw, glamor's software fallback is broken.
Pages of this backtrace get printed to the terminal, and all drawing is invisible; only the cursor is visible.

Backtrace:
0: X (xorg_backtrace+0x40) [0x55845d9877c0]
1: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x750ed5) [0x7f692ab0ced5]
2: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x24fb13) [0x7f692a60bb13]
3: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x25f8ad) [0x7f692a61b8ad]
4: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x802f18) [0x7f692abbef18]
5: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x80398f) [0x7f692abbf98f]
6: /usr/lib64/libnvidia-eglcore.so.580.95.05 (0x7f692a3bc000+0x8177d8) [0x7f692abd37d8]
7: /usr/lib64/xorg/modules/xlibre-25/libglamoregl.so (0x7f692c782000+0x2a6e5) [0x7f692c7ac6e5]
8: /usr/lib64/xorg/modules/xlibre-25/libglamoregl.so (0x7f692c782000+0x161ed) [0x7f692c7981ed]
9: /usr/lib64/xorg/modules/xlibre-25/libglamoregl.so (0x7f692c782000+0x25f22) [0x7f692c7a7f22]
10: /usr/lib64/xorg/modules/xlibre-25/libglamoregl.so (0x7f692c782000+0x35c47) [0x7f692c7b7c47]
11: X (0x55845d7c9000+0xf8876) [0x55845d8c1876]
12: X (0x55845d7c9000+0x1082fb) [0x55845d8d12fb]
13: X (0x55845d7c9000+0x6bc06) [0x55845d834c06]
14: X (0x55845d7c9000+0x1baa2) [0x55845d7e4aa2]
15: /lib64/libc.so.6 (0x7f692cd74000+0x27037) [0x7f692cd9b037]
16: /lib64/libc.so.6 (__libc_start_main+0x87) [0x7f692cd9b107]
17: X (_start+0x21) [0x55845d7e6b31]

(EE) glamor0: GL error: GL_INVALID_OPERATION error generated.

notbabaisyou (Contributor)

This seems to be an NVIDIA-specific issue, as they are returning GL_RGBA for GL_IMPLEMENTATION_COLOR_READ_FORMAT instead of what GLAMOR expects, which is GL_BGRA.

You might just need to add a workaround that uses the GL format path instead.

stefan11111 (Contributor, Author)

> This seems to be an NVIDIA-specific issue, as they are returning GL_RGBA for GL_IMPLEMENTATION_COLOR_READ_FORMAT instead of what GLAMOR expects, which is GL_BGRA.
>
> You might just need to add a workaround that uses the GL format path instead.

You mean something like this?

diff --git a/glamor/glamor.c b/glamor/glamor.c
index a1a9348bf..248c8f987 100644
--- a/glamor/glamor.c
+++ b/glamor/glamor.c
@@ -448,14 +448,15 @@ glamor_format_for_pixmap(PixmapPtr pixmap)
         return &glamor_priv->formats[pixmap->drawable.depth];
 }
 
-static void
-glamor_add_format(ScreenPtr screen, int depth, CARD32 render_format,
-                  GLenum internalformat, GLenum format, GLenum type,
-                  Bool rendering_supported)
+static Bool
+glamor_add_format2(ScreenPtr screen, int depth, CARD32 render_format,
+                   GLenum internalformat, GLenum format, GLenum type,
+                   Bool rendering_supported, Bool dry_run)
 {
     glamor_screen_private *glamor_priv = glamor_get_screen_private(screen);
     struct glamor_format *f = &glamor_priv->formats[depth];
     Bool texture_only = FALSE;
+    Bool ret = TRUE;
 
     /* If we're trying to run on GLES, make sure that we get the read
      * formats that we're expecting, since glamor_transfer relies on
@@ -480,10 +481,12 @@ glamor_add_format(ScreenPtr screen, int depth, CARD32 render_format,
                      format, type, NULL);
         if (glGetError() != GL_NO_ERROR)
         {
-            ErrorF("glamor: Cannot upload texture for depth %d.  "
-                   "Falling back to software.\n", depth);
+            if (!dry_run) {
+                ErrorF("glamor: Cannot upload texture for depth %d.  "
+                       "Falling back to software.\n", depth);
+            }
             glDeleteTextures(1, &tex);
-            return;
+            return FALSE;
         }
 
         glGenFramebuffers(1, &fbo);
@@ -492,11 +495,14 @@ glamor_add_format(ScreenPtr screen, int depth, CARD32 render_format,
                                GL_TEXTURE_2D, tex, 0);
         status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
         if (status != GL_FRAMEBUFFER_COMPLETE) {
-            ErrorF("glamor: Test fbo for depth %d incomplete.  "
-                   "Falling back to software.\n", depth);
+            if (!dry_run) {
+                ErrorF("glamor: Test fbo for depth %d incomplete.  "
+                       "Falling back to software.\n", depth);
+            }
             glDeleteTextures(1, &tex);
             glDeleteFramebuffers(1, &fbo);
             texture_only = TRUE;
+            ret = FALSE;
         }
 
         if (!texture_only) {
@@ -508,23 +514,41 @@ glamor_add_format(ScreenPtr screen, int depth, CARD32 render_format,
         glDeleteFramebuffers(1, &fbo);
 
         if (!texture_only && (format != read_format || type != read_type)) {
-            ErrorF("glamor: Implementation returned 0x%x/0x%x read format/type "
-                   "for depth %d, expected 0x%x/0x%x.  "
-                   "Falling back to software.\n",
-                   read_format, read_type, depth, format, type);
+            if (!dry_run) {
+                ErrorF("glamor: Implementation returned 0x%x/0x%x read format/type "
+                       "for depth %d, expected 0x%x/0x%x.  "
+                       "Falling back to software.\n",
+                       read_format, read_type, depth, format, type);
+            }
             texture_only = TRUE;
+            ret = FALSE;
         }
     }
 
-    f->depth = depth;
-    f->render_format = render_format;
-    f->internalformat = internalformat;
-    f->format = format;
-    f->type = type;
-    f->rendering_supported = rendering_supported;
-    f->texture_only = texture_only;
+    if (!dry_run) {
+        f->depth = depth;
+        f->render_format = render_format;
+        f->internalformat = internalformat;
+        f->format = format;
+        f->type = type;
+        f->rendering_supported = rendering_supported;
+        f->texture_only = texture_only;
+    }
+
+    return ret;
 }
 
+static void
+glamor_add_format(ScreenPtr screen, int depth, CARD32 render_format,
+                  GLenum internalformat, GLenum format, GLenum type,
+                  Bool rendering_supported)
+{
+    glamor_add_format2(screen, depth, render_format,
+                       internalformat, format, type,
+                       rendering_supported, FALSE);
+}
+
+
 /* Set up the GL format/types that glamor will use for the various depths
  *
  * X11's pixel data doesn't have channels, but to store our data in GL
@@ -586,11 +610,34 @@ glamor_setup_formats(ScreenPtr screen)
                       GL_RGB, GL_RGB, GL_UNSIGNED_SHORT_5_6_5, TRUE);
 
     if (glamor_priv->is_gles) {
+        /**
+         * We don't really care if we're using GL_BGRA or GL_RGBA,
+         * we just care that the internalformat and format match.
+         */
+        Bool ret;
         assert(X_BYTE_ORDER == X_LITTLE_ENDIAN);
-        glamor_add_format(screen, 24, PIXMAN_x8r8g8b8,
-                          GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE, TRUE);
-        glamor_add_format(screen, 32, PIXMAN_a8r8g8b8,
-                          GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE, TRUE);
+        ret = glamor_add_format2(screen, 24, PIXMAN_x8r8g8b8,
+                                 GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE,
+                                 TRUE, TRUE);
+        if (ret) {
+            glamor_add_format(screen, 24, PIXMAN_x8r8g8b8,
+                              GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE, TRUE);
+        } else {
+            glamor_add_format(screen, 24, PIXMAN_x8r8g8b8,
+                              GL_RGBA, GL_RGBA, GL_UNSIGNED_BYTE, TRUE);
+        }
+
+        ret = glamor_add_format2(screen, 32, PIXMAN_a8r8g8b8,
+                                 GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE,
+                                 TRUE, TRUE);
+        if (ret) {
+            glamor_add_format(screen, 32, PIXMAN_a8r8g8b8,
+                              GL_BGRA, GL_BGRA, GL_UNSIGNED_BYTE, TRUE);
+
+        } else {
+            glamor_add_format(screen, 32, PIXMAN_a8r8g8b8,
+                              GL_RGBA, GL_RGBA, GL_UNSIGNED_BYTE, TRUE);
+        }
     } else {
         glamor_add_format(screen, 24, PIXMAN_x8r8g8b8,
                           GL_RGBA, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, TRUE);

This also works, but it seems like it would just be adding hacks.
We only require that the format matches exactly on GLES, while we allow mismatches on regular GL contexts.
These checks look to me like we're needlessly differentiating between GLES and GL.

notbabaisyou (Contributor) commented Jan 26, 2026

Had something like notbabaisyou@2a5e17a in mind.

Could also have the code just flip the expected format for NVIDIA only on the GLES path like:
GLint expected_format_es = glamor_priv->is_nvidia ? GL_RGBA : GL_BGRA
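
Roughly, that could slot into glamor_setup_formats() like this (sketch only; is_nvidia is a hypothetical new glamor_screen_private field, set elsewhere from GL_VENDOR):

/* Sketch: flip the expected GLES read format for the NVIDIA driver.
 * is_nvidia is hypothetical, not an existing glamor field. */
GLenum expected_format_es = glamor_priv->is_nvidia ? GL_RGBA : GL_BGRA;

glamor_add_format(screen, 24, PIXMAN_x8r8g8b8,
                  expected_format_es, expected_format_es,
                  GL_UNSIGNED_BYTE, TRUE);
glamor_add_format(screen, 32, PIXMAN_a8r8g8b8,
                  expected_format_es, expected_format_es,
                  GL_UNSIGNED_BYTE, TRUE);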

stefan11111 (Contributor, Author)

> Had something like notbabaisyou@2a5e17a in mind.
>
> Could also have the code just flip the expected format for NVIDIA only on the GLES path like: GLint expected_format_es = glamor_priv->is_nvidia ? GL_RGBA : GL_BGRA

That would be a vendor-specific hack, and would only work with the xfree86 X server.

notbabaisyou (Contributor)

> That would be a vendor-specific hack

Yes, it would be; GLAMOR already has vendor-specific hacks.

> and would only work with the xfree86 X server.

Huh? What do you mean by this?

stefan11111 (Contributor, Author) commented Jan 26, 2026

> and would only work with the xfree86 X server.
>
> Huh? What do you mean by this?

The "main" X server, the one at hw/xfree86. Not sure what to call it.
The nvidia check is done in glamor/glamor_egl.c, which is only used by that X server.
It would not work in Xfbdev, for example.

notbabaisyou (Contributor)

Ah, I see what you mean. You could just have the GL_VENDOR check in glamor_init, which is where the use_quads hack is already located.
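
For instance, a minimal sketch of such a check in glamor_init() (is_nvidia again being a hypothetical new field; assumes <string.h> is available, as it is in the server):

/* Sketch: detect the NVIDIA driver from GL_VENDOR, near the existing
 * use_quads vendor quirk.  is_nvidia is a hypothetical new
 * glamor_screen_private field. */
const char *gl_vendor = (const char *) glGetString(GL_VENDOR);
glamor_priv->is_nvidia = gl_vendor && strstr(gl_vendor, "NVIDIA") != NULL;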
