EXT_texture_sRGB_decode assumes S3TC sRGB formats introduced in EXT_texture_compression_s3tc #341
This was discussed in the OpenGL/ES joint working group meeting, and we agree with your understanding. Would you like to propose a PR to fix the extension?
Assuming PR == pull request: if needed I can do that, but I'd prefer not to; I'm not too familiar with how exactly to phrase it for the spec/extension.
No worries. How does this look: #344
The section heading in the PR names only one of the extensions, but inside it discusses the dependency on both exts. I think it rather needs to either name both extensions in the heading, or have two sections, one per extension, which both remove the formats individually.
What Apple platform was this, and what version of OpenGL or OpenGL ES did you see this issue on? The reason we ask is that GL_EXT_texture_sRGB was promoted to OpenGL core in version 2.1. Also, the way the GL_EXT_texture_sRGB_decode extension was written, it actually requires sRGB support: "OpenGL 2.1 or EXT_texture_sRGB required for OpenGL". In other words, you can't expose this extension at all if GL_EXT_texture_sRGB is not available. Which makes us wonder whether this isn't a spec bug, but rather a bug on that Apple platform, in that it exposes GL_EXT_texture_sRGB_decode without the required core version or extension. It also makes little sense to expose GL_EXT_texture_sRGB_decode without supporting sRGB textures.
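As a quick illustration of that requirement, here is a minimal sketch (not from this thread) of how an application could check that sRGB textures are actually available before relying on GL_EXT_texture_sRGB_decode. It assumes a current compatibility-profile context (where `glGetString(GL_EXTENSIONS)` is still legal); the helper names are made up for the example.

```c
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>   /* <OpenGL/gl.h> on macOS */

static int has_ext(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all && strstr(all, name) != NULL;   /* crude substring match */
}

static int srgb_decode_usable(void)
{
    int major = 0, minor = 0;
    /* Desktop GL_VERSION strings begin with "<major>.<minor>". */
    sscanf((const char *)glGetString(GL_VERSION), "%d.%d", &major, &minor);
    int core_srgb = (major > 2) || (major == 2 && minor >= 1);   /* sRGB textures in core since 2.1 */

    /* The decode extension is only meaningful when sRGB textures are available. */
    return (core_srgb || has_ext("GL_EXT_texture_sRGB"))
           && has_ext("GL_EXT_texture_sRGB_decode");
}
```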
It wasn't my Apple PC; someone sent me a list of the specific extensions exposed on his Mac with Intel/NVIDIA(?) graphics. I'll ask him which it was specifically, but it certainly was OpenGL 4+. It also matches this table: https://developer.apple.com/opengl/OpenGL-Capabilities-Tables.pdf where OpenGL versions 3.2 and above no longer expose the GL_EXT_texture_sRGB extension.
I think this might be the key issue here. Either EXT_texture_sRGB was modified after the OpenGL 2.1 release to include the sRGB S3TC formats, or it was never fully promoted to OpenGL core in 2.1 in the first place, because the S3TC sRGB formats that EXT_texture_sRGB defines when EXT_texture_compression_s3tc is present never made it into the OpenGL 2.1 spec. But now, in the scope of EXT_texture_sRGB_decode, "OpenGL 2.1 or EXT_texture_sRGB" end up not being equivalent requirements when it comes to the S3TC sRGB formats. Either way, I think it's an unfortunate oversight (not necessarily a bug) in the Apple OpenGL implementation not to expose EXT_texture_sRGB, because I would be really surprised if they didn't support the S3TC sRGB texture formats in general.
EXT_texture_compression_s3tc only introduces the non-sRGB S3TC formats, whereas the EXT_texture_sRGB extension actually introduces the COMPRESSED_SRGB_S3TC_DXT1_EXT, COMPRESSED_SRGB_ALPHA_S3TC_DXT1_EXT, COMPRESSED_SRGB_ALPHA_S3TC_DXT3_EXT, and COMPRESSED_SRGB_ALPHA_S3TC_DXT5_EXT formats (see the short usage sketch after this comment).
Now, inside the EXT_texture_sRGB_decode spec, the dependencies section ties the removal of these sRGB S3TC formats to EXT_texture_compression_s3tc.
If I understand this correctly, it needs to reference EXT_texture_sRGB instead (or in addition), since that is the extension that actually introduces them.
The Apple implementation seems to expose EXT_texture_sRGB_decode and EXT_texture_compression_s3tc, but not EXT_texture_sRGB.
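For illustration only, here is a minimal sketch of what using one of those sRGB S3TC formats looks like, together with the decode toggle from EXT_texture_sRGB_decode. The function name and parameters are invented for the example; the #defines are fallbacks carrying the enum values from the two extension specs in case the system headers lack them.

```c
#include <GL/gl.h>   /* <OpenGL/gl.h> on macOS; an extension loader such as glad also works */

#ifndef GL_COMPRESSED_SRGB_S3TC_DXT1_EXT
#define GL_COMPRESSED_SRGB_S3TC_DXT1_EXT  0x8C4C   /* from EXT_texture_sRGB */
#endif
#ifndef GL_TEXTURE_SRGB_DECODE_EXT
#define GL_TEXTURE_SRGB_DECODE_EXT        0x8A48   /* from EXT_texture_sRGB_decode */
#define GL_SKIP_DECODE_EXT                0x8A4A
#endif

/* Upload DXT1 block data as an sRGB texture, then opt out of sRGB-to-linear
 * conversion at sampling time via EXT_texture_sRGB_decode. */
void upload_srgb_dxt1(GLuint tex, GLsizei w, GLsizei h,
                      const void *blocks, GLsizei size)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_SRGB_S3TC_DXT1_EXT,   /* sRGB variant of DXT1 */
                           w, h, 0, size, blocks);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SRGB_DECODE_EXT, GL_SKIP_DECODE_EXT);
}
```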