
WebGPURenderer: Align integer attribute check of WebGL backend. #28918

Merged
2 commits merged on Jul 19, 2024

Conversation

Mugen87 (Collaborator) commented Jul 19, 2024

Fixed #28898.

Description

This PR aligns the WebGL backend of WebGPURenderer with WebGLRenderer when deciding whether buffer data are integer or not.

Buffer data are treated as integer input only when the type is gl.INT or gl.UNSIGNED_INT, or when attribute.gpuType is set to IntType:

const integer = ( type === gl.INT || type === gl.UNSIGNED_INT || geometryAttribute.gpuType === IntType );
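For context, this flag decides which pointer setup call is issued when the attribute is bound. A minimal sketch of how such a branch typically looks in a WebGL 2 context (illustrative only, not the actual backend code):

```js
// Sketch only (not the actual backend code): how an `integer` flag would be
// consumed when binding a vertex attribute in a WebGL 2 context.
function setupVertexAttribute( gl, location, size, type, normalized, stride, offset, integer ) {

	if ( integer === true ) {

		// Integer path: the shader can declare the input as int/uint/ivec*/uvec*.
		gl.vertexAttribIPointer( location, size, type, stride, offset );

	} else {

		// Float path: the data is converted (and optionally normalized) to float.
		gl.vertexAttribPointer( location, size, type, normalized, stride, offset );

	}

	gl.enableVertexAttribArray( location );

}
```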

Mugen87 added this to the r167 milestone on Jul 19, 2024

github-actions bot commented Jul 19, 2024

📦 Bundle size

Full ESM build, minified and gzipped.

Filesize dev: 683.7 kB (169.3 kB)
Filesize PR: 683.7 kB (169.3 kB)
Diff: +0 B

🌳 Bundle size after tree-shaking

Minimal build including a renderer, camera, empty scene, and dependencies.

Filesize dev: 460.9 kB (111.2 kB)
Filesize PR: 460.9 kB (111.2 kB)
Diff: +0 B

Mugen87 changed the title from WebGLAttributeUtils: Fix integer check. to WebGPURenderer: Align integer attribute check of WebGL backend. on Jul 19, 2024
Mugen87 merged commit b09742f into mrdoob:dev on Jul 19, 2024
12 checks passed
Mugen87 (Collaborator, Author) commented Jul 19, 2024

const integer = ( type === gl.INT || type === gl.UNSIGNED_INT || geometryAttribute.gpuType === IntType );

Um, I think this policy can be changed. With WebGL 2, signed and unsigned int32 are also valid inputs for float attributes in the shader. So maybe it's best to just change the line to:

const integer = ( geometryAttribute.gpuType === IntType );
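With that change, integer shader inputs would become an explicit opt-in. As a rough usage sketch (attribute name and data are hypothetical), a user would request an integer vertex input like this:

```js
import { BufferGeometry, Int32BufferAttribute, IntType } from 'three';

// Hypothetical example: one triangle with an integer id per vertex.
const geometry = new BufferGeometry();
const ids = new Int32BufferAttribute( new Int32Array( [ 0, 1, 2 ] ), 1 );

// Without this line the data would reach the shader as float; setting gpuType
// to IntType requests an integer vertex input (int/ivec*) instead.
ids.gpuType = IntType;

geometry.setAttribute( 'id', ids );
```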

@@ -476,7 +476,7 @@ ${ flowData.code }

  const array = dataAttribute.array;

- if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Uint16Array || array instanceof Int16Array ) === false ) {
+ if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Int16Array ) === false ) {

RenaudRohlinger (Collaborator) commented Jul 24, 2024

This just breaks Uint16Array support in the WebGL backend and now generates this kind of error:

[.WebGL-0x13000c4ea00] GL_INVALID_OPERATION: Vertex shader input type does not match the type of the bound vertex attribute.

Firefox:
WebGL warning: drawElementsInstanced: Vertex attrib 1 requires data of type INT, but is being supplied with type FLOAT.

Are you sure about this one? /cc @Mugen87

Mugen87 (Collaborator, Author) commented Jul 24, 2024

This change was required to make compressed models work with the WebGL backend, so I would say the previous approach wasn't right.

Do you mind demonstrating with a fiddle how the error occurs?

Ideally, the GLSL builder should only generate ivec*/uvec* if the gpuType is IntType or when Uint32Array or Int32Array is used. In all other cases, the shader type should be float.
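Expressed as code, the intended mapping would look roughly like this (a hypothetical helper for illustration, not the actual GLSLNodeBuilder implementation):

```js
import { IntType } from 'three';

// Sketch: derive the GLSL vertex input type from a BufferAttribute.
// Only Uint32Array/Int32Array data or an explicit gpuType of IntType should
// produce integer inputs; Uint16Array, Int16Array, Float32Array, … stay float.
function glslAttributeType( attribute ) {

	const array = attribute.array;
	const size = attribute.itemSize;

	let prefix = ''; // '' -> float, 'i' -> signed int, 'u' -> unsigned int

	if ( array instanceof Uint32Array ) prefix = 'u';
	else if ( array instanceof Int32Array || attribute.gpuType === IntType ) prefix = 'i';

	if ( size === 1 ) return { '': 'float', i: 'int', u: 'uint' }[ prefix ];

	return prefix + 'vec' + size; // e.g. vec3, ivec4, uvec2

}
```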

Related: #28920 (comment)

Development

Successfully merging this pull request may close these issues.

WebGPURenderer error loading GLTF