WebGPURenderer: Align integer attribute check of WebGL backend. #28918
Conversation
Um, I think this policy can be changed. With WebGL 2, signed and unsigned int32 are also valid inputs for float attributes in the shader. So maybe it's best to just change the line to: `const integer = ( geometryAttribute.gpuType === IntType );`
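In context, the suggested check could look roughly like this (a minimal sketch; the helper function and its name are assumptions for illustration, only the quoted condition comes from the comment above):

```js
import { IntType } from 'three';

// Hypothetical helper illustrating the suggested policy: treat an
// attribute as an integer shader input only when gpuType says so.
function isIntegerAttribute( geometryAttribute ) {

	// With WebGL 2, signed/unsigned int32 arrays are still valid inputs
	// for float attributes, so the array type alone should not force an
	// integer shader type.
	return geometryAttribute.gpuType === IntType;

}
```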
```diff
@@ -476,7 +476,7 @@ ${ flowData.code }

 		const array = dataAttribute.array;

-		if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Uint16Array || array instanceof Int16Array ) === false ) {
+		if ( ( array instanceof Uint32Array || array instanceof Int32Array || array instanceof Int16Array ) === false ) {
```
This just breaks `Uint16Array` support in the WebGL backend and now generates this kind of error:

```
[.WebGL-0x13000c4ea00] GL_INVALID_OPERATION: Vertex shader input type does not match the type of the bound vertex attribute.
```

Firefox:

```
WebGL warning: drawElementsInstanced: Vertex attrib 1 requires data of type INT, but is being supplied with type FLOAT.
```

Are you sure about this one? /cc @Mugen87
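For reference, a minimal sketch of the kind of setup that triggers this: a `Uint16Array`-backed attribute whose `gpuType` is left at the default (`FloatType`), so the shader declares a float input while the backend now binds the data as an integer attribute (the attribute name and values below are illustrative):

```js
import * as THREE from 'three';

const geometry = new THREE.BufferGeometry();

// Positions so the geometry is drawable.
const positions = new Float32Array( [ 0, 0, 0, 1, 0, 0, 0, 1, 0 ] );
geometry.setAttribute( 'position', new THREE.BufferAttribute( positions, 3 ) );

// Uint16Array data intended to be read as float in the shader
// (gpuType defaults to FloatType, so no integer input is requested).
const ids = new Uint16Array( [ 0, 1, 2 ] );
geometry.setAttribute( 'id', new THREE.BufferAttribute( ids, 1 ) );
```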
This change was required to make compressed models work with the WebGL backend, so I would say the previous approach wasn't right. Do you mind demonstrating with a fiddle how the error occurs?

Ideally, the GLSL builder should only generate `ivec*`/`uvec*` if the `gpuType` is `IntType` or when `Uint32Array` or `Int32Array` is used. In all other cases, the shader type should be float.

Related: #28920 (comment)
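A hedged sketch of that rule (the function name and placement are assumptions, not the actual GLSL builder code):

```js
import { IntType } from 'three';

// Decide whether the generated GLSL input should be ivec*/uvec*
// instead of float/vec*: only for an explicit IntType, or for
// 32-bit integer arrays, per the rule described above.
function useIntegerShaderType( attribute ) {

	const array = attribute.array;

	return attribute.gpuType === IntType ||
		array instanceof Int32Array ||
		array instanceof Uint32Array;

}
```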
Fixed #28898.
Description
This PR aligns the WebGL backend of `WebGPURenderer` to `WebGLRenderer` when deciding whether buffer data are integer or not. Buffer data count as integer input only when the GL type is `gl.INT` or `gl.UNSIGNED_INT`, or when `attribute.gpuType` is set to `IntType`.
three.js/src/renderers/webgl/WebGLBindingStates.js, line 345 (commit 9834113)
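Based on the description above, the referenced check in `WebGLBindingStates.js` reads approximately as follows (treat the exact wording as approximate rather than a verbatim quote):

```js
// WebGLBindingStates.js (around line 345): an attribute is integer
// input only for gl.INT / gl.UNSIGNED_INT data or an explicit IntType.
const integer = ( type === gl.INT || type === gl.UNSIGNED_INT || geometryAttribute.gpuType === IntType );
```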