WebGL 2.0 unsigned integer input variable

I’ve been trying to bind a Uint32Array buffer to a uint vertex shader input variable.

All the information I can find online says that it should be possible, yet I get the same error no matter what I do:

[.WebGL-0x62401b7e200] GL_INVALID_OPERATION: Vertex shader input type does not match the type of the bound vertex attribute.

Here’s my vertex shader:

#version 300 es
in vec4 a_ipos;
in uint a_cdata;
uniform vec2 ures;

void main() {
    // Unpack 5-5-5 packed coordinates: mask each field, then shift it down to the low bits.
    uint x = (a_cdata & 0x7C00u) >> 10;
    uint y = (a_cdata & 0x03E0u) >> 5;
    uint z = a_cdata & 0x001Fu;

    vec4 pos = a_ipos + vec4(float(x), float(y), float(z), 1.);

    // Correct the x coordinate for the viewport aspect ratio.
    gl_Position = vec4(pos.x * ures.y / ures.x, pos.yzw);
}
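
(For reference, each entry of chunkData is meant to pack x, y and z into 5 bits each, roughly along these lines; the packing itself isn’t the issue, only the attribute type is:)

// Rough sketch of how a chunkData entry is packed (x, y, z each in the 0–31 range).
const packed = (x << 10) | (y << 5) | z;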

And here’s my WebGL call to point the buffer at the attribute:

gl.bindBuffer(gl.ARRAY_BUFFER, chunkBuffer);
gl.bufferData(gl.ARRAY_BUFFER, new Uint32Array(chunkData), gl.STATIC_DRAW);
gl.enableVertexAttribArray(cdataloc);
gl.vertexAttribPointer(
    cdataloc,
    1,                // one component per vertex
    gl.UNSIGNED_INT,  // component type
    false,            // normalized
    0,                // stride
    0                 // offset
);

It seems gl.UNSIGNED_INT is not the same type as uint.
However, the GLSL ES 3.00 reference card says that uint is a 32-bit unsigned integer, and MDN agrees that gl.UNSIGNED_INT is a 32-bit unsigned integer.
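
For what it’s worth, logging the constant in the console gives the expected enum for a 32-bit unsigned integer:

// Sanity check: GLenum value of gl.UNSIGNED_INT.
console.log(gl.UNSIGNED_INT); // 5125 (0x1405)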

I have no idea what I’m doing wrong. I’ve tried gl.INT together with in int ... and still nothing. Changing the precision of the integers to highp doesn’t change anything either (assuming highp would even widen the integers to 32 bits, which doesn’t seem to be the case).
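
The signed-integer attempt looked roughly like this (with a_cdata declared as in int in the shader):

// Signed 32-bit attempt: `in int a_cdata;` on the GLSL side.
gl.bufferData(gl.ARRAY_BUFFER, new Int32Array(chunkData), gl.STATIC_DRAW);
gl.vertexAttribPointer(cdataloc, 1, gl.INT, false, 0, 0);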

Changing the type to float does work, but any other type I try doesn’t.
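
Concretely, something along these lines does not trigger the error (with a_cdata declared as in float and the bit unpacking adjusted accordingly):

// Float fallback: `in float a_cdata;` on the GLSL side.
gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(chunkData), gl.STATIC_DRAW);
gl.vertexAttribPointer(cdataloc, 1, gl.FLOAT, false, 0, 0);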

I also tried gl.SHORT, to rule out the theory that GLSL’s int might be a 16-bit integer. Still the same error.

This question: WebGL: How to Use Integer Attributes in GLSL

doesn’t solve my issue, because the documentation its answer relies on is outdated: GLSL ES 1.00 doesn’t allow integer attributes at all, per its specification.

However, the GLSL ES 3.00 specification does allow integer vertex shader inputs, as long as the type is not one of the following: boolean, opaque type, array, or structure.
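
So, as I read the spec, declarations like the following should all be legal vertex shader inputs in GLSL ES 3.00 (the names other than a_cdata are made up for illustration), while the commented-out ones are not:

#version 300 es
// Sketch based on my reading of the GLSL ES 3.00 spec: (unsigned) integer
// scalars and vectors are valid vertex shader inputs; booleans, opaque types,
// arrays and structs are not.
in uint a_cdata;    // what I'm trying to use
in ivec3 a_cell;    // hypothetical, but should also be legal
// in bool a_flag;     // illegal: boolean input
// in uint a_bits[2];  // illegal: array input

void main() {
    gl_Position = vec4(vec3(a_cell) + vec3(float(a_cdata)), 1.0);
}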