How can I use a matrix from CSS’s matrix3d in a WebGL2 shader?

I currently have a CSS matrix3d transform I like; let's take this one as an example:

transform: matrix3d(1.3453, 0.1357, 0.0, 0.0003, 0.2096, 1.3453, 0.0, 0.0003, 0.0, 0.0, 1.0, 0.0, -100.0, -100.0, 0.0, 1.0);
transform-origin: 0 0;

I am using a WebGL2 canvas and I’d like an image I’ve drawn on it to have the same transformation.

To achieve this, I believe I need a vertex shader that takes in the matrix and multiplies the vertex positions by it:

#version 300 es

in vec2 a_position;          // vertex position in pixels
in vec2 a_texCoord;
uniform vec2 u_resolution;   // canvas size in pixels
out vec2 v_texCoord;

uniform mat4 u_matrix;       // the matrix from the CSS matrix3d transform

void main() {
  // convert the position from pixels to 0.0 -> 1.0
  vec2 zeroToOne = a_position / u_resolution;
  // convert from 0 -> 1 to 0 -> 2
  vec2 zeroToTwo = zeroToOne * 2.0;
  // convert from 0 -> 2 to -1 -> +1 (clip space)
  vec2 clipSpace = zeroToTwo - 1.0;

  // flip Y so the origin is at the top left, then apply the matrix
  gl_Position = u_matrix * vec4(clipSpace * vec2(1, -1), 0, 1);
  v_texCoord = a_texCoord;
}

I can then pass the matrix to the shader in my JavaScript code:

const matrixLocation = gl.getUniformLocation(program, "u_matrix");
// transpose is false, so the values are interpreted as column-major
gl.uniformMatrix4fv(matrixLocation, false, new Float32Array(matrix));
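
In case it matters, u_resolution is just the canvas size in pixels, set once along these lines (a rough sketch of my setup; the buffer and attribute code is omitted):

const resolutionLocation = gl.getUniformLocation(program, "u_resolution");
gl.uniform2f(resolutionLocation, gl.canvas.width, gl.canvas.height);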

What I'm currently stuck on is figuring out what the value of matrix in my code should be. I've tried reading up on how the matrix3d transform is laid out, and rearranging the matrix based on that, but with no luck.
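
For example, since (as far as I understand) CSS matrix3d lists its 16 values in column-major order, and uniformMatrix4fv also expects column-major data when transpose is false, one of my attempts was to pass the values from the transform above straight through:

const matrix = [
   1.3453,   0.1357, 0.0, 0.0003, // column 1
   0.2096,   1.3453, 0.0, 0.0003, // column 2
   0.0,      0.0,    1.0, 0.0,    // column 3
  -100.0,  -100.0,   0.0, 1.0,    // column 4
];
gl.uniformMatrix4fv(matrixLocation, false, new Float32Array(matrix));

I suspect part of the problem is that the CSS matrix works in pixel coordinates (note the -100 translation) while my shader applies it after converting the positions to clip space, but I don't know how to account for that.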

How can I use a matrix3d transform in the WebGL2 shader here?