Is it possible to bind an HTML canvas to a JavaScript port of ffmpeg to get a video file as output, without a server?

First of all, I’ve found this repository, which provides an ffmpeg implementation compiled to WebAssembly:
https://github.com/ffmpegwasm/ffmpeg.wasm

I’m curious whether I can somehow bind my canvas output to it and pass some frame times to get a video file out (for example, to visualize a physics simulation).

So far, I’ve set up a basic physics simulator in JavaScript. It renders a bunch of squares based on their x and y coordinates.

class PhysicsObject {
  // ...
  render(canvas, ctx) {
    ctx.fillStyle = this.color;
    ctx.fillRect(this.x - this.w / 2, this.y - this.h / 2, this.w, this.h);
  }
  // ...
}

let timer = performance.now();
// ...

function draw() {
  // ...
  const now = performance.now();
  const dt = (now - timer) / 1000; // seconds elapsed since the last frame
  timer = now;
  // ...

  for (let object of physicsObjects) {
    // ...
    object.update(dt);
    object.render(canvas, ctx);
    // ...
  }
  requestAnimationFrame(draw);
}
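One idea I’ve had (I’m not sure it’s the right approach) is to drop `requestAnimationFrame` for the export and instead step the simulation by a fixed `dt`, snapshotting the canvas as a PNG after each step, so the frames are exactly `1/fps` seconds apart. `frameName`, `canvasToPng`, and `captureFrames` are my own sketch, not part of any library:

```javascript
const FPS = 60;

// Zero-padded file name for an image sequence, e.g. frame_000042.png,
// so an encoder can consume the frames as a numbered sequence.
function frameName(i) {
  return `frame_${String(i).padStart(6, "0")}.png`;
}

// Promise wrapper around HTMLCanvasElement.toBlob.
function canvasToPng(canvas) {
  return new Promise((resolve) => canvas.toBlob(resolve, "image/png"));
}

// Step the simulation with a constant dt (instead of wall-clock time)
// and collect one PNG blob per frame.
async function captureFrames(totalSeconds) {
  const frames = [];
  const dt = 1 / FPS;
  const frameCount = Math.round(totalSeconds * FPS);
  for (let i = 0; i < frameCount; i++) {
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    for (const object of physicsObjects) {
      object.update(dt); // fixed step, deterministic output
      object.render(canvas, ctx);
    }
    frames.push({ name: frameName(i), blob: await canvasToPng(canvas) });
  }
  return frames;
}
```

This reuses `canvas`, `ctx`, and `physicsObjects` from my code above.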

I now need a way to feed my canvas output (plus parameters such as the frame rate) into ffmpeg, but I have no idea where to even start.

If there is a way to bind the canvas output to the ffmpeg port, I’d be happy to delve deeper into the ffmpeg.wasm documentation.
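From skimming the README, I think the encoding side would look roughly like this, assuming the 0.11-style `createFFmpeg` API (the names differ in newer releases) and assuming I already have the frames as a list of `{ name, blob }` PNG snapshots. `encodeArgs` and `encodeFrames` are my own names:

```javascript
// In the browser this would be:
//   import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

// ffmpeg CLI arguments for turning a numbered PNG sequence into an
// H.264 MP4; yuv420p keeps the output playable in common players.
function encodeArgs(fps) {
  return [
    "-framerate", String(fps),
    "-i", "frame_%06d.png", // matches zero-padded names like frame_000042.png
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",
    "out.mp4",
  ];
}

// frames: [{ name: "frame_000000.png", blob: Blob }, ...]
async function encodeFrames(frames, fps) {
  const ffmpeg = createFFmpeg({ log: true });
  await ffmpeg.load();
  // Copy every captured frame into ffmpeg.wasm's in-memory filesystem.
  for (const { name, blob } of frames) {
    ffmpeg.FS("writeFile", name, await fetchFile(blob));
  }
  await ffmpeg.run(...encodeArgs(fps));
  // Read the finished video back out and wrap it for download/playback.
  const data = ffmpeg.FS("readFile", "out.mp4");
  return new Blob([data.buffer], { type: "video/mp4" });
}
```

Is this roughly the intended usage, or is there a more direct way to pipe canvas frames in?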