How should I pass multiple outputs to model.fit()?

I’m working on a simple model that produces two outputs (an image and a label) per input (a latent vector). This is reflected in the model topology: one input layer feeding two output heads. Briefly, it looks roughly like this:

const input = tf.input({ shape: [64, 64, 3] })

// several convolution and activation layers here; call the resulting tensor `convolutions`

const imageOutput = tf.layers.conv2dTranspose({ kernelSize: 3, strides: 1, filters: 3, padding: 'same', activation: 'relu' }).apply(convolutions)

// This is a custom layer I wrote that invokes another model.
const labelLayer = new DiscriminationLayer({ discriminator: this.discriminator }).apply(imageOutput)

this.model = tf.model({ inputs: input, outputs: [imageOutput, labelLayer] })
this.model.compile({ optimizer: this.optimizer, loss: ['meanSquaredError', 'binaryCrossentropy'], metrics: ['accuracy'] })
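
For context, a stripped-down version of DiscriminationLayer that captures what mine does is roughly the following (the real one has extra bookkeeping, and I'm assuming here that the discriminator emits a single value per sample):

class DiscriminationLayer extends tf.layers.Layer {
  constructor(config) {
    super({ name: 'discrimination' })
    this.discriminator = config.discriminator
  }

  // Forward the generated image through the wrapped discriminator model.
  call(inputs) {
    const x = Array.isArray(inputs) ? inputs[0] : inputs
    return this.discriminator.apply(x)
  }

  // The discriminator produces one value per sample.
  computeOutputShape(inputShape) {
    return [inputShape[0], 1]
  }

  static get className() { return 'DiscriminationLayer' }
}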

Building and compiling the model runs fine and gives no errors. The problem appears when I call model.fit() to train it, which I do like so:

await this.model.fit(this.getRandomLatentVectors(10), [this.generateTrainingImages(10), tf.ones([10, 1])], {
  batchSize: 2, epochs: 1, validationSplit: 0.1, shuffle: true, verbose: true
})
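
The two helpers aren't shown above, but they just produce tensors whose shapes match the model's input and first output. Simplified stand-ins would look like this (the real implementations generate actual training data, of course):

getRandomLatentVectors(n) {
  // one sample per "latent vector", matching the input layer's shape
  return tf.randomNormal([n, 64, 64, 3])
}

generateTrainingImages(n) {
  // target images for the image output head
  return tf.randomUniform([n, 64, 64, 3])
}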

This keeps generating the following error:

/node_modules/@tensorflow/tfjs-layers/dist/tf-layers.node.js:19223
            if (array.shape.length !== shapes[i].length) {
                            ^

TypeError: Cannot read properties of undefined (reading 'length')

I’ve checked all of the tensors I’m passing in, and they all have the expected batch and feature dimensions. For reference, this is roughly how I checked:
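
console.log(this.getRandomLatentVectors(10).shape)   // [10, 64, 64, 3]
console.log(this.generateTrainingImages(10).shape)   // [10, 64, 64, 3]
console.log(tf.ones([10, 1]).shape)                  // [10, 1]
console.log(this.model.outputs.map(o => o.shape))    // expecting [[null, 64, 64, 3], [null, 1]]

What is the problem then? Any advice would be appreciated.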