Streaming audio from a PyAudio stream to be played on a web page in JavaScript

I’m trying to take an audio stream from my computer using PyAudio (specifically the pyaudiowpatch fork), stream that audio data over a websocket (using the websockets library), and play it on a web page using JavaScript. To do this, I have the following Python code on the server side:

import pyaudiowpatch as pyaudio

import asyncio
from websockets.server import serve


CHUNK_SIZE = 512


async def main():
    with pyaudio.PyAudio() as p:
        # 23 is the index of the device I'm trying to get the data from
        audio_device = p.get_device_info_by_index(23)
        with p.open(
            format=pyaudio.paInt16,
            channels=audio_device["maxInputChannels"],
            rate=int(audio_device["defaultSampleRate"]),
            frames_per_buffer=CHUNK_SIZE,
            input=True,
            input_device_index=audio_device["index"]
        ) as stream:
            async def handler(ws):
                print(f"Connection from {ws}")
                while stream.is_active():
                    chunk = stream.read(CHUNK_SIZE)
                    await ws.send(chunk)
                    print(f"Sent chunk to {ws}")
                print(f"Closing connection to {ws}")

            async with serve(handler, host="localhost", port=8081):
                print("Listening...")
                await asyncio.Future()

asyncio.run(main())

This code works fine, and when I make a connection from the web page, the data is sent over just fine. My client-side JavaScript is as follows:

const host = "ws://localhost:8081"
const ws = new WebSocket(host, "testProtocol")
const audioContext = new AudioContext()
const source = audioContext.createBufferSource()

source.connect(audioContext.destination)

document.getElementById("playButton").onclick = () => {
    source.start()
}

ws.onmessage = (event) => {
    event.data.arrayBuffer().then(data => {
        audioContext.decodeAudioData(data, (buffer) => {
            source.buffer = buffer
        })
    })
}

The idea is that the WebSocket client receives the data as a Blob, converts it into an ArrayBuffer, and hands it to the Web Audio API to be played. However, I get the following error from the JavaScript: Uncaught (in promise) DOMException: The buffer passed to decodeAudioData contains an unknown content type. Clearly this means the audio data I am passing is not in a format that decodeAudioData can understand — presumably because the stream carries raw, headerless PCM samples rather than a complete encoded file (WAV, MP3, etc.).
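For context, my understanding is that the raw paInt16 samples would need to be converted manually into the Float32 format the Web Audio API works with, something like the sketch below (the function name is mine, and it assumes mono, native little-endian samples):

```javascript
// Sketch: convert raw 16-bit PCM (as produced by paInt16) into Float32
// samples in the [-1.0, 1.0] range used by Web Audio API AudioBuffers.
// Assumes the platform's typed-array endianness matches the stream's.
function int16ToFloat32(arrayBuffer) {
    const int16 = new Int16Array(arrayBuffer)
    const float32 = new Float32Array(int16.length)
    for (let i = 0; i < int16.length; i++) {
        // Scale from [-32768, 32767] down to [-1.0, 1.0)
        float32[i] = int16[i] / 32768
    }
    return float32
}
```

But I don't know whether this manual conversion is the intended approach, or whether there is a proper streaming mechanism I should be using instead.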

What is the correct way to play the audio being streamed over to the web page?