I’m attempting to build an audio player component in a React/Next.js app with a wavesurfer.js visualisation. My audio data is raw binary/wave audio, and I have successfully been able to “play” it with an AudioContext and an AudioBuffer.
I’ve hit two blockers that I haven’t made progress on despite researching similar questions:
- (Simple?) I cannot “connect” my AudioContext (and the created audioBuffer) to an `<audio />` DOM element — it always shows 0s and the controls stay disabled.
- I do not know how to generate an HTMLMediaElement from my audio data (audioBuffer in the code below) to pass into WaveSurfer.
Note – I’ve seen similar examples which use URL.createObjectURL() – those URLs return 404 in Next.js for me, so I haven’t had success with that approach. Maybe this is the missing piece and the ways I’ve tried aren’t possible.
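For reference, the pattern those examples use is roughly the following (my simplified sketch, not working code from my app: `wavBytes` stands for a complete WAV file as an ArrayBuffer, which would have to come from encoding the AudioBuffer somehow, and the element parameter is reduced to a structural `{ src }` type standing in for the HTMLAudioElement):

```typescript
// Sketch of the blob-URL approach from similar examples I've seen.
// `wavBytes` is assumed to be WAV-encoded audio bytes; `el` stands in
// for an HTMLAudioElement (only `src` is used, so a structural type
// keeps the sketch self-contained).
function attachWavToAudioElement(
  wavBytes: ArrayBuffer,
  el: { src: string }
): string {
  const blob = new Blob([wavBytes], { type: "audio/wav" });
  // A blob: URL is served from memory, not from a Next.js route,
  // so Next.js routing should never be involved in resolving it.
  const url = URL.createObjectURL(blob);
  el.src = url;
  return url;
}
```

If this is the right direction, my remaining gap would be producing `wavBytes` from the AudioBuffer in the first place.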
How do I proceed with this audio player component?
Code so far:
import { FC, useEffect, useRef, useState } from "react";
import { Box, Button } from "@chakra-ui/react";
import { fetchAudioData } from "@/api/v1/audio";
import WaveSurfer from "wavesurfer.js";
import { AudioPlayerProps } from "./AudioPlayer.types";

export const MyAudioPlayer: FC<AudioPlayerProps> = ({}) => {
  const waveSurferRef = useRef<HTMLDivElement | null>(null);
  const audioRef = useRef<HTMLAudioElement | null>(null);
  const [audioContext, setAudioContext] = useState<AudioContext | null>(null);
  const [audioBuffer, setAudioBuffer] = useState<AudioBuffer | null>(null);
  const [wavesurfer, setWaveSurfer] = useState<WaveSurfer | null>(null);

  const setupAudio = async () => {
    const data: Uint8Array = await fetchAudioData(); // Remote call for data

    const audioCtx = new window.AudioContext();

    // Create AudioBuffer (my audio format: mono, unsigned 8-bit, 44100 Hz)
    const audioBuffer = audioCtx.createBuffer(
      1, // Channels (mono)
      data.length, // Number of samples
      44100 // Sample rate
    );

    const channelData = audioBuffer.getChannelData(0);
    // Copy audio data into the channel data array, mapping
    // unsigned 8-bit samples (0..255) to floats in -1..1
    for (let i = 0; i < data.length; i++) {
      channelData[i] = (data[i] - 128) / 128;
    }

    setAudioBuffer(audioBuffer);
    setAudioContext(audioCtx);
  };

  const prepareAudioElement = () => {
    if (audioRef.current && audioContext) {
      audioContext.createMediaElementSource(audioRef.current);
      audioRef.current.controls = true;
      audioRef.current.play();
    }
  };

  const prepareWaveSurfer = () => {
    if (waveSurferRef.current && audioBuffer) {
      const ws = WaveSurfer.create({
        container: waveSurferRef.current,
        waveColor: "rgb(200, 0, 200)",
        progressColor: "rgb(100, 0, 100)",
        barWidth: 10,
        barRadius: 10,
        barGap: 2,
        // media: ?, // UNSURE HOW TO CREATE THIS WITH MY AUDIO CONTEXT
        peaks: [audioBuffer.getChannelData(0)],
        duration: 1, // Todo: populate from audio track
        mediaControls: true,
      });
      setWaveSurfer(ws);
    }
  };

  useEffect(() => {
    prepareAudioElement();
    prepareWaveSurfer();
  }, [audioRef, audioContext, audioBuffer, waveSurferRef]);

  const playMyAudio = () => {
    if (audioRef.current && audioContext) {
      audioRef.current.play(); // DOES NOTHING

      // PLAY AUDIO IMMEDIATELY, DIRECTLY FROM CONTEXT (this works)
      // const source = audioContext.createBufferSource();
      // source.buffer = audioBuffer;
      // source.connect(audioContext.destination);
      // source.start();
    }
  };

  return (
    <Box w="500px">
      <div ref={waveSurferRef} />
      <audio ref={audioRef} controls />
      <Button onClick={() => setupAudio()}>Setup Audio Track</Button>
      <Button onClick={() => playMyAudio()}>Play</Button>
    </Box>
  );
};
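In case it clarifies what I mean by “encoding”, here is a minimal mono 16-bit WAV encoder I sketched while experimenting (my own untested sketch, not part of the component above; it takes raw Float32 samples like those returned by audioBuffer.getChannelData(0), plus a sample rate):

```typescript
// Minimal mono 16-bit PCM WAV encoder (sketch). Takes Float32 samples
// in the -1..1 range, as produced by AudioBuffer.getChannelData, and
// returns a complete WAV file as an ArrayBuffer.
function encodeWav(samples: Float32Array, sampleRate: number): ArrayBuffer {
  const headerSize = 44;
  const dataSize = samples.length * 2; // 16-bit = 2 bytes per sample
  const out = new ArrayBuffer(headerSize + dataSize);
  const view = new DataView(out);

  const writeString = (offset: number, s: string) => {
    for (let i = 0; i < s.length; i++) view.setUint8(offset + i, s.charCodeAt(i));
  };

  writeString(0, "RIFF");
  view.setUint32(4, 36 + dataSize, true); // RIFF chunk size
  writeString(8, "WAVE");
  writeString(12, "fmt ");
  view.setUint32(16, 16, true); // fmt chunk size
  view.setUint16(20, 1, true); // audio format: PCM
  view.setUint16(22, 1, true); // channels: mono
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * 2, true); // byte rate (mono, 16-bit)
  view.setUint16(32, 2, true); // block align
  view.setUint16(34, 16, true); // bits per sample
  writeString(36, "data");
  view.setUint32(40, dataSize, true);

  // Clamp each float sample and convert to signed 16-bit little-endian.
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i]));
    view.setInt16(headerSize + i * 2, s < 0 ? s * 0x8000 : s * 0x7fff, true);
  }
  return out;
}
```

If something like this is correct, I assume the output could be wrapped in a Blob and fed to the `<audio>` element and/or WaveSurfer, but I haven’t managed to wire that up.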
Similar questions I’ve looked at:
- React Web Audio API – Play, pause and export loaded audio file
- Using AudioBuffer as a source for a HTMLAudioElement?
- convert audio buffer to play as audio element
- https://webaudioapi.com/samples/audio-tag/
- TypeScript/React with Web Audio API errors with createMediaElementSource