I am trying to build a web-based Icecast client using Node.js as a middleware server.
At a high level, what I'm trying to achieve is:
- The web app captures the user's microphone using
navigator.mediaDevices.getUserMedia({ audio: true });
- The web app streams the microphone feed to the Node.js server (via WebRTC or a WebSocket)
- The Node.js server relays the microphone stream to an Icecast server using nodeshout
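For the browser-side capture and streaming steps, what I have in mind is roughly the sketch below. It uses MediaRecorder to cut the microphone feed into small audio/webm chunks and pushes each one over a WebSocket; the function name and the timeslice value are just placeholders:

```javascript
// Sketch (browser side): capture the mic and stream audio/webm chunks
// over an already-open WebSocket. 'startStreaming' is a hypothetical name.
function startStreaming(ws, timesliceMs) {
  return navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
    // Opus-in-WebM is what Chrome/Firefox produce for live audio recording.
    const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });
    recorder.ondataavailable = (e) => {
      // Each event delivers one small WebM fragment rather than one big buffer.
      if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) {
        ws.send(e.data);
      }
    };
    recorder.start(timesliceMs); // e.g. 250 => a chunk roughly every 250 ms
    return recorder;
  });
}
```

Calling `recorder.start()` with a timeslice is what turns the capture into a continuous chunk stream instead of the single 10-second buffer I'm sending now.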
So far I have been able to stream a static file from my Node.js server to the Icecast server, and to stream a 10-second buffer of the captured microphone feed to the Node.js server over a socket.
However, I'm missing the steps needed to convert the audio/webm buffer into a streamable MP3.
I've tried converting it with ffmpeg, but that doesn't seem to work with live streams.
Is there any way to achieve this? I don't mind switching to a different language for the middleware if this isn't possible in Node.js.