How to stream a realtime microphone feed from the browser to Icecast using Node.js as middleware

I am trying to build a web-based Icecast source client using Node.js as a middleware server.

At a high level, what I’m trying to achieve is:

  1. The web app captures the user’s microphone using navigator.mediaDevices.getUserMedia({ audio: true });
  2. The web app streams the microphone feed to the Node.js server (via WebRTC or a WebSocket; see the browser-side sketch after this list);
  3. The Node.js server relays the microphone stream to an Icecast server using nodeshout.
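
To make the browser side concrete, here is a minimal sketch using MediaRecorder and a plain WebSocket. The ws://localhost:3000/audio endpoint and the 250 ms timeslice are placeholders for whatever transport and chunk size is actually used:

```js
// Browser side (sketch): capture the mic and ship encoded chunks to the Node.js server.
const ws = new WebSocket('ws://localhost:3000/audio');

navigator.mediaDevices.getUserMedia({ audio: true }).then((stream) => {
  // MediaRecorder produces audio/webm (Opus) in most browsers.
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm' });

  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(event.data); // each Blob is one chunk of the webm stream
    }
  };

  // Emit a chunk every 250 ms instead of one big buffer at stop().
  recorder.start(250);
});
```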

So far I have been able to stream a static file from my Node.js server to the Icecast server, and to send a 10-second buffer of the captured microphone feed from the browser to the Node.js server over a socket.
What I’m missing is the step that converts the incoming audio/webm buffer into a continuous MP3 stream that Icecast can accept.
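
On the Node.js side, the socket receive part I have in mind looks roughly like the sketch below (using the ws package; the port, the path, and the handleLiveWebm hook are names I made up for illustration). The idea is to treat the incoming chunks as a stream rather than collecting them into one buffer:

```js
// Node.js side (sketch): accept the browser's webm chunks over a WebSocket and expose
// them as a readable stream that can be piped onward.
const { WebSocketServer } = require('ws');
const { PassThrough } = require('stream');

const wss = new WebSocketServer({ port: 3000, path: '/audio' });

wss.on('connection', (socket) => {
  const webmStream = new PassThrough();

  socket.on('message', (chunk) => webmStream.write(chunk)); // chunk is a Buffer
  socket.on('close', () => webmStream.end());

  // handleLiveWebm is a hypothetical hook for the conversion step sketched further down.
  handleLiveWebm(webmStream);
});
```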

I’ve tried converting it with ffmpeg, but I haven’t managed to make it work on a live stream (as opposed to a complete file).
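
What I had in mind (and where I’m stuck) is spawning ffmpeg once per connection with stdin/stdout pipes, feeding it the webm chunks and pushing the MP3 output to Icecast as it comes out. The sketch below is only an approximation: shout is assumed to be the nodeshout connection I already open for the static-file test, and the shout.send()/shout.sync() calls follow nodeshout’s libshout-style API as I understand it, so they may need adjusting:

```js
// Sketch of the conversion step: keep one ffmpeg process running for the whole session,
// transcoding webm from stdin to MP3 on stdout.
const { spawn } = require('child_process');

function handleLiveWebm(webmStream) {
  const ffmpeg = spawn('ffmpeg', [
    '-i', 'pipe:0',           // webm/Opus chunks from the browser arrive on stdin
    '-f', 'mp3',              // MP3 output
    '-codec:a', 'libmp3lame',
    '-b:a', '128k',
    'pipe:1',                 // encoded MP3 bytes leave on stdout
  ]);

  webmStream.pipe(ffmpeg.stdin);

  ffmpeg.stdout.on('data', (mp3Chunk) => {
    shout.send(mp3Chunk, mp3Chunk.length); // push to Icecast (nodeshout API, approximate)
    shout.sync();                          // let libshout pace the stream
  });

  ffmpeg.stderr.on('data', (line) => console.error(line.toString()));
}
```

This is the part that doesn’t work for me yet, so corrections to the ffmpeg invocation or a completely different approach are welcome.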

Is there any way to achieve this?
I don’t mind switching to a different language for the middleware if this isn’t possible in Node.js.