How to process streaming data from a Node API in Next.js 14?

I have a streaming LLM endpoint that returns GPT-style responses, and I call it through my Node.js proxy server. This is the relevant part of my Node.js POST route handler:

  const postData = {
    messages: messages,
    model_name,
    stream: true,
    temperature: temperature,
    top_p: 1,
  };

  try {
    const headers = {
      "Content-Type": "application/json",
      apiKey: process.env.LLM_API_KEY,
    };

    // Request the LLM endpoint as a stream so chunks arrive as they are generated
    const response = await axios.post(llmChatUri, postData, {
      headers: headers,
      responseType: "stream",
    });

    // Pipe the upstream response straight through to the client
    res.setHeader("Content-Type", "application/json");
    response.data.pipe(res);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }

I want to receive the data this API returns chunk by chunk in the Next.js frontend. The problem I'm facing is parsing the stream: some chunks get lost, so parts of the response go missing.
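
For reference, this is roughly how I read the stream on the client (a simplified sketch: the streamChat helper, the /api/chat path, and the onChunk callback are just placeholders for my actual proxy route and UI update):

  // Client-side reader (Next.js 14 client component or plain browser code).
  // "/api/chat" stands in for my proxy route.
  async function streamChat(messages, onChunk) {
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages }),
    });

    const reader = res.body.getReader();
    const decoder = new TextDecoder();

    while (true) {
      const { done, value } = await reader.read();
      if (done) break;
      // value is a Uint8Array; decode it and hand the text to the UI
      onChunk(decoder.decode(value, { stream: true }));
    }
  }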

I have tried Socket.IO and other techniques to turn the incoming data into JSON objects so that I can extract the answer piece by piece and show it in the frontend, but I either run into data loss or the string-to-JSON conversion turns into a hassle.
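
For context, this is the kind of per-chunk parsing I have been attempting (handleChunk and onToken are placeholder names). I am assuming the upstream sends OpenAI-style SSE lines (data: {...}) with the token under choices[0].delta.content; that shape is an assumption on my part:

  // Accumulate raw text and only parse complete "data: {...}" lines.
  let buffer = "";

  function handleChunk(text, onToken) {
    buffer += text;
    const lines = buffer.split("\n");
    buffer = lines.pop(); // keep the last, possibly incomplete, line for the next chunk

    for (const line of lines) {
      const trimmed = line.trim();
      if (!trimmed.startsWith("data:")) continue;

      const payload = trimmed.slice(5).trim();
      if (payload === "[DONE]") continue;

      try {
        // Assumed OpenAI-like shape: choices[0].delta.content holds the next token
        const token = JSON.parse(payload).choices?.[0]?.delta?.content;
        if (token) onToken(token);
      } catch {
        // Lines that fail to parse are silently skipped, which is where I suspect data goes missing
      }
    }
  }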