OpenAI Beta Assistant stream response in NextJS

I’m developing an app using the OpenAI API. I’m using the Assistants API because I’ve set up an assistant that answers based on one specific document I’ve already uploaded at https://platform.openai.com/playground?assistant.

OpenAI’s documentation on streaming responses led me to this code:

   const { messages } = await request.json();
   const userMessage = messages && messages[messages.length - 1];

   const thread = await openai.beta.threads.create();
   
   const message = await openai.beta.threads.messages.create(thread.id, {
      ...userMessage,
   });

   let resp = '';

   const run = openai.beta.threads.runs.createAndStream(thread.id, {
      assistant_id: process.env.OPENAI_TABLE_23A_ASSISTANT_ID as string,
   });

   const stream = run.on('textDelta', (textDelta) => {
      // console.log(textDelta.value);
      resp = resp + textDelta.value!;
      return textDelta.value;
   });

   await run.finalRun();

I tested it and streaming works inside run.on(), but I need to pass that response to the chat section of the user interface as a stream.
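
The direction I’ve been exploring is to wrap the run events in a standard web ReadableStream and return that from the route handler instead of the buffered resp string. This is only a rough sketch: I’m assuming the stream returned by createAndStream also emits 'end' and 'error' events, and that the client can consume a plain text stream.

   const encoder = new TextEncoder();

   // Rough idea: forward each text delta into a web ReadableStream
   const textStream = new ReadableStream<Uint8Array>({
      start(controller) {
         run.on('textDelta', (textDelta) => {
            // push each token to the HTTP response as soon as it arrives
            controller.enqueue(encoder.encode(textDelta.value ?? ''));
         });
         run.on('end', () => controller.close());
         run.on('error', (err) => controller.error(err));
      },
   });

   return new Response(textStream, {
      headers: { 'Content-Type': 'text/plain; charset=utf-8' },
   });

What I don’t know is whether useChat will accept a raw text stream like this out of the box, or whether it expects the Vercel AI SDK’s own stream format, which is part of what I’m asking.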

On the client I’m using:

   // Vercel AI imports
   import { useChat } from 'ai/react';

   export default function ChatInput({ isDrawerOpen }: ChatInputProps) {
      const {
         input,
         handleInputChange: aiInputChange,
         handleSubmit,
         isLoading: aiIsLoading,
         messages,
      } = useChat();

      // ...rest of my code

I’m just missing how to pass the stream to the client.
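
In case it’s relevant, I’m pointing useChat at my route explicitly. The path below is just a placeholder for wherever the POST handler actually lives:

   const { input, handleInputChange, handleSubmit, isLoading, messages } = useChat({
      // hypothetical path; adjust to the actual location of the POST route handler
      api: '/api/assistant',
   });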

I’d be really grateful for your help, and I hope this can also help other developers.

This is my full code:

   import { NextRequest, NextResponse } from 'next/server';
   import { openai } from '@/lib/openai';

   export const runtime = 'edge';

   export async function POST(request: NextRequest) {
      try {
         const { messages } = await request.json();

         const userMessage = messages && messages[messages.length - 1];

         const thread = await openai.beta.threads.create();

         const message = await openai.beta.threads.messages.create(thread.id, {
            ...userMessage,
         });

         console.log(message);

         let resp = '';

         const run = openai.beta.threads.runs.createAndStream(thread.id, {
            assistant_id: process.env.OPENAI_TABLE_23A_ASSISTANT_ID as string,
         });

         const stream = run.on('textDelta', (textDelta) => {
            // console.log(textDelta.value);
            resp = resp + textDelta.value!;
            return textDelta.value;
         });

         await run.finalRun();

         return new NextResponse(resp);
      } catch (error) {
         console.error('Error occurred:', error);

         // returning a plain object from a route handler doesn't produce a response
         return NextResponse.json({ error: 'An error occurred' }, { status: 500 });
      }
   }