Is there a way to proxy a streamed answer in AWS Lambda js?

I’m using LoopBack 4 as my backend framework and I deploy my APIs to AWS Lambda with the Serverless framework (I followed this guide: https://medium.com/@hitesh.gupta/running-loopback-v4-on-aws-lambda-56064a97b5c3). I want to build a Bedrock API that streams the model response (GPT-style), but I can’t get the response to stream: it is stuck in buffering mode and arrives all at once.
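For context, the upstream call I’m trying to proxy looks roughly like this (a sketch using @aws-sdk/client-bedrock-runtime; the model ID and request body are placeholders, adapt them to your model):

const {
  BedrockRuntimeClient,
  InvokeModelWithResponseStreamCommand,
} = require('@aws-sdk/client-bedrock-runtime');

const client = new BedrockRuntimeClient({region: 'us-east-1'});

// Yields the model output chunk by chunk
async function* streamCompletion(prompt) {
  const response = await client.send(
    new InvokeModelWithResponseStreamCommand({
      modelId: 'anthropic.claude-v2', // placeholder model ID
      contentType: 'application/json',
      accept: 'application/json',
      body: JSON.stringify({prompt, max_tokens_to_sample: 300}),
    }),
  );
  // response.body is an async iterable of events carrying raw chunk bytes
  for await (const event of response.body) {
    if (event.chunk?.bytes) {
      // each chunk payload is a small JSON document (e.g. {"completion": "..."})
      yield new TextDecoder().decode(event.chunk.bytes);
    }
  }
}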

serverless.yml

functions:
  stream:
    handler: lambda.handler
    timeout: 60
    url:
      cors: true
      invokeMode: RESPONSE_STREAM
  app:
    handler: lambda.handler # reference the file and exported method
    events: # events trigger lambda functions
      - http: # this is an API Gateway HTTP event trigger
          path: /
          method: ANY
          cors: true
      - http: # all routes get proxied to the Express router
          path: /{proxy+}
          method: ANY
          cors: true

lambda.js

const awsServerlessExpress = require('aws-serverless-express');
const AWS = require('aws-sdk');
const application = require('./dist');
const app = new application.StudyApplication({
  rest: {
    openApiSpec: {
      setServersFromRequest: true,
    },
  },
});
const server = awsServerlessExpress.createServer(app.restServer.requestHandler);
exports.handler = async (event, context) => {
  console.log('Event: ', event);

  await app.boot();
  return awsServerlessExpress.proxy(server, event, context, 'PROMISE').promise;
};
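As far as I understand, only the Function URL with invokeMode: RESPONSE_STREAM can stream at all (the API Gateway http events always buffer), and even then the handler has to be wrapped with awslambda.streamifyResponse; aws-serverless-express collects the whole Express response and returns it as a single payload. A minimal sketch of the streaming handler shape, assuming the nodejs18.x runtime where awslambda is a global:

// lambda-stream.js: a bare streaming handler, not yet proxying LoopBack
exports.handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Attach status code and headers to the stream before writing the body
    responseStream = awslambda.HttpResponseStream.from(responseStream, {
      statusCode: 200,
      headers: {'Content-Type': 'text/event-stream'},
    });
    responseStream.write('data: first chunk\n\n');
    responseStream.write('data: second chunk\n\n');
    responseStream.end();
  },
);

Something like this streams when invoked directly; what I’m missing is how to plug the LoopBack/Express response into responseStream.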

my test controller:

import {inject} from '@loopback/core';
import {post, Response, RestBindings} from '@loopback/rest';
import {Readable} from 'stream';
export class PromptController {
  constructor() {}
  @post('/prompt', {
    responses: {
      '200': {
        description: 'Stream fixed text in chunks',
        content: {
          'text/event-stream': {},
        },
      },
    },
  })
  async streamFixedText(
    @inject(RestBindings.Http.RESPONSE) response: Response,
  ): Promise<void> {
    response.set('Content-Type', 'text/event-stream');
    response.set('Connection', 'keep-alive');
    const chunks = [
      'This is the first chunk of the text.',
      ' Here comes the second chunk.',
      ' And finally, the last chunk.',
    ];

    // read() can be invoked more than once, so keep the cursor and timer
    // outside of it and only start the interval on the first call
    let index = 0;
    let intervalId: NodeJS.Timeout | undefined;

    const readable = new Readable({
      read() {
        if (intervalId) return;
        intervalId = setInterval(() => {
          if (index < chunks.length) {
            // SSE frames end with a blank line, hence the double newline
            this.push(`data: ${chunks[index]}\n\n`);
            index++;
          } else {
            this.push(null);
            clearInterval(intervalId);
          }
        }, 1000);
      },
    });

    response.on('close', () => {
      readable.destroy();
    });

    readable.pipe(response);
  }
}
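To check whether the controller itself streams outside Lambda, a small script along these lines can help (Node 18+; the URL assumes the app runs locally on port 3000):

// test-stream.js: prints chunks as they arrive; with real streaming the
// three chunks should show up roughly one second apart
async function main() {
  const res = await fetch('http://localhost:3000/prompt', {method: 'POST'});
  const decoder = new TextDecoder();
  for await (const chunk of res.body) {
    process.stdout.write(decoder.decode(chunk, {stream: true}));
  }
}

main().catch(console.error);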

I don’t know if there is a way to make it work.

Thanks a lot for taking the time to read my issue.
ITZouzouille

I also tried wrapping the handler with the streamifyResponse helper (the awslambda global from the Node.js Lambda runtime), but it seems the event object for a Lambda Function URL request doesn’t have the same shape as the one we get through API Gateway.
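The difference, as far as I can tell (my assumption: Function URLs deliver API Gateway payload format 2.0, while the REST http events use format 1.0, which is the only shape aws-serverless-express understands):

// Where the method and path live in each event shape
function getMethodAndPath(event) {
  if (event.requestContext && event.requestContext.http) {
    // Function URL / HTTP API, payload format 2.0
    return {method: event.requestContext.http.method, path: event.rawPath};
  }
  // API Gateway REST, payload format 1.0
  return {method: event.httpMethod, path: event.path};
}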