Use pipeThrough/pipeTo to transform then stream to Azure Blob

I’m trying to stream data from an API, transform it, then upload it to Azure Blob Storage from an Azure Function. The limitation I’m running into is that Blob Storage has no ‘writeStream’ I could use as a pipe sink, only uploadStream, so there’s no obvious point in the pipeline to hand the data off to the upload.
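For reference, the upload pattern from the Azure docs (linked at the bottom) boils down to something like this. The connection string, container/blob names and someNodeReadable are placeholders; the point is that uploadStream wants a Node.js Readable it can pull from, rather than giving me a writable to push into:

    import { BlobServiceClient } from '@azure/storage-blob';

    // Placeholder client setup, following the pattern in the linked docs
    const blobServiceClient = BlobServiceClient.fromConnectionString(
      process.env.AZURE_STORAGE_CONNECTION_STRING,
    );
    const containerClient = blobServiceClient.getContainerClient('my-container');
    const blockBlobClient = containerClient.getBlockBlobClient('output.json.gz');

    // uploadStream(readable, bufferSize?, maxConcurrency?, options?) pulls from a
    // Node.js Readable until it ends; it doesn't expose a writable sink to pipe into.
    // someNodeReadable: whatever Readable I manage to produce from my pipeline.
    await blockBlobClient.uploadStream(someNodeReadable);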

My pipeline so far –

    // Wrap an existing writer in a WritableStream so it can be used as a pipeTo sink
    function MyWriter(writer) {
      return new WritableStream({
        write(chunk) {
          console.log('writing');
          writer.write(chunk);
        },
      });
    }

    export const generate = async (iterable) => {
      await ReadableStream.from(iterable)
        .pipeThrough(new TextDecoderStream())
        .pipeThrough(new MyTransformer())            // my own TransformStream
        .pipeThrough(new CompressionStream('gzip'))
        .pipeTo(MyWriter(writeOutStream));           // where does the upload fit?

      /* await blockBlobClient.uploadStream(); */
    };

Am I missing something obvious?

Azure upload stream docs – https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-upload-javascript#upload-a-block-blob-from-a-stream

I’ve tried various combinations of piping, but there’s no obvious point at which to hand the stream to uploadStream.
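One direction I’ve wondered about (untested sketch, not sure it’s the intended pattern): skip the custom WritableStream and instead bridge the web stream to a Node.js Readable with Readable.fromWeb, then hand that to uploadStream. MyTransformer is left as a comment here, blockBlobClient is assumed to be set up as in the docs, and I’ve added a TextEncoderStream because CompressionStream wants bytes rather than strings:

    import { Readable } from 'node:stream';

    export const generate = async (iterable, blockBlobClient) => {
      // Build the whole transform chain as one web ReadableStream...
      const webStream = ReadableStream.from(iterable)
        .pipeThrough(new TextDecoderStream())
        // .pipeThrough(new MyTransformer())    // my transformer would go here
        .pipeThrough(new TextEncoderStream())   // back to bytes before gzip
        .pipeThrough(new CompressionStream('gzip'));

      // ...then bridge it to a Node.js Readable (Node >= 17) so uploadStream can pull from it
      await blockBlobClient.uploadStream(Readable.fromWeb(webStream));
    };

But that drops the pipeTo/WritableStream side entirely, which is why I’m asking whether there’s a more direct way to plug uploadStream into the pipe chain.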