Handling files as a ReadableStream: is it fully supported?

Hello everyone,

I’m working on a UXP plugin for Photoshop and I need to be able to read and write very large files (for upload and download). As per the documentation, I’ve been using the ReadableStream approach (Web APIs - ReadableStream). However, I’m encountering a couple of issues:

    • First, this approach doesn’t seem to be memory efficient: I’m consistently receiving warnings about memory usage.
    • Second, when downloading files, the resulting file is often corrupted or unplayable. Here is the download code I’m using:
    const file = await UXPfolder.createFile(toDownload[index].asset.name, {
      overwrite: true,
    });
    
    const fd = await fs.open(file.nativePath, "w+");
    let bytesWrittenInTotal = 0;

    fetch(toDownload[index].link)
      .then(response => {
    
        const reader = response.body.getReader();
        return new ReadableStream({
          pull(controller) {

            function push() {
              reader.read().then(({ done, value }) => {
                if (done) {
                  controller.close();
                  fs.close(fd);
                  return;
                }
                const buffer = new Uint8Array(value);
                fs.write(fd, buffer.buffer, 0, buffer.length, bytesWrittenInTotal).then(({ bytesWritten }) => {
                  console.log("bytesWritten :", bytesWritten)
                  bytesWrittenInTotal += bytesWritten;
                });
                controller.enqueue(buffer);

                push();
              }).catch(error => {
                console.error('Error reading stream:', error);
                controller.error(error);
                fs.close(fd);
              });
            }
            push();
          }
        });
      })
      .then(stream => {
        console.log("stream :", stream)
      })
      .catch(error => console.error('Fetch error:', error));

Has anyone here successfully implemented this approach and could offer some guidance?
Alternatively, does anyone have a suggestion for a better way to handle large files without running into memory issues?

Thanks in advance for your insights!

@aerb
Could you take a look at the sample here:
ReadableStream - Web APIs | MDN?
Alternatively, I think you can use ReadableStream.pipeTo(WritableStream).
The WritableStream object would be created as follows:

let fd;
let totalBytesWritten = 0;

const ws = new WritableStream({
    async start(controller) {
        try {
            fd = await fs.open(path, "w+");
            totalBytesWritten = 0;
            ...
        } catch (e) {
            controller.error(e);
        }
    },
    async write(chunk, controller) {
        try {
            const arrayBufferSize = chunk.byteLength;
            let bytesToWrite = chunk.byteLength;

            // Loop in case fs.write() performs a partial write:
            // keep writing until the whole chunk has been flushed.
            while (bytesToWrite !== 0) {
                const { bytesWritten } = await fs.write(
                    fd, chunk, arrayBufferSize - bytesToWrite, bytesToWrite, totalBytesWritten);
                bytesToWrite -= bytesWritten;
                totalBytesWritten += bytesWritten;
            }
        } catch (e) {
            controller.error(e);
        }
    },
    async close() {
        await fs.close(fd);
    },
    async abort() {
        // The file descriptor can also be closed here if the stream is aborted.
    },
});

const response = await fetch(URL, ...);
await response.body.pipeTo(ws);
......

The response.body returned by fetch() is already a ReadableStream.
I think your code is wrapping it in a second ReadableStream, so two copies of the same stream data end up in memory, which roughly doubles the memory consumption.

I think the ReadableStream.pipeTo() API is the right one for your purpose.
The WritableStream’s write(chunk, controller) callback would be called repeatedly as data becomes available. (WritableStream: WritableStream() constructor - Web APIs | MDN)
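
To tie this back to your original snippet, here is a rough, untested sketch of what the download step could look like with pipeTo(). UXPfolder and toDownload come from your post, and createFileWritableStream(path) is a hypothetical helper that simply returns the WritableStream built exactly as in the code above.

// Rough sketch, not tested: combines the original createFile() call with
// response.body.pipeTo(). createFileWritableStream() is a hypothetical
// helper returning the WritableStream constructed in the reply above.
async function downloadAsset(entry) {
    // Create the destination file first, as in the original post.
    const file = await UXPfolder.createFile(entry.asset.name, { overwrite: true });

    const response = await fetch(entry.link);
    if (!response.ok) {
        throw new Error("Download failed: " + response.status);
    }

    // pipeTo() consumes response.body directly, so no second ReadableStream
    // is created and only one chunk is held in memory at a time. The promise
    // resolves once the WritableStream's close() callback has run.
    await response.body.pipeTo(createFileWritableStream(file.nativePath));
}

// e.g. await downloadAsset(toDownload[index]);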