Hello everyone,
I’m working on a UXP plugin for Photoshop, and I need to read and write very large files (for upload and download). As per the documentation, I’ve been using the ReadableStream approach (Web APIs - ReadableStream). However, I’m running into a couple of issues:
- First, this approach doesn’t seem to be memory efficient: I consistently get warnings about memory usage.
- Second, specifically when downloading, the resulting file is often corrupted or unplayable.
const file = await UXPfolder.createFile(toDownload[index].asset.name, {
  overwrite: true,
});
const fd = await fs.open(file.nativePath, "w+");
let bytesWrittenInTotal = 0;

fetch(toDownload[index].link)
  .then(response => {
    const reader = response.body.getReader();
    return new ReadableStream({
      pull(controller) {
        function push() {
          reader.read().then(({ done, value }) => {
            if (done) {
              controller.close();
              fs.close(fd);
              return;
            }
            const buffer = new Uint8Array(value);
            fs.write(fd, buffer.buffer, 0, buffer.length, bytesWrittenInTotal)
              .then(({ bytesWritten }) => {
                console.log("bytesWritten:", bytesWritten);
                bytesWrittenInTotal += bytesWritten;
              });
            controller.enqueue(buffer);
            push();
          }).catch(error => {
            console.error("Error reading stream:", error);
            controller.error(error);
            fs.close(fd);
          });
        }
        push();
      }
    });
  })
  .then(stream => {
    console.log("stream:", stream);
  })
  .catch(error => console.error("Fetch error:", error));
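One alternative I’ve been considering is a plain sequential read-then-await-write loop: await each `reader.read()`, write that chunk, and only then read the next one, so the offset bookkeeping can’t race and no second ReadableStream accumulates unconsumed chunks. Below is a minimal sketch of that pattern using a mock stream so it runs standalone; in the plugin, `response.body` from `fetch()` would take the place of the mock stream, and `writeChunk` is a hypothetical callback standing in for the UXP `fs.write` call at the running offset:

```javascript
// Mock ReadableStream yielding three chunks (stands in for response.body).
const stream = new ReadableStream({
  start(controller) {
    for (const text of ["chunk1", "chunk2", "chunk3"]) {
      controller.enqueue(new TextEncoder().encode(text));
    }
    controller.close();
  },
});

// Sequentially consume the stream: await each write before reading the
// next chunk, so writes land in order and only one chunk is held in
// memory at a time. Returns the total number of bytes handed to writeChunk.
async function downloadTo(stream, writeChunk) {
  const reader = stream.getReader();
  let total = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    await writeChunk(value, total); // e.g. wrap fs.write(fd, ...) here
    total += value.byteLength;
  }
  return total;
}

// Demo driver: record each write instead of hitting the file system.
const written = [];
const total = await downloadTo(stream, async (chunk, offset) => {
  written.push({ offset, length: chunk.byteLength });
});
console.log(total); // 18
```

Does that sequential pattern look like the right direction for UXP, or is there a reason the pull/enqueue style from the docs is preferred?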
Has anyone here successfully implemented this approach and could offer some guidance?
Alternatively, does anyone have suggestions for a better method to manage the manipulation of large files without running into memory issues?
Thanks in advance for your insights!