[bug] Upload PSD file

Hi there!

I’m trying to upload my PSD file using the strategy of duplicating the active document, saving the duplicate to a temporary folder, and then reading that file so I can upload it to my S3 bucket.
For now, the upload itself works fine, but I’m struggling with the read method.

My PSD file is around 1.6 MB, but when I read it with the following code I get an ArrayBuffer of only 997180 bytes (~997.2 kB):

let data = await tempFile.read({ format: UXP.storage.formats.binary }); // => ArrayBuffer(997180)
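
For context, the whole read step looks roughly like this (a minimal sketch assuming the standard UXP storage API; readTempPsd and fileName are just illustrative names):

const uxp = require('uxp');
const fs = uxp.storage.localFileSystem;

// Open the duplicated PSD in the plugin's temporary folder and
// read its raw bytes as an ArrayBuffer.
async function readTempPsd(fileName) {
   const tempFolder = await fs.getTemporaryFolder();
   const tempFile = await tempFolder.getEntry(fileName);
   return tempFile.read({ format: uxp.storage.formats.binary });
}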

When I upload to S3 via fetch (the upload step is sketched below), it works fine, but when I download the PSD I get a file of 997.2 kB, which leads to errors when I try to use the downloaded file with ImageMagick (identify command):

identify original.psd # identifies N layers
identify downloaded.psd # identifies N - M layers (this M is the number of vector or solidColor layers)
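
For reference, the fetch upload looks roughly like this (a sketch; the presigned uploadUrl and the headers are assumptions on my side):

// uploadUrl is assumed to be a presigned S3 PUT URL.
async function uploadToS3(uploadUrl, data) {
   const response = await fetch(uploadUrl, {
      method: 'PUT',
      headers: { 'Content-Type': 'application/octet-stream' },
      body: data // the ArrayBuffer returned by tempFile.read()
   });
   if (!response.ok) {
      throw new Error(`Upload failed with status ${response.status}`);
   }
}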

Can anyone help me with this? I don’t know why the .read() method returns an ArrayBuffer of 997180 bytes (~997 kB) instead of ~1600000 (~1.6 MB) :confused:

I just noticed that it’s the duplicate command that is making the file smaller.
I’m using the descriptor output for the duplicate action from the Alchemist plugin:

// Duplicate the first open document (descriptor recorded with Alchemist).
const result = await batchPlay(
   [
      {
         "_obj": "duplicate",
         "_target": [
            {
               "_ref": "document",
               "_enum": "ordinal",
               "_value": "first"
            }
         ],
         "name": documentName,
         "documentID": documentId,
         "_isCommand": true,
         "_options": {
            "dialogOptions": "dontDisplay" // suppress any dialogs during the duplicate
         }
      }
   ], {
      "synchronousExecution": false,
      "modalBehavior": "fail"
   });

Does anyone know why this duplicate command is making my file lighter than it should be?

Do you have multiple documents open in Photoshop?
Your descriptor duplicates the first document, not (necessarily) the currently active one.

Nope, just one (and I always pass the documentId of the Photoshop document I’m working with).

I noticed that this happens not only with the descriptor command, but also with Photoshop’s built-in Duplicate command. So this is not a bug in the UXP API, but a pattern in Photoshop itself :confused:

Just tested it; for me, the file size stays identical when duplicating a document.
Have you tried it with another document?

There are some things that influence the file size, for example whether there’s a thumbnail saved inside the PSD (the dialog that asks you to “Maximize Compatibility” when saving).
Maybe it’s also something related to smart objects or linked files… just an idea.

Yep, that’s it!
Just tested it here: the dialog asks whether or not I want maximum compatibility (when I check it, the file size is kept; otherwise the file size is reduced).
This dialog wasn’t being shown to me until today; maybe the question was being skipped and maximum compatibility was always false.

thx!

Just to help anyone who has the same problem, here is what worked for me:

await require('photoshop').app.activeDocument.save(tempFile, { maximizeCompatibility: true });
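
Note: in case Document.save doesn’t accept save options on your API version, the saveAs variant should (as far as I know) do the same thing:

const app = require('photoshop').app;
await app.activeDocument.saveAs.psd(tempFile, { maximizeCompatibility: true });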

Just wondering, why was it a problem for you in the first place? Isn’t a lighter file size something desirable, especially if you just store it temporarily in an S3 bucket?
Setting maximizeCompatibility to false could save you 40–70% of the network traffic for large documents, without even losing any data (you can set it back to true once you’ve downloaded the file at a later point).

It’s because when maximizeCompatibility is set to false, some information (like solidColor/vector layers) can’t be retrieved very well when I manipulate the file in my API backend.

The whole process after Photoshop uploads the PSD file runs under Linux, which is why I can’t set it back. :slight_smile:

@guircasimiro How did you upload the ArrayBuffer to S3 via fetch?
