Issue with Uploading Files from Remote Storage (Google Drive) to S3 via Transloadit

Hi everyone,

I am using Uppy along with the Transloadit plugin to handle file uploads. My goal is to upload files directly from Google Drive (or other remote storage providers) to an S3 bucket using Transloadit’s /s3/store robot. However, I am encountering issues during this process.

When I upload a file from Google Drive via Uppy:

  1. The file is fetched successfully by Companion.
  2. Transloadit initializes an assembly and returns the status ASSEMBLY_UPLOADING.
  3. However, the assembly does not progress, and the file is not uploaded to S3.

The log shows:

Retrying upload due to: s3: bucket key must be a string or a function resolving the bucket string

It works perfectly fine when I upload from my device or camera. I sign the upload in my backend:

  useEffect(() => {
    const uppyInstance = new Uppy({
      restrictions: {
        maxFileSize: 10485760,
        maxNumberOfFiles: 10,
        allowedFileTypes: ['image/*', 'video/*', 'application/*'],
      },
      debug: true,
      locale: Arabic,
      autoProceed: false,
    })
    
      .use(Webcam)
      .use(GoogleDrive, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(Dropbox, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(OneDrive, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(AwsS3, {
        async getUploadParameters(file) {
          // Fetch presigned URL from the backend
          const response = await axios.post(s3PreSignedUrls.uploadAttachments, {
            fileName: file.name,
            fileType: file.type, 
            booking_id,
          });
          const { url, fields } = response.data;

          if (!url || !fields.key) {
            throw new Error('Invalid presigned URL or missing S3 key in response');
          }
          return {
            method: 'POST',
            url,
            fields,
          };
        },
      })
      .use(Transloadit, {
        alwaysRunAssembly: false,
        assemblyOptions: {
          params: {
            template_id: 'template_id',
            auth: {
              key: 'auth_key',
            },
          },
        },
      })
      
    uppyInstance.on('file-added', (file) => {
      const fileType = changeFileType(file);
      const fileMetadata = {
        booking_id: booking_id || '',
        file_name: file.name,
        type: fileType,
        url: file.preview || '',
      };
      uppyInstance.setFileMeta(file.id, fileMetadata);
    });

    uppyInstance.on('upload-success', async (file, response) => {
        // Send the S3 file URL to the backend
        const s3Url = response.uploadURL;
        await axios.post(UPLOAD_ATTACHMENTS, {
          type: file.meta.type,
          url: s3Url,
          file_name: file.name,
          booking_id,
          user_id
        }, {
          headers: {
            Authorization: `Bearer ${id_token}`,
          }
        });
      });

    uppyInstance.on('error', (error) => {
      console.error('Upload error:', error);
    });
    setUppy(uppyInstance);

    // Cleanup on unmount
    return () => {
      uppyInstance.clear();
    };
  }, [id_token, booking_id]);

And here are my /s3/store template instructions:

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "exported": {
      "use": ":original",
      "robot": "/s3/store",
      "credentials": "AWS_Credentials"
    }
  }
}

Is there an issue with how Transloadit handles files from remote sources like Google Drive?
Do I need additional configuration for the /s3/store robot to work with remote files?

Any advice or recommendations for debugging this issue would be greatly appreciated.

Hi, first of all I would recommend integrating Uppy with React correctly: you should not create the instance inside useEffect. See the docs.
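For reference, the pattern the Uppy React docs recommend is to create the instance once with a `useState` initializer, so it persists across re-renders instead of being recreated by `useEffect`. A minimal sketch (the `COMPANION_URL` constant and plugin options are placeholders, not your real values):

```javascript
import { useState } from 'react';
import Uppy from '@uppy/core';
import GoogleDrive from '@uppy/google-drive';
import { Dashboard } from '@uppy/react';

// Placeholder — replace with your real Companion endpoint.
const COMPANION_URL = 'https://companion.example.com';

function createUppy() {
  return new Uppy({ autoProceed: false }).use(GoogleDrive, {
    companionUrl: COMPANION_URL,
  });
}

export function Uploader() {
  // The useState initializer runs only on the first render,
  // so the same Uppy instance survives re-renders.
  const [uppy] = useState(createUppy);
  return <Dashboard uppy={uppy} />;
}
```

Event handlers and metadata (`setFileMeta`, `upload-success`, etc.) can then be attached with the same instance, without tearing Uppy down on every dependency change.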

Regarding the error, are you sure your credentials are correct?

Looking closer at your code: you can’t use the Transloadit and AwsS3 plugins together. That’s very likely the problem.

Thank you! I’ll revisit the docs to ensure my implementation aligns with them.

Regarding the error: the AWS S3 credentials are correct. When I upload a local file, it is successfully stored in S3. However, when I try uploading a file from remote storage (e.g., Google Drive), the getUploadParameters function is never called.

I’m not sure if this is expected behavior when dealing with remote files, or if there’s something specific I need to configure for Uppy or Transloadit to handle such cases.

Based on what the Transloadit documentation outlines, the @uppy/transloadit plugin seems designed for this exact use case: uploading files from sources like Google Drive, processing them (if needed), and then storing them in a storage backend of my choice.

I’m unsure how to properly configure Transloadit to handle uploads from remote storage providers (e.g., Google Drive) and store them directly in S3.

How can I handle such a case? Thank you!

After removing the AWS S3 plugin code and relying solely on the /s3/store robot, files from remote storage are now being stored in S3. Thank you!
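For anyone landing here later, a hedged sketch of that working setup: remote files go through Companion into a Transloadit Assembly, and the Assembly’s /s3/store robot (configured in the template shown earlier) exports them to S3 — with no @uppy/aws-s3 plugin in the chain. The `COMPANION_URL`, `template_id`, and `auth_key` values are placeholders:

```javascript
import Uppy from '@uppy/core';
import GoogleDrive from '@uppy/google-drive';
import Transloadit from '@uppy/transloadit';

// Placeholders — substitute your own Companion endpoint and allowed hosts.
const COMPANION_URL = 'https://companion.example.com';
const COMPANION_ALLOWED_HOSTS = ['https://companion.example.com'];

const uppy = new Uppy({ autoProceed: false })
  .use(GoogleDrive, {
    companionUrl: COMPANION_URL,
    companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
  })
  // No AwsS3 plugin: Transloadit owns the upload, and the template's
  // /s3/store step handles the export to the bucket.
  .use(Transloadit, {
    assemblyOptions: {
      params: {
        auth: { key: 'auth_key' },      // your Transloadit auth key
        template_id: 'template_id',     // template containing the /s3/store step
      },
    },
  });

uppy.on('transloadit:complete', (assembly) => {
  // The S3 URLs of the exported files appear in the assembly results.
  console.log(assembly.results);
});
```

If your backend still needs the final S3 URL (as in the `upload-success` handler above), reading it from the assembly results in `transloadit:complete` is one way to get it.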
