Using Uppy + Transloadit as a Companion for Remote Storage (Google Drive) + AWS S3

I’m currently using Uppy alongside Transloadit to handle file uploads. My setup involves the following:

  1. Uppy for the frontend, configured with the Webcam, GoogleDrive, Dropbox, and OneDrive plugins as file sources.

  2. Companion (via Transloadit) to fetch files from remote storage providers like Google Drive.

  3. AWS S3 as the final destination for uploaded files.
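
For reference, the imports behind the Uppy instance shown further down are roughly the following (exact export shapes can vary a bit between Uppy versions, and the Arabic locale path plus my own constants/helpers are specific to my app):

import Uppy from '@uppy/core';
import Webcam from '@uppy/webcam';
import GoogleDrive from '@uppy/google-drive';
import DropBox from '@uppy/dropbox';
import OneDrive from '@uppy/onedrive';
import AwsS3 from '@uppy/aws-s3';
// COMPANION_URL / COMPANION_ALLOWED_HOSTS are the constants exported by
// @uppy/transloadit for the Transloadit-hosted Companion.
import Transloadit, { COMPANION_URL, COMPANION_ALLOWED_HOSTS } from '@uppy/transloadit';
import Arabic from '@uppy/locales/lib/ar_AR'; // Arabic locale (path may differ per version)
import axios from 'axios';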

I want to upload files directly to S3 without performing any additional processing on the files. Essentially, I want to skip any encoding or transformation steps and just store the raw files in S3.

When uploading files from Google Drive:

• The assembly starts but remains in the ASSEMBLY_UPLOADING state.

• I’m seeing a warning:

s3: bucket key must be a string or a function resolving the bucket string

Uploading local files works as expected, but files from remote storage do not get uploaded to S3.
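
As far as I can tell, the "bucket" that warning refers to is part of Companion's own s3 options; for a self-hosted Companion that section would look roughly like the sketch below (illustrative values only — I'm on the Transloadit-hosted Companion, so I don't set this myself):

// Sketch of the s3 section a self-hosted @uppy/companion instance expects.
// All values here are placeholders.
const companion = require('@uppy/companion');

const options = {
  secret: 'some-secret',
  filePath: '/tmp',
  server: { host: 'localhost:3020' },
  providerOptions: {
    drive: { key: process.env.GOOGLE_KEY, secret: process.env.GOOGLE_SECRET },
  },
  s3: {
    key: process.env.AWS_ACCESS_KEY_ID,
    secret: process.env.AWS_SECRET_ACCESS_KEY,
    region: 'eu-west-1',
    // This is the "bucket" the warning is about: a string,
    // or a function resolving to the bucket name.
    bucket: 'my-upload-bucket',
  },
};

// companion.app(options) then gets mounted on an Express app.

Here is my frontend setup: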

  useEffect(() => {
    const uppyInstance = new Uppy({
      restrictions: {
        maxFileSize: 10485760,
        maxNumberOfFiles: 10,
        allowedFileTypes: ['image/*', 'video/*', 'application/*'],
      },
      debug: true,
      locale: Arabic,
      autoProceed: false,
    })
    
      .use(Webcam)
      .use(GoogleDrive, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(DropBox, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(OneDrive, {
        companionUrl: COMPANION_URL,
        companionAllowedHosts: COMPANION_ALLOWED_HOSTS,
      })
      .use(AwsS3, {
        async getUploadParameters(file) {
          // Fetch presigned URL from the backend
          const response = await axios.post(s3PreSignedUrls.uploadAttachments, {
            fileName: file.name,
            fileType: file.type, 
            booking_id,
          });
          const { url, fields } = response.data;

          if (!url || !fields.key) {
            throw new Error('Invalid presigned URL or missing S3 key in response');
          }
          return {
            method: 'POST',
            url,
            fields,
          };
        },
      })
      .use(Transloadit, {
        alwaysRunAssembly: false,
        assemblyOptions: {
          params: {
            template_id: 'template_id',
            auth: {
              key: 'auth_key',
            },
          },
        },
      })
      
    uppyInstance.on('file-added', (file) => {

      const fileType = changeFileType(file);
      const fileMetadata = {
        booking_id: booking_id || '',
        file_name: file.name,
        type: fileType,
        url: file.preview || '',
      };
      uppyInstance.setFileMeta(file.id, fileMetadata);
    });

    uppyInstance.on('upload-success', async (file, response) => {
        // Send the S3 file URL to the backend
        const s3Url = response.uploadURL;
        await axios.post(UPLOAD_ATTACHMENTS, {
          type: file.meta.type,
          url: s3Url,
          file_name: file.name,
          booking_id,
          user_id
        }, {
          headers: {
            Authorization: `Bearer ${id_token}`,
          }
        });
      });

    uppyInstance.on('error', (error) => {
      console.error('Upload error:', error);
    });
    setUppy(uppyInstance);

    // Cleanup on unmount
    return () => {
      uppyInstance.clear();
    };
  }, [id_token, booking_id]);
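
For context, the backend endpoint behind s3PreSignedUrls.uploadAttachments just creates a presigned POST and returns { url, fields }. Simplified (route, bucket name, and key layout are placeholders), it looks something like this:

// Simplified sketch of the presigned-POST endpoint (Express + AWS SDK v3).
const express = require('express');
const { S3Client } = require('@aws-sdk/client-s3');
const { createPresignedPost } = require('@aws-sdk/s3-presigned-post');

const app = express();
app.use(express.json());
const s3 = new S3Client({ region: 'eu-west-1' });

app.post('/attachments/presign', async (req, res) => {
  const { fileName, fileType, booking_id } = req.body;

  // createPresignedPost resolves to { url, fields }; the frontend hands these
  // straight back to Uppy's AwsS3 getUploadParameters as a POST upload.
  const { url, fields } = await createPresignedPost(s3, {
    Bucket: 'my-upload-bucket',                 // placeholder
    Key: `bookings/${booking_id}/${fileName}`,  // placeholder key layout
    Conditions: [['content-length-range', 0, 10485760]],
    Fields: { 'Content-Type': fileType },
    Expires: 300, // seconds
  });

  res.json({ url, fields });
});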

And here are my /s3/store Template Instructions:

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "exported": {
      "use": ":original",
      "robot": "/s3/store",
      "credentials": "AWS_Credentials"
    }
  }
}
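
If it turns out I need more control over the object keys, my understanding from the /s3/store documentation is that the step also accepts a path built from Assembly Variables, along these lines (untested on my side):

{
  "steps": {
    ":original": {
      "robot": "/upload/handle"
    },
    "exported": {
      "use": ":original",
      "robot": "/s3/store",
      "credentials": "AWS_Credentials",
      "path": "${unique_prefix}/${file.url_name}"
    }
  }
}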

Can Transloadit + Uppy be used purely for uploading files to S3 without any additional processing?