Resume "broken" S3 multipart upload

I’m using Uppy to let end users upload large media files directly to S3 and R2 (Cloudflare’s S3-compatible competitor). This works magnificently: I have Uppy create, presign, and complete the uploads, and it’s great.
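For reference, my setup looks roughly like the sketch below: a minimal `@uppy/aws-s3-multipart` (Uppy 3.x) configuration where each option proxies one S3/R2 call through my backend. The `/s3-multipart/*` routes are placeholders for my own server endpoints; recent 3.x releases use `signPart`, older ones use `prepareUploadParts` instead.

```ts
import Uppy from '@uppy/core'
import AwsS3Multipart from '@uppy/aws-s3-multipart'

// Placeholder backend routes; each one proxies the matching S3/R2 call
// so credentials never reach the browser.
const api = (path: string, body: unknown) =>
  fetch(`/s3-multipart/${path}`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(body),
  }).then((res) => res.json())

const uppy = new Uppy().use(AwsS3Multipart, {
  // Server returns { uploadId, key } from CreateMultipartUpload.
  createMultipartUpload: (file) =>
    api('create', { filename: file.name, type: file.type }),
  // Server returns { url }: a presigned URL for this part.
  signPart: (file, { uploadId, key, partNumber }) =>
    api('sign', { uploadId, key, partNumber }),
  // Server returns the parts already uploaded: [{ PartNumber, Size, ETag }].
  listParts: (file, { uploadId, key }) => api('list-parts', { uploadId, key }),
  completeMultipartUpload: (file, { uploadId, key, parts }) =>
    api('complete', { uploadId, key, parts }),
  abortMultipartUpload: (file, { uploadId, key }) =>
    api('abort', { uploadId, key }),
})
```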

There’s one problem, though: with large uploads (I’ve consistently seen it fail on a 20 GB file), R2 will sometimes cough up a few 503s (that’s fine, Uppy handles those transparently). It seems that after throwing a couple of those, R2 starts returning actual 500s, at which point Uppy stops the upload. I understand why that happens, but I was wondering how to handle it.
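One mitigation I’m aware of (it doesn’t fix the hard failure, but it buys time during a 5xx burst) is widening the plugin’s `retryDelays` schedule, extending the `use()` call from the sketch above. The values here are just an example:

```ts
// retryDelays is a real @uppy/aws-s3-multipart option: the wait (in ms)
// before each retry of a failed part upload. A longer tail gives R2 more
// time to recover from a transient 5xx burst before Uppy gives up.
uppy.use(AwsS3Multipart, {
  // ...the create/sign/list/complete/abort options from above...
  retryDelays: [0, 1000, 3000, 5000, 10000, 30000],
})
```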

I’ve tried Golden Retriever, but that doesn’t seem to work for me (and I’m not entirely sure it works with S3 multipart uploads at all), and re-adding the file naturally starts the upload from scratch.

Would it be possible to somehow “tell” Uppy (using whatever I get back from createMultipartUpload) that this is an upload it could resume (as if the end user had paused it), or is that completely impossible due to browser restrictions? Would this be hard to achieve, and/or is it something you’d want to see a PR for?
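To make the question concrete, this is the kind of thing I’m imagining: a completely untested sketch that pokes at the plugin’s internal `s3Multipart` file state (an implementation detail that may change between versions), assuming I persisted the `uploadId` and `key` from createMultipartUpload and the user re-selects the same file:

```ts
// Untested idea: seed the plugin's per-file state with the persisted
// multipart handles so it calls listParts and skips already-uploaded
// parts instead of creating a fresh upload. `s3Multipart` is an internal
// state key, not public API; savedName/savedUploadId/savedKey and
// reselectedBlob are placeholders for whatever I stored and the user
// re-picked.
const fileID = uppy.addFile({ name: savedName, data: reselectedBlob })
uppy.setFileState(fileID, {
  s3Multipart: { uploadId: savedUploadId, key: savedKey },
})
await uppy.upload()
```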

If there are other ways to get around the block storage sometimes throwing 500s at you, I’m all ears!


I am also having the same problem (mine is caused by a network disconnect). Hope someone replies with a solution.

Hi! Unfortunately, resume only works with Tus uploads for now. There’s an issue about this on GitHub; you can follow the progress there: Resumable uploads with S3 Multipart · Issue #2121 · transloadit/uppy · GitHub
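For anyone landing here: with Tus, resume works out of the box because the protocol tracks the upload offset server-side. A minimal sketch (the tusd demo endpoint below is just for illustration; point it at your own tus server):

```ts
import Uppy from '@uppy/core'
import Tus from '@uppy/tus'

// The tus protocol stores the upload offset on the server, so interrupted
// uploads continue from where they left off; pause/resume and recovery
// after a network drop work without extra wiring.
const uppy = new Uppy().use(Tus, {
  endpoint: 'https://tusd.tusdemo.net/files/', // demo server; use your own
  retryDelays: [0, 1000, 3000, 5000],
})
```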