Detecting the upload as a whole


I’m currently developing an upload HTTP API using Uppy, and I’m at the stage where I can select some files on the front end and store them in the backend, along with some custom metadata for each file.

I need to do some bookkeeping whereby I create a single item in a database for each upload session (i.e. each click of the upload button), regardless of how many files have been selected. For example, create a single entry in an “uploads” table with the total size of the files the user has uploaded in that session, as well as a session title and description that are sent with every file.
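For concreteness, the per-session record could look something like the one this sketch builds. All field names here are illustrative assumptions, not anything defined by Uppy or tus:

```javascript
// Build one record per upload session. The field names (sessionId, title,
// totalSize, ...) are hypothetical; nothing here comes from the Uppy/tus APIs.
function buildUploadSession(sessionId, title, description, files) {
  return {
    sessionId,
    title,
    description,
    fileCount: files.length,
    // total size of every file the user selected in this session
    totalSize: files.reduce((sum, f) => sum + f.size, 0),
  };
}
```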

I’m using the tus server and following the docs, I’ve played around with:

  • onUploadCreate
  • onUploadFinish

But both of these are called once per file. Is there any way I can get a summary of what’s been uploaded, via a callback that fires only once, after all the uploads have finished?

If not, I’m thinking I may have to split this up into separate calls, one for the metadata and one for the uploads, but I’m happy to hear any other suggestions.


Hi, the idea of a “summary of what’s been uploaded once all the uploads have finished” is not something that can be determined deterministically as of now. Uppy starts a separate tus upload per file, so the server doesn’t know how many files it’s supposed to receive from a single upload session of a single user.

I’m afraid that’s something you’re going to have to keep track of yourself. Probably through some sort of session ID you send along as metadata from the client.

Thanks for getting back. No worries, I thought this might be the case.


If you find an elegant solution maybe we can add it as an example to @tus/server readme.


Will do, and if you know of any examples doing this (non-elegant is fine!), please shout.

So working on this has produced an interesting problem. I’m using the Node.js server.

I settled on a process where, when an upload comes in, I check its sessionId against MongoDB to see whether it exists. If it does, I fetch some more IDs from it to process elsewhere; if it doesn’t, I create it. I’ve included the complete function below.

While I can use await in the function to access MongoDB, I can’t stop onUploadCreate from firing again before I’ve had the chance to finish processing the first file, presumably because it’s async (I’m pretty new to JS!).

This means that I can’t capture any kind of state when uploading more than one file.

So I guess the question is: is there any way I can process one upload at a time? Or, at least, any other architectural way I can achieve this? I’m basically trying to add one entry in a DB with a list of all the files the user uploads when they click upload. I can’t be the first person to want to do this?

Any help or suggestions would be greatly appreciated.


async onUploadCreate(req, res, upload) {
    /* Check the metadata */
    const {ok, expected} = await common.validateMetadata(upload);
    if (!ok) {
        const body = `Expected "${expected}" in "Upload-Metadata" but received "${upload.metadata}"`;
        throw {status_code: 500, body};
    } else console.log(`Metadata is ok!`);

    /* Check the user is logged in */
    const userAuthorised = await catalogue.getSession(upload.metadata.userSession);
    if (!userAuthorised) {
        const body = `User not authorised`;
        console.log(`User with session id: "${upload.metadata.userSession}" not authed`);
        throw {status_code: 401, body};
    } else console.log(`User with session id: "${upload.metadata.userSession}" is logged in`);

    /* Create the items in the catalogue and mongo db for tracking the upload session */
    // check the mongo db for an existing uploadId
    const result = await mongo.getUploadIdInDB(upload.metadata.uploadId);
    if (result == null) {
        console.log(`Upload ID: "${upload.metadata.uploadId}" not found in mongodb`);
        // create a dataset in the catalogue
        const dataset = await catalogue.createDataset(upload.metadata);
        // then create the datafile in the catalogue
        const datafile = await catalogue.createDatafile(dataset, upload.size, upload.metadata);
        // then add the ids in mongo
        await mongo.createUpload(upload.metadata, dataset, datafile);
    } else {
        console.log(`Upload ID: "${upload.metadata.uploadId}" found in mongodb!`);
        // create a new datafile to go with this dataset
        const datafile = await catalogue.createDatafile(result.datasetId[0], upload.size, upload.metadata);
        // then add the id of the new datafile to the upload entry in mongo
        await mongo.appendToUpload(result.datasetId, datafile);
    }
    console.log(`Done processing upload file`);
    return res;
}
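One generic way to stop concurrent onUploadCreate calls from racing past the existence check is to chain the handlers per session, so only one runs at a time for a given sessionId. A minimal sketch (not part of @tus/server; alternatively, an atomic Mongo upsert such as findOneAndUpdate with upsert: true and $setOnInsert avoids the race at the database instead):

```javascript
// Serialize async work per session: each sessionId gets its own promise
// chain, so the second file's handler only starts once the first has
// settled. Generic sketch, not a @tus/server feature.
const chains = new Map();

function withSessionLock(sessionId, task) {
  const prev = chains.get(sessionId) || Promise.resolve();
  // Run `task` only after the previous task for this session finishes.
  const next = prev.then(() => task());
  // Keep the stored chain alive even if this task rejects.
  chains.set(sessionId, next.catch(() => {}));
  return next;
}

// Inside onUploadCreate, roughly:
//   await withSessionLock(upload.metadata.uploadId, () => processFile(upload));
```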

OK, in a slightly less panicked state, I’ve got around this by dissolving the app’s responsibility to store any state.


Aw man, it sounds like the solution is to give up.

I’m currently trying to upload files to Google Drive once they’re uploaded to my server, but only once per file. When I upload 3 files using tusServer.on(EVENTS.POST_FINISH, doSomething), it fires doSomething 9 times (3 times for each of the 3 files I’m trying to upload) and I get 9 new files in the Google Drive folder. :man_facepalming:

POST_FINISH should only be called once per file. If you can reproduce this please file an issue.
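In the meantime, a defensive guard that remembers which upload IDs were already handled keeps the side effect from running more than once, whatever the cause of the duplicate events. A generic sketch, where `doSomething` stands in for the Drive upload:

```javascript
// Remember which upload IDs have already been processed so a duplicate
// POST_FINISH event cannot re-run the side effect. Generic sketch only.
const handled = new Set();

function handleOnce(uploadId, doSomething) {
  if (handled.has(uploadId)) return false; // already done, skip
  handled.add(uploadId);
  doSomething(uploadId);
  return true;
}
```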