@fastify/multipart vs busboy vs formidable vs multer
Node.js File Upload Libraries

File upload libraries in Node.js facilitate the handling of multipart/form-data, which is commonly used for uploading files. These libraries provide various functionalities such as parsing incoming request data, managing file streams, and handling file storage. They are essential for building applications that require file uploads, ensuring that developers can efficiently manage file data while adhering to best practices for security and performance.

Stat Detail

| Package | Stars | Size | Issues | Last Publish | License |
| --- | --- | --- | --- | --- | --- |
| @fastify/multipart | 540 | 173 kB | 22 | 2 months ago | MIT |
| busboy | 2,997 | 124 kB | 37 | – | – |
| formidable | – | 204 kB | – | a year ago | MIT |
| multer | 12,025 | 31.5 kB | 247 | 23 days ago | MIT |

Feature Comparison: @fastify/multipart vs busboy vs formidable vs multer

Performance

  • @fastify/multipart:

    @fastify/multipart is optimized for performance, leveraging Fastify's asynchronous architecture to handle file uploads efficiently. It minimizes overhead and maximizes throughput, making it suitable for applications with high file upload demands.

  • busboy:

    Busboy is lightweight and designed for streaming, which allows it to handle file uploads with minimal memory usage. Its performance is excellent for parsing large files as it processes data in chunks rather than loading everything into memory at once.

  • formidable:

    Formidable provides a good balance between performance and ease of use. While it may not be as fast as busboy or @fastify/multipart, it offers a comprehensive feature set that can be advantageous for more complex file handling scenarios.

  • multer:

    Multer is built on top of busboy and provides a middleware layer for Express.js applications. While it may introduce some overhead compared to busboy directly, its ease of integration with Express makes it a popular choice for many developers.

Ease of Use

  • @fastify/multipart:

    @fastify/multipart is designed to be straightforward for Fastify users, providing a simple API that integrates well with Fastify's request lifecycle. Its documentation is clear, making it easy to implement file uploads in Fastify applications.

  • busboy:

    Busboy requires more manual setup and handling compared to higher-level libraries. While it offers flexibility, developers may need to write more boilerplate code to manage file uploads effectively.

  • formidable:

    Formidable is user-friendly and provides a high-level API that abstracts much of the complexity involved in file uploads. Its built-in features for file renaming and storage make it easy to use for developers of all skill levels.

  • multer:

    Multer is very easy to set up and use within Express applications. Its middleware approach allows developers to quickly configure file upload handling with minimal code, making it a go-to choice for many Express.js developers.

File Handling Features

  • @fastify/multipart:

    @fastify/multipart supports various file handling features, including file size limits, file type validation, and streaming uploads. It allows developers to customize the handling of uploaded files to fit their needs.

  • busboy:

    Busboy provides low-level access to file streams, allowing developers to implement custom file handling logic. However, it does not provide built-in features for file validation or storage, requiring developers to implement these functionalities themselves.

  • formidable:

    Formidable offers a rich set of features, including file renaming, automatic file storage, and customizable upload paths. It also supports handling of both files and fields in a single request, making it versatile for complex forms.

  • multer:

    Multer provides built-in support for file size limits and file type filtering, allowing developers to easily enforce upload constraints. It also supports multiple file uploads and can be configured to store files in memory or on disk.

Integration

  • @fastify/multipart:

    @fastify/multipart is specifically designed for Fastify, ensuring seamless integration with its ecosystem. It takes advantage of Fastify's lifecycle hooks and asynchronous capabilities for optimal performance.

  • busboy:

    Busboy is a standalone library and can be integrated into any Node.js application. However, it requires more manual setup to work with frameworks like Express or Koa, as it does not provide built-in middleware.

  • formidable:

    Formidable can be used in any Node.js application, but it is particularly well-suited for Express due to its straightforward API. It requires minimal configuration to get started and works well with existing Express middleware.

  • multer:

    Multer is specifically designed for use with Express.js, making it easy to integrate as middleware. It fits naturally into the Express request handling flow, allowing for quick setup and configuration.

Community and Support

  • @fastify/multipart:

    @fastify/multipart benefits from the growing Fastify community, which provides active support and a wealth of resources for developers. Its documentation is comprehensive and regularly updated.

  • busboy:

    Busboy has a smaller community compared to some other libraries, but it is well-maintained and has been widely adopted in the Node.js ecosystem. Documentation is available, but may not be as extensive as other libraries.

  • formidable:

    Formidable has been around for a long time and has a strong user base. It offers good documentation and community support, making it a reliable choice for developers seeking assistance.

  • multer:

    Multer enjoys a large community due to its popularity within the Express ecosystem. It has extensive documentation and numerous tutorials available, making it easy for developers to find help and resources.

How to Choose: @fastify/multipart vs busboy vs formidable vs multer

  • @fastify/multipart:

    Choose @fastify/multipart if you are using the Fastify framework and need a high-performance solution that integrates seamlessly with Fastify's ecosystem. It is designed for speed and efficiency, making it ideal for applications that require handling large file uploads.

  • busboy:

    Choose busboy if you are looking for a lightweight and low-level streaming parser for multipart form data. It is suitable for applications where you want fine-grained control over file uploads and processing, and you don't need additional features that come with higher-level libraries.

  • formidable:

    Choose formidable if you need a robust solution that supports file uploads and form parsing with a focus on ease of use. It provides a comprehensive set of features, including file renaming and storage options, making it a good choice for applications that require more than just basic file handling.

  • multer:

    Choose multer if you are using Express.js and need a middleware for handling multipart/form-data. It is easy to set up and provides a simple API for configuring file storage options, making it ideal for applications that require quick and straightforward file upload functionality.

README for @fastify/multipart

@fastify/multipart

Fastify plugin to parse the multipart content-type. Supports:

  • Async / Await
  • Async iterator support to handle multiple parts
  • Stream & Disk mode
  • Accumulating the entire file in memory
  • Mode to attach all fields to the request body
  • Tested across Linux/Mac/Windows

Under the hood, it uses @fastify/busboy.

Install

npm i @fastify/multipart

Usage

const fastify = require('fastify')()
const fs = require('node:fs')
const { pipeline } = require('node:stream/promises')

fastify.register(require('@fastify/multipart'))

fastify.post('/', async function (req, reply) {
  // process a single file
  // also, if you allow uploading multiple files,
  // you must consume all of them, otherwise the promise will never be fulfilled
  const data = await req.file()

  data.file // stream
  data.fields // other parsed parts
  data.fieldname
  data.filename
  data.encoding
  data.mimetype

  // to accumulate the file in memory! Be careful!
  //
  // await data.toBuffer() // Buffer
  //
  // or

  await pipeline(data.file, fs.createWriteStream(data.filename))

  // be careful of permission issues on disk and not overwrite
  // sensitive files that could cause security risks

  // also, note that if the file stream is not consumed, the promise will never be fulfilled

  reply.send()
})

fastify.listen({ port: 3000 }, err => {
  if (err) throw err
  console.log(`server listening on ${fastify.server.address().port}`)
})

Note about data.fields: busboy consumes the multipart stream in serial order, so the order of the form fields determines when @fastify/multipart can expose them to you. We recommend placing all value fields before any file fields; this ensures your fields are accessible before any file starts being consumed. If you cannot control the field order, be sure to read data.fields AFTER consuming the stream, otherwise it will only contain the fields parsed up to that moment.

You can also pass optional arguments to @fastify/busboy when registering with Fastify. This is useful for setting limits on the content that can be uploaded. A full list of available options can be found in the @fastify/busboy documentation.

fastify.register(require('@fastify/multipart'), {
  limits: {
    fieldNameSize: 100, // Max field name size in bytes
    fieldSize: 100,     // Max field value size in bytes
    fields: 10,         // Max number of non-file fields
    fileSize: 1000000,  // For multipart forms, the max file size in bytes
    files: 1,           // Max number of file fields
    headerPairs: 2000,  // Max number of header key=>value pairs
    parts: 1000         // For multipart forms, the max number of parts (fields + files)
  }
});

For security reasons, @fastify/multipart sets the default limits for parts and fileSize to 1000 and 1048576 (1 MiB) respectively.

Note: if the file stream that is provided by data.file is not consumed, like in the example below with the usage of pipeline, the promise will not be fulfilled at the end of the multipart processing. This behavior is inherited from @fastify/busboy.

Note: if you set a fileSize limit and you want to know if the file limit was reached you can:

  • listen to data.file.on('limit')
  • or check at the end of the stream the property data.file.truncated
  • or call data.file.toBuffer() and wait for the error to be thrown

const data = await req.file()
await pipeline(data.file, fs.createWriteStream(data.filename))
if (data.file.truncated) {
  // you may need to delete the part of the file that has been saved on disk
  // before the `limits.fileSize` has been reached
  reply.send(new fastify.multipartErrors.FilesLimitError());
}

// OR
const data = await req.file()
try {
  const buffer = await data.toBuffer()
} catch (err) {
  // fileSize limit reached!
}

Additionally, you can pass per-request options to the req.file, req.files, req.saveRequestFiles or req.parts functions.

fastify.post('/', async function (req, reply) {
  const options = { limits: { fileSize: 1000 } };
  const data = await req.file(options)
  await pipeline(data.file, fs.createWriteStream(data.filename))
  reply.send()
})

Or pass them as route options when attachFieldsToBody is used.

fastify.post('/', {
  config: {
    multipartOptions: {
      limits: { fileSize: 1000 }
    }
  }
}, async function (req, reply) {
  const buffer = await req.body.file.toBuffer()
  reply.send()
})

Handle multiple file streams

fastify.post('/', async function (req, reply) {
  const parts = req.files()
  for await (const part of parts) {
    await pipeline(part.file, fs.createWriteStream(part.filename))
  }
  reply.send()
})

Handle multiple file streams and fields

fastify.post('/upload/raw/any', async function (req, reply) {
  const parts = req.parts()
  for await (const part of parts) {
    if (part.type === 'file') {
      await pipeline(part.file, fs.createWriteStream(part.filename))
    } else {
      // part.type === 'field'
      console.log(part)
    }
  }
  reply.send()
})

Accumulating the entire file in memory

fastify.post('/upload/raw/any', async function (req, reply) {
  const data = await req.file()
  const buffer = await data.toBuffer()
  // upload to S3
  reply.send()
})

Upload files to disk and work with temporary file paths

This will store all files in the operating system's default directory for temporary files. As soon as the response ends, all files are removed.

fastify.post('/upload/files', async function (req, reply) {
  // stores files in the tmp dir and returns them
  const files = await req.saveRequestFiles()
  files[0].type // "file"
  files[0].filepath
  files[0].fieldname
  files[0].filename
  files[0].encoding
  files[0].mimetype
  files[0].fields // other parsed parts

  reply.send()
})

Handle file size limitation

If you set a fileSize limit, a RequestFileTooLargeError is thrown when the limit is reached.

fastify.post('/upload/files', async function (req, reply) {
  try {
    const file = await req.file({ limits: { fileSize: 17000 } })
    //const files = req.files({ limits: { fileSize: 17000 } })
    //const parts = req.parts({ limits: { fileSize: 17000 } })
    //const files = await req.saveRequestFiles({ limits: { fileSize: 17000 } })
    reply.send()
  } catch (error) {
    // error instanceof fastify.multipartErrors.RequestFileTooLargeError
  }
})

If you want to fall back to the pre-4.0.0 handling, you can disable the throwing behavior by passing throwFileSizeLimit. Note: this does not affect the behavior of saveRequestFiles().

// globally disable
fastify.register(fastifyMultipart, { throwFileSizeLimit: false })

fastify.post('/upload/file', async function (req, reply) {
  const file = await req.file({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const files = req.files({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const parts = req.parts({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  //const files = await req.saveRequestFiles({ throwFileSizeLimit: false, limits: { fileSize: 17000 } })
  reply.send()
})

Parse all fields and assign them to the body

This allows you to parse all fields automatically and assign them to request.body. By default, files are accumulated in memory (be careful!) as Buffer objects. Uncaught errors are handled by Fastify.

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: true })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = await req.body.upload.toBuffer() // access files
  const fooValue = req.body.foo.value                  // other fields
  const body = Object.fromEntries(
    Object.keys(req.body).map((key) => [key, req.body[key].value])
  ) // Request body in key-value pairs, like req.body in Express (Node 20+)

  // On Node 20+
  const formData = await req.formData()
  console.log(formData)
})

Request body key-value pairs can be assigned directly using attachFieldsToBody: 'keyValues'. Field values, including file buffers, will be attached to the body object.

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues' })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = req.body.upload // access file as buffer
  const fooValue = req.body.foo       // other fields
})

You can also define an onFile handler to avoid accumulating all files in memory.

async function onFile(part) {
  // you have access to original request via `this`
  console.log(this.id)
  await pipeline(part.file, fs.createWriteStream(part.filename))
}

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: true, onFile })

fastify.post('/upload/files', async function (req, reply) {
  const fooValue = req.body.foo.value // other fields
})

The onFile handler can also be used with attachFieldsToBody: 'keyValues' in order to specify how file buffer values are decoded.

async function onFile(part) {
  const buff = await part.toBuffer()
  const decoded = Buffer.from(buff.toString(), 'base64').toString()
  part.value = decoded // set `part.value` to specify the request body value
}

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues', onFile })

fastify.post('/upload/files', async function (req, reply) {
  const uploadValue = req.body.upload // access file as base64 string
  const fooValue = req.body.foo       // other fields
})

Note: if you assign all fields to the body and don't define an onFile handler, you won't be able to read the files through streams: they have already been read and their contents accumulated in memory. You can only use the toBuffer method to read the content. If you try to read from a stream and pipe it to a new file, you will get an empty file.

JSON Schema body validation

When the attachFieldsToBody parameter is set to 'keyValues', JSON Schema validation on the body will behave similarly to application/json and application/x-www-form-urlencoded content types. Additionally, uploaded files will be attached to the body as Buffer objects.

fastify.register(require('@fastify/multipart'), { attachFieldsToBody: 'keyValues' })

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['myFile'],
      properties: {
        // file that gets decoded to string
        myFile: {
          type: 'object',
        },
        hello: {
          type: 'string',
          enum: ['world']
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})

If you enable attachFieldsToBody: true and set sharedSchemaId, a shared JSON Schema is added that can be used to validate the parsed multipart fields.

const opts = {
  attachFieldsToBody: true,
  sharedSchemaId: '#mySharedSchema'
}
fastify.register(require('@fastify/multipart'), opts)

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['myField'],
      properties: {
        // field that uses the shared schema
        myField: { $ref: '#mySharedSchema'},
        // or another field that uses the shared schema
        myFiles: { type: 'array', items: fastify.getSchema('mySharedSchema') },
        // or a field that doesn't use the shared schema
        hello: {
          properties: {
            value: {
              type: 'string',
              enum: ['male']
            }
          }
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})

If provided, the sharedSchemaId parameter must be a string ID; a shared schema will then be added to your Fastify instance, so you can apply the validation to your service (as in the example above).

The shared schema that is added will look like this:

{
  type: 'object',
  properties: {
    encoding: { type: 'string' },
    filename: { type: 'string' },
    limit: { type: 'boolean' },
    mimetype: { type: 'string' }
  }
}

JSON Schema with Swagger

If you want to use @fastify/multipart with @fastify/swagger and @fastify/swagger-ui, you must add a new type called isFile and use a custom instance of a validator compiler (see the Fastify validation docs).


const fastify = require('fastify')({
 // ...
  ajv: {
    // Adds the file plugin to help @fastify/swagger schema generation
    plugins: [require('@fastify/multipart').ajvFilePlugin]
  }
})

fastify.register(require("@fastify/multipart"), {
  attachFieldsToBody: true,
});

fastify.post(
  "/upload/files",
  {
    schema: {
      consumes: ["multipart/form-data"],
      body: {
        type: "object",
        required: ["myField"],
        properties: {
          myField: { isFile: true },
        },
      },
    },
  },
  function (req, reply) {
    console.log({ body: req.body });
    reply.send("done");
  }
);

JSON Schema non-file field

When sending fields with the body (attachFieldsToBody set to true), the field might look like this in the request.body:

{
  "hello": "world"
}

The mentioned field will be converted, by this plugin, to a more complex field. The converted field will look something like this:

{
  hello: {
    fieldname: "hello",
    value: "world",
    fieldnameTruncated: false,
    valueTruncated: false,
    fields: body
  }
}

It is important to know that this conversion happens BEFORE the field is validated, so keep that in mind when writing the JSON schema for fields that don't use the shared schema. The validation schema for the field above should look like this:

hello: {
  properties: {
    value: {
      type: 'string'
    }
  }
}

JSON non-file fields

If a non-file field has a Content-Type header starting with application/json, it will be parsed using JSON.parse.

The schema to validate JSON fields should look like this:

hello: {
  properties: {
    value: {
      type: 'object',
      properties: {
        /* ... */
      }
    }
  }
}
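On the client side, a field is marked as JSON by giving it an application/json content type. A minimal sketch using Node 18+'s built-in FormData and Blob (the field name and URL are illustrative):

```javascript
// Build a multipart body where the `hello` field carries JSON.
const form = new FormData()
form.append(
  'hello',
  new Blob([JSON.stringify({ child: 'world' })], { type: 'application/json' })
)

// Sending it to the route would then look like:
// await fetch('http://localhost:3000/upload/files', { method: 'POST', body: form })
```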

If you also use the shared JSON schema as shown above, this is a full example that validates the entire field:

const opts = {
  attachFieldsToBody: true,
  sharedSchemaId: '#mySharedSchema'
}
fastify.register(require('@fastify/multipart'), opts)

fastify.post('/upload/files', {
  schema: {
    consumes: ['multipart/form-data'],
    body: {
      type: 'object',
      required: ['field'],
      properties: {
        field: {
          allOf: [
            { $ref: '#mySharedSchema' },
            {
              properties: {
                value: {
                  type: 'object',
                  properties: {
                    child: {
                      type: 'string'
                    }
                  }
                }
              }
            }
          ]
        }
      }
    }
  }
}, function (req, reply) {
  console.log({ body: req.body })
  reply.send('done')
})

Zod Schema body validation

To validate requests using Zod, you need to:

  1. Install and configure fastify-type-provider-zod.
  2. Make sure the attachFieldsToBody option is set to true when registering the @fastify/multipart plugin.
  3. Optionally, use attachFieldsToBody: "keyValues" to skip the extra field preprocessing; in that case you will receive a Buffer for files that are not text/plain.

After setup, you can validate your request body using a Zod schema as usual.

See a full example in examples/example-with-zod.ts.

Access all errors

We export all custom errors via the server decorator fastify.multipartErrors. This is useful if you want to react to specific errors. They derive from @fastify/error and include the correct statusCode property.

fastify.post('/upload/files', async function (req, reply) {
  const { FilesLimitError } = fastify.multipartErrors
})

Acknowledgments

This project is kindly sponsored by:

License

Licensed under MIT.