archiver vs decompress-tar vs tar vs tar-fs vs tar-stream
Node.js Archive and Compression Libraries

These libraries handle file archiving and compression in Node.js applications, providing functionality for creating and extracting archives in formats such as .zip and .tar. Each library has its own strengths and use cases, from dynamic archive generation to stream-based extraction, so they cater to different needs in file management and data transfer.

[Charts omitted: npm package weekly downloads trend (3 years) and GitHub stars ranking]

Stat Detail

Package         Downloads   Stars   Size      Issues   Publish        License
archiver        –           2,945   43.1 kB   155      2 years ago    MIT
decompress-tar  –           16      –         10       9 years ago    MIT
tar             –           904     2.25 MB   10       9 days ago     BlueOak-1.0.0
tar-fs          –           378     17.5 kB   1        5 months ago   MIT
tar-stream      –           436     32 kB     17       2 years ago    MIT

Feature Comparison: archiver vs decompress-tar vs tar vs tar-fs vs tar-stream

Archive Format Support

  • archiver:

    Archiver supports multiple archive formats, including zip and tar, making it a versatile choice for various applications that require different compression types.

  • decompress-tar:

    Decompress-tar is focused solely on the tar format, providing a simple and effective way to extract tar archives without additional complexity.

  • tar:

Tar (the node-tar package) is specifically designed for handling tar files, offering a straightforward API for creating and extracting these archives, with optional gzip compression built in.

  • tar-fs:

    Tar-fs is tailored for tar files, providing a streaming interface that allows for efficient handling of large tar archives without loading them entirely into memory.

  • tar-stream:

    Tar-stream supports creating and extracting tar files in a streaming manner, allowing for flexible processing of tar data.

Streaming Capabilities

  • archiver:

    Archiver provides robust streaming capabilities, allowing you to create archives on-the-fly and pipe the output directly to writable streams, which is beneficial for performance and memory management.

  • decompress-tar:

    Decompress-tar does not support streaming; it focuses on straightforward extraction of tar files, which may not be suitable for large files where streaming is beneficial.

  • tar:

Tar supports streaming through its create and extract APIs, which return streams when no file option is given, though its high-level interface trades some fine-grained stream control for convenience.

  • tar-fs:

    Tar-fs excels in streaming, allowing you to create and extract tar files efficiently, making it ideal for large datasets and real-time processing.

  • tar-stream:

    Tar-stream is designed for streaming, enabling you to create and extract tar files in a way that minimizes memory usage and maximizes performance.

Ease of Use

  • archiver:

    Archiver is user-friendly with a rich API that simplifies the process of creating archives, making it easy for developers to implement without extensive boilerplate code.

  • decompress-tar:

    Decompress-tar is straightforward and easy to use, providing a simple interface for extracting tar files without unnecessary complexity.

  • tar:

    Tar has a minimalistic API, making it easy to use for basic tar operations, but it may require more manual handling for advanced features.

  • tar-fs:

    Tar-fs is designed with usability in mind, providing a clear API for both creating and extracting tar files, making it accessible for developers of all skill levels.

  • tar-stream:

    Tar-stream offers a flexible API that may require a bit more understanding of streams, but it provides powerful capabilities for those familiar with Node.js streams.

Performance

  • archiver:

    Archiver is optimized for performance, especially when dealing with large files or multiple files, as it streams data efficiently without excessive memory consumption.

  • decompress-tar:

    Decompress-tar performs well for extracting tar files but may not be as efficient as streaming libraries when handling very large archives.

  • tar:

Tar performs well for basic operations and is convenient to use, though its high-level helpers are less tuned for high-throughput streaming scenarios than the purely stream-oriented options.

  • tar-fs:

    Tar-fs is highly performant, especially for large files, as it streams data directly to and from the filesystem, minimizing memory usage and improving speed.

  • tar-stream:

    Tar-stream is designed for performance, allowing for efficient processing of tar files in a streaming manner, which is ideal for large datasets.

Use Cases

  • archiver:

    Archiver is suitable for applications that require dynamic archive creation, such as web applications that need to generate downloadable zip files on-the-fly.

  • decompress-tar:

    Decompress-tar is best for applications that need to extract tar files, such as deployment scripts or backup restoration tools.

  • tar:

    Tar is ideal for basic file archiving tasks where minimal overhead is desired, such as command-line utilities or simple file management scripts.

  • tar-fs:

    Tar-fs is perfect for applications that need to handle large tar files efficiently, such as data processing pipelines or backup systems.

  • tar-stream:

    Tar-stream is well-suited for applications that require real-time processing of tar data, such as streaming file uploads or network data transfers.

How to Choose: archiver vs decompress-tar vs tar vs tar-fs vs tar-stream

  • archiver:

    Choose Archiver if you need a versatile library that supports multiple archive formats (like zip and tar) and offers streaming capabilities. It's ideal for creating archives dynamically, especially when dealing with large files or when you want to pipe the output directly to a response in a web application.

  • decompress-tar:

Select Decompress-tar when you specifically need to extract .tar files; it is commonly used as a plugin for the decompress package. The library is straightforward and efficient for decompressing tar archives, making it a good choice for applications that primarily work with this format without additional overhead.

  • tar:

    Opt for Tar if you want a low-level library that provides a simple API for creating and extracting tar files. It is lightweight and suitable for developers who need granular control over the tar process without additional features that may not be necessary for their use case.

  • tar-fs:

    Use Tar-fs if you require a streaming interface for both creating and extracting tar files. This library is particularly useful for working with streams, allowing you to handle large files efficiently without loading them entirely into memory, making it suitable for performance-sensitive applications.

  • tar-stream:

    Choose Tar-stream if you need a flexible and modular approach to working with tar archives. It allows for both creating and extracting tar files in a streaming manner, making it ideal for applications that need to process data on-the-fly without intermediate storage.

README for archiver

Archiver

A streaming interface for archive generation

Visit the API documentation for a list of all methods available.

Install

npm install archiver --save

Quick Start

// require modules
const fs = require('fs');
const archiver = require('archiver');

// create a file to stream archive data to.
const output = fs.createWriteStream(__dirname + '/example.zip');
const archive = archiver('zip', {
  zlib: { level: 9 } // Sets the compression level.
});

// listen for all archive data to be written
// 'close' event is fired only when a file descriptor is involved
output.on('close', function() {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});

// This event is fired when the data source is drained, no matter what the data source was.
// It is not part of this library but rather comes from the Node.js Stream API.
// @see: https://nodejs.org/api/stream.html#stream_event_end
output.on('end', function() {
  console.log('Data has been drained');
});

// good practice to catch warnings (i.e. stat failures and other non-blocking errors)
archive.on('warning', function(err) {
  if (err.code === 'ENOENT') {
    // log warning
  } else {
    // throw error
    throw err;
  }
});

// good practice to catch this error explicitly
archive.on('error', function(err) {
  throw err;
});

// pipe archive data to the file
archive.pipe(output);

// append a file from stream
const file1 = __dirname + '/file1.txt';
archive.append(fs.createReadStream(file1), { name: 'file1.txt' });

// append a file from string
archive.append('string cheese!', { name: 'file2.txt' });

// append a file from buffer
const buffer3 = Buffer.from('buff it!');
archive.append(buffer3, { name: 'file3.txt' });

// append a file
archive.file('file1.txt', { name: 'file4.txt' });

// append files from a sub-directory and naming it `new-subdir` within the archive
archive.directory('subdir/', 'new-subdir');

// append files from a sub-directory, putting its contents at the root of archive
archive.directory('subdir/', false);

// append files from a glob pattern
archive.glob('file*.txt', {cwd:__dirname});

// finalize the archive (i.e. we are done appending files, but the streams still have to finish)
// 'close', 'end' or 'finish' may be fired right after calling this method, so listen for them beforehand
archive.finalize();

Formats

Archiver ships with out-of-the-box support for TAR and ZIP archives.

You can register additional formats with registerFormat.

You can check whether a format is already registered before adding a new one with isRegisteredFormat.