archiver vs zip-stream vs jszip vs adm-zip vs yazl vs zip-lib
JavaScript ZIP Libraries for File Compression and Archiving

adm-zip, archiver, jszip, yazl, zip-lib, and zip-stream are JavaScript libraries for creating, reading, and manipulating ZIP archives. They differ significantly in their architecture: some are designed for in-browser use, others for Node.js; some support streaming for memory efficiency, while others load entire archives into memory; and not all support both creation and extraction. These differences make each library better suited for specific scenarios, such as serving dynamic ZIPs from a server, processing large files without memory spikes, or enabling ZIP manipulation directly in a web application.

Stat Detail

| Package | Weekly Downloads | GitHub Stars | Size | Open Issues | Last Publish | License |
|---|---|---|---|---|---|---|
| archiver | 15,140,813 | 2,932 | 43.1 kB | 155 | 2 years ago | MIT |
| zip-stream | 15,080,636 | 166 | 9.33 kB | 26 | a year ago | MIT |
| jszip | 14,866,200 | 10,265 | 762 kB | 409 | - | (MIT OR GPL-3.0-or-later) |
| adm-zip | 8,722,751 | 2,149 | 121 kB | 149 | a year ago | MIT |
| yazl | 1,880,381 | 372 | 58.7 kB | 19 | a year ago | MIT |
| zip-lib | 88,830 | 40 | 51.9 kB | 3 | 9 months ago | MIT |

JavaScript ZIP Libraries Compared: adm-zip, archiver, jszip, yazl, zip-lib, and zip-stream

When you need to create or extract ZIP files in a JavaScript environment — whether in Node.js or the browser — choosing the right library can make a big difference in performance, memory use, and code clarity. The six packages under review (adm-zip, archiver, jszip, yazl, zip-lib, and zip-stream) all handle ZIP operations but with very different design goals, APIs, and trade-offs. Let’s break down what each does well and where it falls short.

📦 Core Capabilities: Creation vs Extraction vs Streaming

adm-zip: Full-featured ZIP reader/writer for Node.js

adm-zip supports both reading and writing ZIP files, including extracting entries to disk and adding files from buffers or paths. It loads the entire archive into memory, which makes it simple but unsuitable for large files.

// Read and extract an existing archive
const AdmZip = require("adm-zip");

const zip = new AdmZip("./archive.zip");
zip.extractAllTo("./output/");

// Add a file and write the updated archive
zip.addFile("new.txt", Buffer.from("Hello"));
zip.writeZip("./updated.zip");

archiver: High-level streaming ZIP (and TAR) creator

archiver is built on streams and excels at creating ZIPs incrementally without loading everything into memory. It supports compression, directory recursion, and piping to any writable stream (like HTTP responses). However, it cannot read or extract ZIPs.

const fs = require('fs');
const archiver = require('archiver');

const archive = archiver('zip');
const output = fs.createWriteStream('archive.zip');

archive.pipe(output);
archive.file('file.txt', { name: 'file.txt' });
archive.finalize();

jszip: Browser-first ZIP manipulation with limited Node support

jszip works well in browsers and can load, modify, and generate ZIPs from memory. In Node.js, it requires additional setup (e.g., using fs.promises to read files as buffers). It doesn’t support streaming and holds everything in RAM.

const fs = require('fs');
const JSZip = require('jszip');

// Load an existing archive, add an entry, and write it back out
const zip = await JSZip.loadAsync(fs.readFileSync('input.zip'));
zip.file('new.txt', 'Hello');
const buffer = await zip.generateAsync({ type: 'nodebuffer' });
fs.writeFileSync('output.zip', buffer);

yazl: Low-level, streaming ZIP generator only

yazl (Yet Another Zip Library) is a minimal, streaming-only ZIP writer. It gives fine control over entry metadata and compression but provides no extraction capability. Ideal when you need predictable memory usage and full control over the ZIP structure.

const fs = require('fs');
const yazl = require('yazl');

const zipfile = new yazl.ZipFile();
zipfile.addBuffer(Buffer.from('Hello'), 'hello.txt');
zipfile.end();
zipfile.outputStream.pipe(fs.createWriteStream('out.zip'));

zip-lib: Simple promise-based ZIP utility (Node.js only)

zip-lib wraps lower-level libraries to offer a clean promise API for basic tasks like compressing directories or decompressing archives. It’s easy to use but lacks advanced features like streaming or fine-grained entry control.

const zl = require('zip-lib');

// Compress a folder
await zl.archiveFolder('./folder', './archive.zip');

// Extract
await zl.extract('./archive.zip', './output');

zip-stream: Barebones streaming ZIP core (used by archiver)

zip-stream is the underlying engine that powers archiver’s ZIP functionality. It’s a transform stream that converts input entries into ZIP format. You typically won’t use it directly unless you’re building your own archiving tool.

const fs = require('fs');
const ZipStream = require('zip-stream');

const zip = new ZipStream();
const output = fs.createWriteStream('out.zip');

zip.pipe(output);
zip.entry('Hello', { name: 'hello.txt' }, (err) => {
  if (err) throw err;
  zip.finish();
});

⚖️ Memory and Performance Trade-offs

  • Streaming vs In-Memory: archiver, yazl, and zip-stream use streams and scale to large datasets; adm-zip, jszip, and zip-lib load everything into memory, so avoid them for archives larger than roughly 100 MB (see the sketch after this list).
  • Extraction Support: only adm-zip, jszip, and zip-lib can extract ZIPs. If you need to read archives, rule out archiver, yazl, and zip-stream.
  • Browser Compatibility: only jszip is designed for the browser; the others target Node.js and rely on core modules such as fs and stream.
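
A minimal sketch of the contrast, using hypothetical file and directory paths: adm-zip materializes the whole archive in RAM, while archiver pushes each entry straight to the output stream.

const fs = require('fs');
const AdmZip = require('adm-zip');
const archiver = require('archiver');

// In-memory: adm-zip parses the entire archive into RAM before you can touch it.
const small = new AdmZip('./small.zip');
small.extractAllTo('./small-out/');

// Streaming: archiver writes each entry to the destination as it is generated,
// so memory stays roughly flat even for very large inputs.
const archive = archiver('zip');
archive.pipe(fs.createWriteStream('./big.zip'));
archive.directory('./huge-dataset/', false);
archive.finalize();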

🔧 Real-World Use Cases

Need to serve dynamic ZIPs from an Express route?

Use archiver — it streams directly to the HTTP response without buffering.

app.get('/download', (req, res) => {
  res.setHeader('Content-Type', 'application/zip');
  const archive = archiver('zip');
  archive.pipe(res);
  archive.file('report.pdf', { name: 'report.pdf' });
  archive.finalize();
});

Building a web app that lets users upload and inspect ZIP contents?

Use jszip — it runs in the browser and handles in-memory archives cleanly.

// In the browser: `file` is a File object from an <input type="file"> or drag-and-drop
const arrayBuffer = await file.arrayBuffer();
const zip = await JSZip.loadAsync(arrayBuffer);
const text = await zip.file('readme.txt').async('text');

Processing massive log directories into ZIPs on a server?

Use yazl or archiver — both stream and avoid memory spikes.
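
As a sketch with yazl (the log directory layout and paths are assumptions), each file is read from disk only when it is written into the archive, so memory usage stays flat:

const fs = require('fs');
const path = require('path');
const yazl = require('yazl');

const logDir = './logs'; // hypothetical directory of large log files
const zipfile = new yazl.ZipFile();

// addFile() records the path now and streams the file from disk when its
// turn comes, so no log file is ever held fully in memory.
for (const name of fs.readdirSync(logDir)) {
  zipfile.addFile(path.join(logDir, name), `logs/${name}`);
}

zipfile.outputStream.pipe(fs.createWriteStream('./logs.zip'));
zipfile.end();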

Just need to zip/unzip a config folder in a CLI tool?

Use zip-lib — its promise API keeps your script concise.
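
A small sketch of such a CLI, assuming zip-lib's documented archiveFolder() and extract() helpers (the command names and paths are invented for the example):

#!/usr/bin/env node
const zl = require('zip-lib');

// Usage: node tool.js zip ./config config.zip
//        node tool.js unzip config.zip ./config
const [, , command, source, target] = process.argv;

async function main() {
  if (command === 'zip') {
    await zl.archiveFolder(source, target);
  } else if (command === 'unzip') {
    await zl.extract(source, target);
  } else {
    console.error('usage: tool.js <zip|unzip> <source> <target>');
    process.exitCode = 1;
  }
}

main().catch((err) => {
  console.error(err);
  process.exitCode = 1;
});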

🚫 Deprecated or Limited Packages?

As of 2024:

  • adm-zip is actively maintained but has known security issues in older versions; always use the latest release.
  • zip-lib appears minimally maintained but still functional for basic tasks.
  • None of the listed packages are officially deprecated, but yazl and zip-stream are low-level tools best used indirectly via archiver unless you need their specific control.

📊 Summary Table

| Package | Create ZIP | Extract ZIP | Streaming | Browser | Best For |
|---|---|---|---|---|---|
| adm-zip | Yes | Yes | No | No | Simple Node scripts needing full read/write |
| archiver | Yes | No | Yes | No | Server-side dynamic ZIP generation |
| jszip | Yes | Yes | No | Yes | Browser-based ZIP manipulation |
| yazl | Yes | No | Yes | No | Low-level, high-control ZIP writing |
| zip-lib | Yes | Yes | No | No | Quick CLI zipping/unzipping |
| zip-stream | Yes | No | Yes | No | Building custom archiving pipelines |

💡 Final Guidance

  • For most Node.js servers: Start with archiver — it’s robust, streaming, and well-documented.
  • For browser apps: jszip is your only realistic choice.
  • Avoid in-memory libraries (adm-zip, jszip, zip-lib) when handling user-uploaded or large files.
  • Don’t use yazl or zip-stream directly unless you’re replacing archiver for a specific reason (e.g., bundle size or custom logic).

Choose based on your environment (Node vs browser), data size (small vs large), and whether you need to read, write, or both.

How to Choose: archiver vs zip-stream vs jszip vs adm-zip vs yazl vs zip-lib
  • archiver:

    Choose archiver when you need to generate ZIP files dynamically in Node.js with streaming support—ideal for web servers that must pipe ZIPs directly to HTTP responses without buffering everything in memory. Note that it cannot extract or read existing ZIPs.

  • zip-stream:

    Choose zip-stream only if you're building your own archiving pipeline and need the raw streaming ZIP engine that powers archiver. For almost all practical purposes, archiver is a more complete and user-friendly alternative.

  • jszip:

    Choose jszip for browser-based applications where users need to upload, inspect, or download ZIP files entirely client-side. It also works in Node.js but requires manual file I/O and isn't suitable for large files due to its in-memory model.

  • adm-zip:

    Choose adm-zip if you're working in a Node.js environment and need a straightforward, synchronous API for both reading and writing small ZIP files entirely in memory. Avoid it for large archives or streaming scenarios due to its memory footprint.

  • yazl:

    Choose yazl only if you require fine-grained, low-level control over ZIP creation in Node.js with streaming output and want to avoid higher-level abstractions. It’s a good fit for custom tooling but overkill for typical use cases already covered by archiver.

  • zip-lib:

    Choose zip-lib for simple command-line or scripting tasks in Node.js where you need a clean promise-based API to compress folders or extract archives quickly. It’s not suitable for streaming, large files, or browser environments.

README for archiver

Archiver

A streaming interface for archive generation

Visit the API documentation for a list of all methods available.

Install

npm install archiver --save

Quick Start

// require modules
const fs = require('fs');
const archiver = require('archiver');

// create a file to stream archive data to.
const output = fs.createWriteStream(__dirname + '/example.zip');
const archive = archiver('zip', {
  zlib: { level: 9 } // Sets the compression level.
});

// listen for all archive data to be written
// 'close' event is fired only when a file descriptor is involved
output.on('close', function() {
  console.log(archive.pointer() + ' total bytes');
  console.log('archiver has been finalized and the output file descriptor has closed.');
});

// This event is fired when the data source has been drained, regardless of what the source was.
// It is not part of this library but comes from the Node.js Stream API.
// @see: https://nodejs.org/api/stream.html#stream_event_end
output.on('end', function() {
  console.log('Data has been drained');
});

// good practice to catch warnings (ie stat failures and other non-blocking errors)
archive.on('warning', function(err) {
  if (err.code === 'ENOENT') {
    // log warning
  } else {
    // throw error
    throw err;
  }
});

// good practice to catch this error explicitly
archive.on('error', function(err) {
  throw err;
});

// pipe archive data to the file
archive.pipe(output);

// append a file from stream
const file1 = __dirname + '/file1.txt';
archive.append(fs.createReadStream(file1), { name: 'file1.txt' });

// append a file from string
archive.append('string cheese!', { name: 'file2.txt' });

// append a file from buffer
const buffer3 = Buffer.from('buff it!');
archive.append(buffer3, { name: 'file3.txt' });

// append a file
archive.file('file1.txt', { name: 'file4.txt' });

// append files from a sub-directory and naming it `new-subdir` within the archive
archive.directory('subdir/', 'new-subdir');

// append files from a sub-directory, putting its contents at the root of archive
archive.directory('subdir/', false);

// append files from a glob pattern
archive.glob('file*.txt', {cwd:__dirname});

// finalize the archive (i.e. we are done appending files, but the streams still have to finish)
// 'close', 'end', or 'finish' may be fired right after calling this method, so register for them beforehand
archive.finalize();

Formats

Archiver ships with out-of-the-box support for TAR and ZIP archives.

You can register additional formats with registerFormat.

You can check whether a format is already registered with isRegisteredFormat before registering a new one.
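
For illustration, a hedged sketch of registering a third-party format: archiver-zip-encrypted and its options are used here only as an example plugin, not as part of archiver itself; substitute whatever format module you actually depend on.

const archiver = require('archiver');

// Guard against registering the same format twice.
if (!archiver.isRegisteredFormat('zip-encrypted')) {
  archiver.registerFormat('zip-encrypted', require('archiver-zip-encrypted'));
}

// The encryptionMethod and password options come from the example plugin's docs,
// not from archiver's own API.
const archive = archiver('zip-encrypted', {
  zlib: { level: 8 },
  encryptionMethod: 'aes256',
  password: 'change-me'
});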