async vs bottleneck vs p-queue vs promise-queue vs queue-promise
Managing Asynchronous Task Queues and Rate Limiting in JavaScript

These libraries provide mechanisms to control the execution order and concurrency of asynchronous tasks in JavaScript. While native Promises and async/await handle basic sequencing, complex applications often need to limit how many tasks run at once (concurrency), enforce delays between requests (rate limiting), or manage priority queues. async is a comprehensive utility belt with legacy callback support. bottleneck specializes in robust rate limiting and load balancing. p-queue offers a modern, Promise-native queue with concurrency limits. promise-queue and queue-promise are simpler, older implementations focused on basic FIFO queuing. Choosing the right one depends on whether you need rate limiting, complex priority management, or just a simple way to run tasks one by one.

(Charts omitted: npm weekly downloads trend over 3 years; GitHub stars ranking)

Stat Detail

| Package | Weekly Downloads | Stars | Size | Open Issues | Last Publish | License |
| --- | --- | --- | --- | --- | --- | --- |
| async | 82,138,340 | 28,181 | 808 kB | 24 | 2 years ago | MIT |
| bottleneck | 9,288,596 | 1,982 | - | 91 | 7 years ago | MIT |
| p-queue | - | 4,178 | 80.5 kB | 6 | 3 days ago | MIT |
| promise-queue | - | 230 | - | 10 | 8 years ago | MIT |
| queue-promise | - | 92 | 29.2 kB | 13 | - | MIT |


When building scalable applications, running all asynchronous tasks at once can crash your memory or get your IP banned by APIs. You need control. The packages async, bottleneck, p-queue, promise-queue, and queue-promise all solve this, but they approach the problem from different angles. Some focus on raw concurrency, others on rate limiting, and some on legacy compatibility. Let's break down how they handle real-world engineering challenges.

πŸš€ Execution Model: Concurrency vs Rate Limiting

The core difference lies in how they limit work. Some limit how many tasks run now (concurrency), while others limit how many tasks run per time unit (rate limiting).
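To make the distinction concrete, here is a dependency-free sketch: one helper caps how many tasks are in flight at once (concurrency), the other caps how often tasks may start (rate). Both helpers and their names are illustrative, not part of any of these libraries.

```javascript
// Concurrency: at most `limit` tasks run at the same time.
async function runWithConcurrency(tasks, limit) {
  const results = [];
  let next = 0;
  async function worker() {
    // Each worker pulls the next unclaimed task until none remain.
    while (next < tasks.length) {
      const i = next++;
      results[i] = await tasks[i]();
    }
  }
  await Promise.all(Array.from({ length: limit }, worker));
  return results;
}

// Rate limiting: tasks may *start* at most once per `minTimeMs`,
// regardless of how long each one takes to finish.
async function runWithRate(tasks, minTimeMs) {
  const pending = [];
  for (const task of tasks) {
    pending.push(task()); // start the task, don't wait for it
    await new Promise((resolve) => setTimeout(resolve, minTimeMs));
  }
  return Promise.all(pending);
}

const tasks = [1, 2, 3].map((n) => () => Promise.resolve(n * 10));

runWithConcurrency(tasks, 2).then((r) => console.log(r)); // [10, 20, 30]
runWithRate(tasks, 10).then((r) => console.log(r));       // [10, 20, 30]
```

A concurrency limit of 2 still fires tasks back-to-back as soon as slots free up; a rate limit spaces out the start times even if every task finishes instantly.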

async uses a worker pool model. You define how many workers process tasks from a queue.

// async: Concurrency with workers
const queue = async.queue((task, callback) => {
  process(task).then(() => callback());
}, 2); // 2 concurrent workers

queue.push({ name: 'task1' });
queue.push({ name: 'task2' });

bottleneck uses a token bucket or fixed window model. It focuses on spacing out executions.

// bottleneck: Rate limiting
const limiter = new Bottleneck({ minTime: 100 }); // 100ms between tasks

limiter.schedule(() => apiCall());
limiter.schedule(() => apiCall());

p-queue uses a concurrency counter. It runs tasks immediately until the limit is hit, then queues the rest.

// p-queue: Concurrency limit
const queue = new PQueue({ concurrency: 2 });

queue.add(() => fetch('/api/1'));
queue.add(() => fetch('/api/2'));

promise-queue is a strict FIFO (First-In-First-Out) queue, typically instantiated to run one task at a time.

// promise-queue: Sequential FIFO
const queue = new PromiseQueue(1, Infinity); // 1 concurrent, infinite queued

queue.add(() => Promise.resolve(1));
queue.add(() => Promise.resolve(2));

queue-promise is similar in spirit to p-queue but has a smaller API surface: tasks are added with enqueue(), and a built-in interval option can space out executions.

// queue-promise: Simple Promise queue
const queue = new Queue({ concurrent: 1 });

queue.enqueue(() => processItem(1));
queue.enqueue(() => processItem(2));

πŸŽ›οΈ Configuration & Control: Priority and Pausing

Real-world apps need to pause queues during errors or prioritize urgent tasks. The level of control varies significantly.

async allows priority levels (0 is highest). You can also drain the queue.

// async: Priority support
const queue = async.priorityQueue((task, callback) => {
  // process
}, 2);

queue.push({ name: 'urgent' }, 0); // Priority 0
queue.push({ name: 'normal' }, 1); // Priority 1

bottleneck offers the most control. You can update settings on the fly and listen to state events.

// bottleneck: Dynamic settings
limiter.updateSettings({ minTime: 500 }); // Slow down dynamically

limiter.on('error', (err) => console.error(err));

p-queue supports priority and pausing out of the box with a clean API.

// p-queue: Pause and Priority
queue.add(() => task(), { priority: 1 });
queue.pause(); // Stop processing
queue.start(); // Resume processing

promise-queue has minimal configuration. It is designed to be simple, so advanced controls are limited or require manual implementation.

// promise-queue: Basic usage
// No built-in priority or pause in standard API
queue.add(() => task());

queue-promise exposes lifecycle events (start, stop, end, resolve, reject) but lacks the priority system found in p-queue.

// queue-promise: Event listeners
queue.on('end', () => console.log('Done'));
queue.enqueue(() => task());

⚠️ Error Handling: Fail Fast vs Keep Going

How a queue reacts to a failed task determines if your whole pipeline stalls.

async passes errors to the task's callback. An errored task is reported via the queue's error handler, while the queue itself continues processing the remaining tasks.

// async: Error callback
const queue = async.queue((task, callback) => {
  process(task)
    .then(() => callback())
    .catch((err) => callback(err)); // Pass error to callback
});

queue.error((err, task) => console.error('Task failed', err));

bottleneck propagates rejections from the scheduled function. It has built-in retry logic options.

// bottleneck: Rejection handling
limiter.schedule(() => riskyOperation())
  .catch((err) => console.error('Failed', err));

p-queue rejects the promise returned by add(). The queue itself keeps processing the remaining tasks; a failure only affects that task's promise.

// p-queue: Individual promise rejection
queue.add(() => fetch('/api'))
  .catch((err) => console.error('Task failed', err));

promise-queue rejects the promise returned by add(). Being older, it lacks some of the granular error events of newer libraries.

// promise-queue: Standard rejection
queue.add(() => task())
  .catch((err) => console.error(err));

queue-promise surfaces failures through its 'reject' event rather than halting the queue, so one bad task does not stop the rest.

// queue-promise: Task isolation via events
queue.on('reject', (err) => console.error(err));
queue.enqueue(() => task());

🌐 Real-World Scenarios

Scenario 1: Respecting API Rate Limits

You are calling a third-party API that allows only 5 requests per second. Exceeding this gets you banned.

  • βœ… Best choice: bottleneck
  • Why? It is built specifically for rate limiting with token buckets and fixed windows, ensuring you never exceed the limit even under heavy load.
// bottleneck: Rate limiting
const limiter = new Bottleneck({
  maxConcurrent: 5,
  minTime: 200 // 5 requests per 1000ms
});

urls.forEach(url => limiter.schedule(() => fetch(url)));

Scenario 2: Image Processing in Browser

You need to resize 100 images but don't want to freeze the UI thread by running all at once.

  • βœ… Best choice: p-queue
  • Why? It limits concurrency easily, is Promise-native, and integrates well with modern frontend frameworks.
// p-queue: Concurrency control
const queue = new PQueue({ concurrency: 4 });

images.forEach(img => queue.add(() => resize(img)));
await queue.onIdle();

Scenario 3: Legacy Node.js Service

You are maintaining an older Node.js service that uses callbacks extensively.

  • βœ… Best choice: async
  • Why? It matches the existing code style and provides robust queue management without refactoring to Promises.
// async: Callback style
async.queue((data, cb) => {
  db.save(data, cb);
}, 2).push(records);

Scenario 4: Simple Sequential Logging

You need to write logs to a file one by one to avoid interleaving text.

  • βœ… Best choice: promise-queue or p-queue
  • Why? You just need FIFO order. promise-queue works if already present, but p-queue is safer for long-term maintenance.
// p-queue: Sequential (concurrency: 1)
const queue = new PQueue({ concurrency: 1 });
logs.forEach(log => queue.add(() => writeLog(log)));

πŸ“Œ Maintenance & Ecosystem Status

Not all packages are equal when it comes to long-term support.

  • async: Highly stable, widely used, but shows its age. Still maintained.
  • bottleneck: Actively maintained, essential for rate limiting.
  • p-queue: Actively maintained by Sindre Sorhus. The modern standard for queues.
  • promise-queue: ⚠️ Legacy. Last updates were years ago. Use only for maintenance of existing systems.
  • queue-promise: ⚠️ Low Activity. Less community support than p-queue. Evaluate risks before using.

πŸ“Š Summary Table

| Feature | async | bottleneck | p-queue | promise-queue | queue-promise |
| --- | --- | --- | --- | --- | --- |
| Primary Focus | Control Flow | Rate Limiting | Concurrency | FIFO Queue | FIFO Queue |
| Style | Callback/Promise | Promise | Promise | Promise | Promise |
| Priority | ✅ Yes | ✅ Yes | ✅ Yes | ❌ No | ❌ No |
| Rate Limiting | ❌ No | ✅ Advanced | ⚠️ Basic (Interval) | ❌ No | ⚠️ Basic (Interval) |
| Maintenance | ✅ Active | ✅ Active | ✅ Active | ⚠️ Stale | ⚠️ Low |
| Bundle Size | Medium | Medium | Small | Small | Small |

πŸ’‘ Final Recommendation

For new projects, reach for p-queue. It is the modern standard for Promise-based concurrency control. It is lightweight, well-maintained, and easy to use.

If you need to protect an API or manage strict rate limits, bottleneck is the only serious choice. Its features for token buckets and distributed coordination are unmatched.

Use async only if you are working in a legacy codebase that already depends on it. Its callback patterns add unnecessary complexity to modern Promise-based apps.

Avoid promise-queue and queue-promise for new development. They lack the active maintenance and feature depth of p-queue. Stick to the tools with strong community backing to ensure your architecture remains supportable in the future.

How to Choose: async vs bottleneck vs p-queue vs promise-queue vs queue-promise

  • async:

    Choose async if you are maintaining a legacy codebase that relies heavily on callbacks or need a comprehensive suite of control flow utilities beyond just queuing. It is battle-tested but shows its age with callback-first patterns, though it does support Promises now. Avoid it for new greenfield projects where modern Promise-native libraries are available.

  • bottleneck:

    Choose bottleneck if your primary concern is strict rate limiting (e.g., respecting API rate limits) or distributed coordination across multiple processes. It offers advanced features like token buckets, priority levels, and state events that other queues lack. It is the most robust option for protecting external services from being overwhelmed.

  • p-queue:

    Choose p-queue for most modern frontend and Node.js projects that need simple concurrency limiting or task prioritization. It is Promise-native, lightweight, and actively maintained by a trusted author. It strikes the best balance between features and simplicity for general-purpose task queuing without the overhead of complex rate limiting logic.

  • promise-queue:

    Choose promise-queue only if you are working on a legacy system that already depends on it or need a very minimal FIFO queue without extra features. It is largely unmaintained compared to modern alternatives. For new projects, prefer p-queue which offers better TypeScript support and active updates.

  • queue-promise:

    Choose queue-promise if you need a simple Promise-based queue and find p-queue too heavy, though this is rare. Like promise-queue, it sees less activity and community support than the top contenders. Evaluate it carefully for long-term maintenance risks before adopting it in production systems.

README for async


Async is a utility module which provides straight-forward, powerful functions for working with asynchronous JavaScript. Although originally designed for use with Node.js and installable via npm i async, it can also be used directly in the browser. An ESM/MJS version is included in the main async package that should automatically be used with compatible bundlers such as Webpack and Rollup.

A pure ESM version of Async is available as async-es.

For Documentation, visit https://caolan.github.io/async/

For Async v1.5.x documentation, go HERE

// for use with Node-style callbacks...
var async = require("async");

var obj = {dev: "/dev.json", test: "/test.json", prod: "/prod.json"};
var configs = {};

async.forEachOf(obj, (value, key, callback) => {
    fs.readFile(__dirname + value, "utf8", (err, data) => {
        if (err) return callback(err);
        try {
            configs[key] = JSON.parse(data);
        } catch (e) {
            return callback(e);
        }
        callback();
    });
}, err => {
    if (err) console.error(err.message);
    // configs is now a map of JSON data
    doSomethingWith(configs);
});
var async = require("async");

// ...or ES2017 async functions
async.mapLimit(urls, 5, async function(url) {
    const response = await fetch(url)
    return response.body
}, (err, results) => {
    if (err) throw err
    // results is now an array of the response bodies
    console.log(results)
})