Node.js Rate Limiting Libraries Comparison
limiter vs express-rate-limit vs ratelimiter
What are Node.js Rate Limiting Libraries?

Rate limiting libraries in Node.js are essential for controlling the number of incoming requests a server accepts within a specified timeframe. They help prevent abuse and ensure fair use of resources by capping how many requests a single client can make. These libraries provide various strategies for implementing rate limiting, such as in-memory storage, Redis integration, and more. By using them, developers can improve the security and performance of their applications and give users a better experience by preventing server overload.

Stat Detail

| Package | Downloads | Stars | Size | Issues | Last Publish | License |
| --- | --- | --- | --- | --- | --- | --- |
| limiter | 6,506,185 | 1,510 | - | 20 | 4 years ago | MIT |
| express-rate-limit | 1,375,885 | 2,922 | 117 kB | 5 | 2 months ago | MIT |
| ratelimiter | 423,875 | 719 | - | 10 | 5 years ago | MIT |
Feature Comparison: limiter vs express-rate-limit vs ratelimiter

Integration

  • limiter:

    Limiter is a standalone library that can be integrated into any Node.js application, not just those using Express. It provides a flexible API that allows you to implement rate limiting in various contexts, making it suitable for microservices or non-Express applications.

  • express-rate-limit:

    Express-rate-limit is designed specifically for Express.js applications, providing middleware that can be dropped straight into your route handlers. It lets you apply rate limiting rules directly within your Express routes (see the middleware sketch after this list), which makes it very convenient for Express developers.

  • ratelimiter:

    Ratelimiter is also a standalone library, but it focuses on high-performance scenarios. It can be integrated into any Node.js application and is optimized for speed, making it a good choice for applications that require rapid request handling.
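
As a concrete illustration of the Express-style integration, here is a minimal sketch that mounts express-rate-limit as ordinary middleware. The windowMs and max options are the library's basic documented settings; the route path and limit values are arbitrary choices for this example.

import express from "express";
import rateLimit from "express-rate-limit";

const app = express();

// Allow each client at most 100 requests per 15-minute window
const apiLimiter = rateLimit({
  windowMs: 15 * 60 * 1000, // window length in milliseconds
  max: 100                  // requests allowed per window
});

// Attach the limiter to everything under /api
app.use("/api", apiLimiter);

app.get("/api/ping", (req, res) => res.send("pong"));

app.listen(3000);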

Configuration Flexibility

  • limiter:

    Limiter provides extensive configuration options, allowing you to define complex rate limiting rules, such as different limits for different routes or user roles (see the sketch after this list). This flexibility makes it suitable for applications with diverse rate limiting needs.

  • express-rate-limit:

    Express-rate-limit offers a variety of configuration options, allowing you to set limits based on different criteria such as IP address, request path, and more. You can also customize the response sent to clients when they exceed the limit, providing a tailored user experience.

  • ratelimiter:

    Ratelimiter offers a more straightforward configuration model with a focus on performance. While it may not have as many options as limiter, it provides essential features that cover most use cases effectively.
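
To make the "different limits for different user roles" idea concrete, the sketch below keeps one RateLimiter from the limiter package per role, using only the constructor and tryRemoveTokens calls shown in the README further down; the role names and quotas are invented for illustration.

import { RateLimiter } from "limiter";

// One token bucket per role; the tiers and quotas are example values
const limitersByRole = {
  free: new RateLimiter({ tokensPerInterval: 100, interval: "hour" }),
  pro:  new RateLimiter({ tokensPerInterval: 1000, interval: "hour" })
};

function allowRequest(user) {
  const roleLimiter = limitersByRole[user.role] ?? limitersByRole.free;
  // tryRemoveTokens returns false immediately once the quota is used up
  return roleLimiter.tryRemoveTokens(1);
}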

Performance

  • limiter:

    Limiter is designed for performance and can handle high request volumes efficiently. Its token buckets live in process memory, which keeps overhead low but means limits apply per process; to enforce a shared limit across multiple instances of your application you would need to add your own coordination layer (see the per-IP sketch after this list).

  • express-rate-limit:

    While express-rate-limit is efficient for most applications, it stores rate limit data in memory by default, which may not be suitable for distributed systems. However, it can be configured to use external stores like Redis for better scalability.

  • ratelimiter:

    Ratelimiter is optimized for high throughput and low latency, making it an excellent choice for performance-critical applications. It is lightweight and can handle a large number of requests without significant overhead.
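
The following sketch illustrates the in-memory trade-off mentioned above: one limiter per client IP, held in a plain Map inside a single process. This is an illustrative pattern rather than an API of any of the three packages; the state is lost on restart and is not shared between instances behind a load balancer.

import { RateLimiter } from "limiter";

// Per-process map of client IP -> token bucket
const limitersByIp = new Map();

function limiterFor(ip) {
  if (!limitersByIp.has(ip)) {
    limitersByIp.set(ip, new RateLimiter({ tokensPerInterval: 60, interval: "minute" }));
  }
  return limitersByIp.get(ip);
}

function isAllowed(ip) {
  // Synchronous check: true if this caller still has budget in the current minute
  return limiterFor(ip).tryRemoveTokens(1);
}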

Use Cases

  • limiter:

    Ideal for applications that require complex rate limiting strategies or need to manage rate limits across different contexts. Suitable for microservices or APIs with varying rate limiting needs.

  • express-rate-limit:

    Best suited for Express.js applications where you need a quick and easy way to implement rate limiting without additional dependencies. Ideal for projects that require basic rate limiting features.

  • ratelimiter:

    Best for high-traffic applications that demand efficient request handling. It is suitable for scenarios where performance is critical, and you need a lightweight solution.

Community and Support

  • limiter:

    Limiter has a smaller community compared to express-rate-limit but is still well-documented. It may not have as many resources available, but it is actively maintained.

  • express-rate-limit:

    Express-rate-limit has a large user base and is well-documented, making it easy to find support and examples. The community actively contributes to its development, ensuring it stays up-to-date with best practices.

  • ratelimiter:

    Ratelimiter has a growing community and is gaining popularity for its performance features. While documentation is available, it may not be as extensive as express-rate-limit.

How to Choose: limiter vs express-rate-limit vs ratelimiter
  • limiter:

    Choose limiter if you require a more flexible and customizable rate limiting solution that can work outside of Express. It allows you to define complex rate limiting strategies and is suitable for applications that need to manage rate limits across different contexts.

  • express-rate-limit:

    Choose express-rate-limit if you are using Express.js and need a straightforward solution for rate limiting with built-in middleware support. It is highly configurable and integrates seamlessly with Express applications.

  • ratelimiter:

    Choose ratelimiter if you need a lightweight and efficient rate limiting library that focuses on performance. It is designed for high throughput and can be used in various Node.js environments, making it ideal for applications with high traffic.

README for limiter

limiter


Provides a generic rate limiter for the web and node.js. Useful for API clients, web crawling, or other tasks that need to be throttled. Two classes are exposed, RateLimiter and TokenBucket. TokenBucket provides a lower level interface to rate limiting with a configurable burst rate and drip rate. RateLimiter sits on top of the token bucket and adds a restriction on the maximum number of tokens that can be removed each interval to comply with common API restrictions such as "150 requests per hour maximum".

Installation

yarn add limiter

Usage

A simple example allowing 150 requests per hour:

import { RateLimiter } from "limiter";

// Allow 150 requests per hour (the Twitter search limit). Also understands
// 'second', 'minute', 'day', or a number of milliseconds
const limiter = new RateLimiter({ tokensPerInterval: 150, interval: "hour" });

async function sendRequest() {
  // This call will throw if we request more than the maximum number of requests
  // that were set in the constructor
  // remainingRequests tells us how many additional requests could be sent
  // right this moment
  const remainingRequests = await limiter.removeTokens(1);
  callMyRequestSendingFunction(...);
}

Another example allowing one message to be sent every 250ms:

import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

async function sendMessage() {
  const remainingMessages = await limiter.removeTokens(1);
  callMyMessageSendingFunction(...);
}

The default behaviour is to wait for the duration of the rate limiting that's currently in effect before the promise is resolved, but if you pass in "fireImmediately": true, the promise will be resolved immediately with remainingRequests set to -1:

import { RateLimiter } from "limiter";

const limiter = new RateLimiter({
  tokensPerInterval: 150,
  interval: "hour",
  fireImmediately: true
});

async function requestHandler(request, response) {
  // Immediately send 429 header to client when rate limiting is in effect
  const remainingRequests = await limiter.removeTokens(1);
  if (remainingRequests < 0) {
    response.writeHead(429, {'Content-Type': 'text/plain;charset=UTF-8'});
    response.end('429 Too Many Requests - your IP is being rate limited');
  } else {
    callMyMessageSendingFunction(...);
  }
}

A synchronous method, tryRemoveTokens(), is available in both RateLimiter and TokenBucket. This will return immediately with a boolean value indicating if the token removal was successful.

import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 10, interval: "second" });

if (limiter.tryRemoveTokens(5))
  console.log('Tokens removed');
else
  console.log('No tokens removed');

To get the number of remaining tokens outside the removeTokens promise, simply use the getTokensRemaining method.

import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// Prints 1 since we did not remove a token and our number of tokens per
// interval is 1
console.log(limiter.getTokensRemaining());

Using the token bucket directly to throttle at the byte level:

import { TokenBucket } from "limiter";

const BURST_RATE = 1024 * 1024 * 150; // 150 MB/sec burst rate
const FILL_RATE = 1024 * 1024 * 50; // 50 MB/sec sustained rate

// We could also pass a parent token bucket in to create a hierarchical token
// bucket
// bucketSize, tokensPerInterval, interval
const bucket = new TokenBucket({
  bucketSize: BURST_RATE,
  tokensPerInterval: FILL_RATE,
  interval: "second"
});

async function handleData(myData) {
  await bucket.removeTokens(myData.byteLength);
  sendMyData(myData);
}

Additional Notes

Both the token bucket and rate limiter should be used with a message queue or some other way of preventing multiple simultaneous calls to removeTokens(). Otherwise, earlier messages may be held up for long periods if more recent messages keep draining the token bucket, which can lead to out-of-order messages or the appearance of "lost" messages under heavy load.
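
One minimal way to serialize callers, assuming only the RateLimiter API shown above, is to chain every send onto a single promise so requests pass through removeTokens() strictly in arrival order; deliver() is a placeholder for your own send function.

import { RateLimiter } from "limiter";

const limiter = new RateLimiter({ tokensPerInterval: 1, interval: 250 });

// A single promise chain acts as a FIFO queue: each message waits for the
// previous removeTokens() call to settle before taking its own token
let queue = Promise.resolve();

function sendInOrder(message) {
  const result = queue.then(async () => {
    await limiter.removeTokens(1);
    return deliver(message); // placeholder for your real send function
  });
  // Keep the chain alive even if one send fails; error handling is up to the caller
  queue = result.catch(() => {});
  return result;
}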

License

MIT License