apicache vs lru-cache vs memory-cache vs node-cache
Node.js Caching Libraries

Caching libraries in Node.js are used to temporarily store data in memory to improve application performance by reducing the need to repeatedly fetch data from slower storage systems, such as databases or external APIs. These libraries provide mechanisms to manage cached data, including expiration policies, eviction strategies, and memory management, allowing developers to optimize their applications for speed and efficiency.


Stat Detail

| Package | Downloads | Stars | Size | Issues | Publish | License |
|---|---|---|---|---|---|---|
| apicache | 0 | 1,248 | - | 63 | 5 years ago | MIT |
| lru-cache | 0 | 5,868 | 1.78 MB | 11 | 3 days ago | BlueOak-1.0.0 |
| memory-cache | 0 | 1,601 | - | 32 | 9 years ago | BSD-2-Clause |
| node-cache | 0 | 2,374 | - | 77 | 6 years ago | MIT |

Feature Comparison: apicache vs lru-cache vs memory-cache vs node-cache

Eviction Policy

  • apicache:

    Apicache does not implement a traditional eviction policy since it is primarily focused on caching HTTP responses. It allows for manual cache invalidation and expiration settings but does not automatically remove items based on usage patterns.

  • lru-cache:

    LRU Cache employs a least-recently-used eviction policy, meaning that it will automatically remove the least recently accessed items from the cache when it reaches its size limit. This ensures that frequently accessed data remains available while older, less-used data is discarded.

  • memory-cache:

    Memory Cache does not have a built-in eviction policy. It simply stores key-value pairs in memory until the process is terminated or the cache is manually cleared. This makes it less suitable for applications with limited memory resources.

  • node-cache:

    Node-Cache supports TTL (time-to-live) for cache entries, allowing developers to set expiration times for individual cache items. Once the TTL expires, the item is automatically removed from the cache, effectively managing memory usage.

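The least-recently-used policy described above can be illustrated with a small plain-JavaScript sketch (this is not lru-cache's actual implementation, just the policy it applies automatically; `Map` preserves insertion order, so the first key is always the least recently used):

```javascript
// Minimal sketch of LRU eviction. On each read the entry is re-inserted,
// moving it to the "most recently used" end; when the cache exceeds its
// limit, the entry at the front (least recently used) is evicted.
class TinyLRU {
  constructor(max) {
    this.max = max
    this.map = new Map()
  }
  get(key) {
    if (!this.map.has(key)) return undefined
    const value = this.map.get(key)
    this.map.delete(key) // re-insert to mark as most recently used
    this.map.set(key, value)
    return value
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key)
    this.map.set(key, value)
    if (this.map.size > this.max) {
      // evict the least recently used entry (first in insertion order)
      this.map.delete(this.map.keys().next().value)
    }
  }
}

const lru = new TinyLRU(2)
lru.set('a', 1)
lru.set('b', 2)
lru.get('a')              // 'a' is now most recently used
lru.set('c', 3)           // full: 'b' (least recently used) is evicted
console.log(lru.get('b')) // undefined
console.log(lru.get('a')) // 1
```

With the real lru-cache package, the same behavior comes from constructing the cache with a `max` option; node-cache's TTL expiry, by contrast, removes entries by age rather than by access pattern.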
Complexity

  • apicache:

    Apicache is designed to be simple and easy to use, with minimal configuration required. It is particularly user-friendly for developers who want to implement caching for HTTP requests without delving into complex caching strategies.

  • lru-cache:

    LRU Cache is straightforward but requires some understanding of the least-recently-used strategy. It is easy to implement but may require additional logic to manage cache size and handle evictions effectively.

  • memory-cache:

    Memory Cache is the simplest of the options, providing a basic key-value store with no additional features. It is very easy to set up and use, making it ideal for quick caching needs without overhead.

  • node-cache:

    Node-Cache offers a more complex API with various features such as TTL and cache management methods. While it provides more control, it may require a steeper learning curve compared to simpler options.

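To make the complexity gap concrete, memory-cache's surface is essentially `put`/`get` (plus `del` and `clear`). A plain-`Map` sketch of that minimal interface (illustrative only, not the library's implementation):

```javascript
// Sketch of the minimal put/get interface memory-cache exposes.
// The real module also accepts an optional expiry on put().
const store = new Map()

function put(key, value) {
  store.set(key, value)
  return value
}

function get(key) {
  // memory-cache returns null (not undefined) on a miss
  return store.has(key) ? store.get(key) : null
}

put('config', { retries: 3 })
console.log(get('config'))  // { retries: 3 }
console.log(get('missing')) // null
```

There is nothing to configure, which is exactly why it suits quick caching needs and falls short when eviction or expiry control is required.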
Use Case

  • apicache:

    Apicache is best suited for caching API responses in web applications, especially when dealing with high-frequency requests. It is ideal for scenarios where reducing response times for HTTP requests is critical.

  • lru-cache:

    LRU Cache is perfect for caching data that is frequently accessed but has a limited memory footprint. It is commonly used in applications where memory management is crucial, such as in-memory databases or caching user sessions.

  • memory-cache:

    Memory Cache is suitable for lightweight applications that require quick access to temporary data without the need for advanced features. It is often used for caching configuration settings or transient data.

  • node-cache:

    Node-Cache is ideal for applications that require a robust caching mechanism with expiration control. It is commonly used in scenarios where data needs to be cached for a specific duration, such as caching database query results.

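The node-cache use case of caching database query results for a fixed duration follows a pattern that can be sketched without the library (the real module exposes `set(key, value, ttl)` and `get(key)` with much the same shape; the `fetchUser` function below is a hypothetical stand-in for a real query):

```javascript
// Sketch of TTL-based caching of expensive lookups, the pattern node-cache
// provides via `new NodeCache({ stdTTL })`. Entries expire lazily on read.
const cache = new Map() // key -> { value, expiresAt }

function cached(key, ttlMs, compute) {
  const hit = cache.get(key)
  if (hit && hit.expiresAt > Date.now()) return hit.value
  const value = compute() // stand-in for a real database query
  cache.set(key, { value, expiresAt: Date.now() + ttlMs })
  return value
}

let queries = 0
const fetchUser = () => { queries++; return { id: 42, name: 'Ada' } }

cached('user:42', 60000, fetchUser)
cached('user:42', 60000, fetchUser) // served from cache; compute not re-run
console.log(queries) // 1
```

Once the TTL elapses, the next read recomputes and re-caches the value, which is what keeps cached query results from going indefinitely stale.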
Performance

  • apicache:

    Apicache can significantly improve performance for HTTP requests by reducing latency through caching. However, its performance is heavily dependent on the configuration of cache expiration and invalidation strategies.

  • lru-cache:

    LRU Cache provides excellent performance for read-heavy workloads, as it keeps frequently accessed data in memory. The eviction policy ensures that the cache remains efficient, but performance may degrade if the cache size is not managed properly.

  • memory-cache:

    Memory Cache offers fast access to cached data due to its in-memory nature. However, it lacks eviction policies, which can lead to memory exhaustion if not monitored, potentially impacting overall application performance.

  • node-cache:

    Node-Cache delivers good performance with its TTL feature, allowing for efficient memory usage and quick access to cached data. The ability to set expiration times helps maintain performance over time by preventing stale data.

Scalability

  • apicache:

    Apicache is not inherently designed for distributed caching, which can limit its scalability in large applications. It is best suited for single-instance applications or when combined with other caching strategies for scalability.

  • lru-cache:

    LRU Cache is primarily an in-memory solution and may not scale well across multiple instances without additional mechanisms for synchronization. It is best used in scenarios where a single instance can handle the load.

  • memory-cache:

    Memory Cache is not designed for scalability, as it only stores data in the local memory of the Node.js process. It is suitable for small applications but may not perform well in larger, distributed environments.

  • node-cache:

    Node-Cache is also an in-memory cache and does not support distributed caching out of the box. It is best used in single-instance applications, but can be combined with other solutions in larger deployments.

How to Choose: apicache vs lru-cache vs memory-cache vs node-cache

  • apicache:

    Choose Apicache if you need a simple and effective caching solution specifically for HTTP requests. It is particularly useful for caching API responses and offers built-in support for cache expiration and invalidation, making it ideal for applications with frequent API calls.

  • lru-cache:

    Opt for LRU Cache when you require a cache with a least-recently-used eviction strategy. This package is well-suited for scenarios where memory usage is a concern, as it automatically removes the least recently accessed items when the cache reaches its limit, ensuring efficient memory management.

  • memory-cache:

    Select Memory Cache for a straightforward in-memory caching solution that is easy to implement and use. It is best for applications that need a simple key-value store without complex features, making it suitable for lightweight caching needs.

  • node-cache:

    Use Node-Cache if you need a feature-rich caching solution that supports TTL (time-to-live) for cache entries. It provides a more comprehensive API for managing cached data and is ideal for applications that require more control over cache behavior.

README for apicache

A simple API response caching middleware for Express/Node using plain-english durations.

Supports Redis or built-in memory engine with auto-clearing.


Why?

Because route-caching of simple data/responses should ALSO be simple.

Usage

To use, simply inject the middleware (example: apicache.middleware('5 minutes', [optionalMiddlewareToggle])) into your routes. Everything else is automagic.

Cache a route

import express from 'express'
import apicache from 'apicache'

let app = express()
let cache = apicache.middleware

app.get('/api/collection/:id?', cache('5 minutes'), (req, res) => {
  // do some work... this will only occur once per 5 minutes
  res.json({ foo: 'bar' })
})

Cache all routes

let cache = apicache.middleware

app.use(cache('5 minutes'))

app.get('/will-be-cached', (req, res) => {
  res.json({ success: true })
})

Use with Redis

import express from 'express'
import apicache from 'apicache'
import redis from 'redis'

let app = express()

// if redisClient option is defined, apicache will use redis client
// instead of built-in memory store
let cacheWithRedis = apicache.options({ redisClient: redis.createClient() }).middleware

app.get('/will-be-cached', cacheWithRedis('5 minutes'), (req, res) => {
  res.json({ success: true })
})

Cache grouping and manual controls

import apicache from 'apicache'
let cache = apicache.middleware

app.use(cache('5 minutes'))

// routes are automatically added to index, but may be further added
// to groups for quick deleting of collections
app.get('/api/:collection/:item?', (req, res) => {
  req.apicacheGroup = req.params.collection
  res.json({ success: true })
})

// add route to display cache performance (courtesy of @killdash9)
app.get('/api/cache/performance', (req, res) => {
  res.json(apicache.getPerformance())
})

// add route to display cache index
app.get('/api/cache/index', (req, res) => {
  res.json(apicache.getIndex())
})

// add route to manually clear target/group
app.get('/api/cache/clear/:target?', (req, res) => {
  res.json(apicache.clear(req.params.target))
})

/*

GET /api/foo/bar --> caches entry at /api/foo/bar and adds a group called 'foo' to index
GET /api/cache/index --> displays index
GET /api/cache/clear/foo --> clears all cached entries for 'foo' group/collection

*/

Use with middleware toggle for fine control

// higher-order function returns false for responses of other status codes (e.g. 403, 404, 500, etc)
const onlyStatus200 = (req, res) => res.statusCode === 200

const cacheSuccesses = cache('5 minutes', onlyStatus200)

app.get('/api/missing', cacheSuccesses, (req, res) => {
  res.status(404).json({ results: 'will not be cached' })
})

app.get('/api/found', cacheSuccesses, (req, res) => {
  res.json({ results: 'will be cached' })
})

Prevent cache-control header "max-age" from automatically being set to expiration age

let cache = apicache.options({
  headers: {
    'cache-control': 'no-cache',
  },
}).middleware

let cache5min = cache('5 minute') // continue to use normally

API

  • apicache.options([globalOptions]) - getter/setter for global options. If used as a setter, this function is chainable, allowing you to do things such as... say... return the middleware.
  • apicache.middleware([duration], [toggleMiddleware], [localOptions]) - the actual middleware that will be used in your routes. duration is in the following format "[length][unit]", as in "10 minutes" or "1 day". A second param is a middleware toggle function, accepting request and response params, and must return truthy to enable cache for the request. Third param is the options that will override global ones and affect this middleware only.
  • middleware.options([localOptions]) - getter/setter for middleware-specific options that will override global ones.
  • apicache.getPerformance() - returns current cache performance (cache hit rate)
  • apicache.getIndex() - returns current cache index [of keys]
  • apicache.clear([target]) - clears cache target (key or group), or entire cache if no value passed, returns new index.
  • apicache.newInstance([options]) - used to create a new ApiCache instance (by default, simply requiring this library shares a common instance)
  • apicache.clone() - used to create a new ApiCache instance with the same options as the current one

Available Options (first value is default)

{
  debug:            false|true,     // if true, enables console output
  defaultDuration:  '1 hour',       // should be either a number (in ms) or a string, defaults to 1 hour
  enabled:          true|false,     // if false, turns off caching globally (useful on dev)
  redisClient:      client,         // if provided, uses the [node-redis](https://github.com/NodeRedis/node_redis) client instead of [memory-cache](https://github.com/ptarjan/node-cache)
  appendKey:        fn(req, res),   // appendKey takes the req/res objects and returns a custom value to extend the cache key
  headerBlacklist:  [],             // list of headers that should never be cached
  statusCodes: {
    exclude:        [],             // list status codes to specifically exclude (e.g. [404, 403] cache all responses unless they had a 404 or 403 status)
    include:        [],             // list status codes to require (e.g. [200] caches ONLY responses with a success/200 code)
  },
  trackPerformance: false,          // enable/disable performance tracking... WARNING: super cool feature, but may cause memory overhead issues
  headers: {
    // 'cache-control':  'no-cache' // example of header overwrite
  },
  respectCacheControl: false|true   // If true, 'Cache-Control: no-cache' in the request header will bypass the cache.
}
Optional: TypeScript types (courtesy of @danielsogl)

$ npm install -D @types/apicache

Custom Cache Keys

Sometimes you need custom keys (e.g. save routes per-session, or per method). We've made it easy!

Note: All req/res attributes used in the generation of the key must have been set previously (upstream). The entire route logic block is skipped on future cache hits so it can't rely on those params.

apicache.options({
  appendKey: (req, res) => req.method + res.session.id,
})

Cache Key Groups

Oftentimes it benefits us to group cache entries, for example, by collection (in an API). This would enable us to clear all cached "post" requests if we updated something in the "post" collection for instance. Adding a simple req.apicacheGroup = [somevalue]; to your route enables this. See example below:

var apicache = require('apicache')
var cache = apicache.middleware

// GET collection/id
app.get('/api/:collection/:id?', cache('1 hour'), function(req, res, next) {
  req.apicacheGroup = req.params.collection
  // do some work
  res.send({ foo: 'bar' })
})

// POST collection/id
app.post('/api/:collection/:id?', function(req, res, next) {
  // update model
  apicache.clear(req.params.collection)
  res.send('added a new item, so the cache has been cleared')
})

Additionally, you could add manual cache control to the previous project with routes such as these:

// GET apicache index (for the curious)
app.get('/api/cache/index', function(req, res, next) {
  res.send(apicache.getIndex())
})

// GET apicache clear key/group (manual cache clearing)
app.get('/api/cache/clear/:key?', function(req, res, next) {
  res.status(200).send(apicache.clear(req.params.key || req.query.key))
})

Debugging/Console Out

Using Node environment variables (plays nicely with the hugely popular debug module)

$ export DEBUG=apicache
$ export DEBUG=apicache,othermoduleThatDebugModuleWillPickUp,etc

By setting internal option

import apicache from 'apicache'

apicache.options({ debug: true })

Client-Side Bypass

When sharing GET routes between admin and public sites, you'll likely want the routes to be cached from your public client, but NOT cached when accessed from the admin client. This is achieved by sending a "x-apicache-bypass": true header along with the request from the admin. The presence of this header flag will bypass the cache, ensuring you aren't looking at stale data.

Contributors

Special thanks to all those that use this library and report issues, but especially to the following active users that have helped add to the core functionality!

  • @Chocobozzz - the savior of getting this to pass all the Node 14/15 tests again... thanks for everyone's patience!!!
  • @killdash9 - restify support, performance/stats system, and too much else at this point to list
  • @svozza - added restify tests, test suite refactor, and fixed header issue with restify. Node v7 + Restify v5 conflict resolution, etag/if-none-match support, etcetc, etc. Triple thanks!!!
  • @andredigenova - Added header blacklist as options, correction to caching checks
  • @peteboere - Node v7 headers update
  • @rutgernation - JSONP support
  • @enricsangra - added x-apicache-force-fetch header
  • @tskillian - custom appendKey path support
  • @agolden - Content-Encoding preservation (for gzip, etc)
  • @davidyang - express 4+ compatibility
  • @nmors - redis support
  • @maytis, @ashwinnaidu - redis expiration
  • @ubergesundheit - Corrected buffer accumulation using res.write with Buffers
  • @danielsogl - Keeping dev deps up to date, Typescript Types
  • @vectart - Added middleware local options support
  • @davebaol - Added string support to defaultDuration option (previously just numeric ms)
  • @Rauttis - Added ioredis support
  • @fernandolguevara - Added opt-out for performance tracking, great emergency fix, thank you!!

Bugfixes, tweaks, documentation, etc.

  • @Amhri, @Webcascade, @conmarap, @cjfurelid, @scambier, @lukechilds, @Red-Lv, @gesposito, @viebel, @RowanMeara, @GoingFast, @luin, @keithws, @daveross, @apascal, @guybrush

Changelog

  • v1.6.0 - added respectCacheControl option flag to force honoring no-cache (thanks @NaridaL!)
  • v1.5.4 - up to Node v15 support, HUGE thanks to @Chocobozzz and all the folks on the PR thread! <3
  • v1.5.3 - multiple fixes: Redis should be connected before using (thanks @guybrush)
  • v1.5.2 - multiple fixes: Buffer deprecation and _headers deprecation, { trackPerformance: false } by default per discussion (sorry semver...)
  • v1.5.1 - adds { trackPerformance } option to enable/disable performance tracking (thanks @fernandolguevara)
  • v1.5.0 - exposes apicache.getPerformance() for per-route cache metrics (@killdash9 continues to deliver)
  • v1.4.0 - cache-control header now auto-decrements in cached responses (thanks again, @killdash9)
  • v1.3.0 - [securityfix] apicache headers no longer embedded in cached responses when NODE_ENV === 'production' (thanks for feedback @satya-jugran, @smddzcy, @adamelliotfields). Updated deps, now requiring Node v6.00+.
  • v1.2.6 - middlewareToggle() now prevents response block on cache hit + falsy toggle (thanks @apascal)
  • v1.2.5 - uses native Node setHeader() rather than express.js header() (thanks @keithws and @daveross)
  • v1.2.4 - force content type to Buffer, using old and new Buffer creation syntax
  • v1.2.3 - add etag to if-none-match 304 support (thanks for the test/issue @svozza)
  • v1.2.2 - bugfix: ioredis.expire params (thanks @GoingFast and @luin)
  • v1.2.1 - Updated deps
  • v1.2.0 - Supports ioredis (thanks @Rauttis)
  • v1.1.1 - bugfixes in expiration timeout clearing and content header preservation under compression (thanks @RowanMeara and @samimakicc).
  • v1.1.0 - added the much-requested feature of a custom appendKey function (previously only took a path to a single request attribute). Now takes (request, response) objects and returns some value to be appended to the cache key.
  • v1.0.0 - stamping v0.11.2 into official production version, will now begin developing on branch v2.x (redesign)
  • v0.11.2 - dev-deps update, courtesy of @danielsogl
  • v0.11.1 - correction to status code caching, and max-age headers are no longer sent when not cached. middlewareToggle now works as intended with example of statusCode checking (checks during shouldCacheResponse cycle)
  • v0.11.0 - Added string support to defaultDuration option, previously just numeric ms - thanks @davebaol
  • v0.10.0 - added ability to blacklist headers (prevents caching) via options.headersBlacklist (thanks @andredigenova)
  • v0.9.1 - added eslint in prep for v1.x branch, minor ES6 to ES5 in master branch tests
  • v0.9.0 - corrected Node v7.7 & v8 conflicts with restify (huge thanks to @svozza for chasing this down and fixing upstream in restify itself). Added coveralls. Added middleware.localOptions support (thanks @vectart). Added ability to overwrite/embed headers (e.g. "cache-control": "no-cache") through options.
  • v0.8.8 - corrected to use node v7+ headers (thanks @peteboere)
  • v0.8.6, v0.8.7 - README update
  • v0.8.5 - dev dependencies update (thanks @danielsogl)
  • v0.8.4 - corrected buffer accumulation, with test support (thanks @ubergesundheit)
  • v0.8.3 - added tests for x-apicache-bypass and x-apicache-force-fetch (legacy) and fixed a bug in the latter (thanks @Red-Lv)
  • v0.8.2 - test suite and mock API refactor (thanks @svozza)
  • v0.8.1 - fixed restify support and added appropriate tests (thanks @svozza)
  • v0.8.0 - modifies response accumulation (thanks @killdash9) to support res.write + res.end accumulation, allowing integration with restify. Adds gzip support (Node v4.3.2+ now required) and tests.
  • v0.7.0 - internally sets cache-control/max-age headers of response object
  • v0.6.0 - removed final dependency (debug) and updated README
  • v0.5.0 - updated internals to use res.end instead of res.send/res.json/res.jsonp, allowing for any response type, adds redis tests
  • v0.4.0 - dropped lodash and memory-cache external dependencies, and bumped node version requirements to 4.0.0+ to allow Object.assign native support