memory-cache vs axios-cache-adapter vs cache-manager vs cacheable-request vs lru-cache vs node-cache

Caching Strategies: In-Memory Stores vs HTTP Adapters

These libraries provide caching solutions for Node.js and frontend build environments, ranging from simple key-value stores to HTTP request interceptors. lru-cache, node-cache, and memory-cache offer direct in-memory storage with varying features for expiration and limits. cache-manager acts as an abstraction layer to swap storage backends without changing code. cacheable-request and axios-cache-adapter focus on caching HTTP responses transparently, reducing network load for API calls.

[Chart: npm package weekly downloads trend, 3 years]

[Chart: GitHub stars ranking]
Stat Detail

Package             | Downloads | Stars | Size    | Issues | Publish      | License
memory-cache        | 742,683   | 1,600 | -       | 32     | 9 years ago  | BSD-2-Clause
axios-cache-adapter | 48,178    | 722   | -       | 59     | 5 years ago  | MIT
cache-manager       | 0         | 1,959 | 52.2 kB | 0      | 2 months ago | MIT
cacheable-request   | 0         | 1,959 | 79.5 kB | 0      | 2 months ago | MIT
lru-cache           | 0         | 5,837 | 841 kB  | 4      | 21 days ago  | BlueOak-1.0.0
node-cache          | 0         | 2,375 | -       | 77     | 6 years ago  | MIT

Caching Strategies: In-Memory Stores vs HTTP Adapters

When building server-side rendered apps or Node.js backends, caching reduces load times and server stress. The packages listed here fall into three groups: raw storage engines (lru-cache, node-cache, memory-cache), abstraction layers (cache-manager), and HTTP-specific tools (cacheable-request, axios-cache-adapter). Let's break down how they handle data, expiration, and integration.

🗄️ Storing and Retrieving Data

The core job of any cache is to save and fetch values. The API style varies from synchronous to asynchronous, and from simple objects to specialized classes.

lru-cache uses a class-based approach with strict memory limits.

import { LRUCache } from 'lru-cache';
const cache = new LRUCache({ max: 500 });
cache.set('key', 'value');
const val = cache.get('key');
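To make "least recently used" concrete, here is a dependency-free sketch of the eviction policy lru-cache implements (the real library adds TTL, size accounting, and far better performance; TinyLRU is an invented name for illustration). A JavaScript Map iterates in insertion order, so re-inserting on every access keeps the oldest entry first:

```javascript
// Sketch of the LRU policy lru-cache implements (illustration only).
class TinyLRU {
  constructor(max) {
    this.max = max;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);      // move the key to the "most recent" end
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // evict the least recently used entry (first in iteration order)
      this.map.delete(this.map.keys().next().value);
    }
  }
}
```

With max: 2, inserting a third key evicts whichever of the first two was touched least recently.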

node-cache offers a simple synchronous API with event support.

const NodeCache = require('node-cache');
const cache = new NodeCache();
cache.set('key', 'value');
const val = cache.get('key');

memory-cache provides a very basic static interface.

const cache = require('memory-cache');
cache.put('key', 'value');
const val = cache.get('key');

cache-manager uses an asynchronous promise-based API.

import { caching } from 'cache-manager';
const cache = await caching('memory', { ttl: 5000 });
await cache.set('key', 'value');
const val = await cache.get('key');

cacheable-request wraps a request function such as http.request.

const http = require('http');
const CacheableRequest = require('cacheable-request');
const cr = new CacheableRequest(http.request);
const cacheReq = cr('http://example.com', res => { /* cached or fresh */ });
cacheReq.on('request', req => req.end());

axios-cache-adapter integrates into the axios adapter chain.

import axios from 'axios';
import { setupCache } from 'axios-cache-adapter';

const cache = setupCache({ maxAge: 15 * 60 * 1000 });
const api = axios.create({ adapter: cache.adapter });

⏳ Handling Expiration and Limits

Controlling how long data lives is critical to prevent stale data or memory leaks. Some packages use Time-To-Live (TTL), while others use count limits.
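All of the TTL schemes below reduce to the same idea: record an expiry timestamp and evict either lazily on read or actively via a timer. A minimal lazy-eviction sketch (TtlCache and the injectable clock are illustrative, not any package's API):

```javascript
// Minimal TTL cache sketch (illustration only).
// `now` is injectable so expiry can be exercised without waiting.
class TtlCache {
  constructor(now = Date.now) {
    this.now = now;
    this.store = new Map(); // key -> { value, expiresAt }
  }
  set(key, value, ttlMs) {
    const expiresAt = ttlMs == null ? Infinity : this.now() + ttlMs;
    this.store.set(key, { value, expiresAt });
  }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (this.now() >= entry.expiresAt) { // lazily evict expired entries
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }
}
```

memory-cache instead evicts actively with setTimeout, and node-cache runs a periodic check; both trade timer overhead for prompt deletion.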

lru-cache focuses on item count and optional TTL.

// Limits to 100 items, auto-deletes old ones
const cache = new LRUCache({ max: 100, ttl: 1000 * 60 });

node-cache allows global or per-key TTL.

// Global 60s TTL
const cache = new NodeCache({ stdTTL: 60 });
// Per-key override
cache.set('key', 'value', 120);

memory-cache accepts an optional TTL on each put operation.

// TTL in milliseconds; omit it to store the value forever
cache.put('key', 'value', 5000);

cache-manager defines TTL during setup or call.

// Setup TTL
const cache = await caching('memory', { ttl: 5000 });
// Or per set
await cache.set('key', 'value', 3000);

cacheable-request respects HTTP headers primarily.

// Honors Cache-Control headers sent by the server
const cacheReq = cr('http://example.com', res => { /* ... */ });
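The freshness decision itself is driven by response headers. Here is a simplified sketch of the max-age check (cacheable-request delegates this to http-cache-semantics, which also covers ETag revalidation, Vary, and shared-cache directives; isFresh is a made-up helper):

```javascript
// Simplified header-driven freshness check (illustration only).
// Returns true while the stored response is within its max-age window.
function isFresh(cacheControl, storedAtMs, nowMs) {
  const match = /max-age=(\d+)/.exec(cacheControl || '');
  if (!match) return false;              // no max-age: treat as stale
  const maxAgeMs = Number(match[1]) * 1000;
  return nowMs - storedAtMs < maxAgeMs;  // still inside the freshness window
}
```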

axios-cache-adapter sets maxAge in configuration.

// Cache for 15 minutes
const cache = setupCache({ maxAge: 15 * 60 * 1000 });

🌐 HTTP Integration vs Raw Storage

Some tools cache any JavaScript value, while others specialize in network responses. Choosing the wrong layer leads to duplicated invalidation logic or stale responses.

lru-cache, node-cache, and memory-cache store any data type. You must manually manage when to invalidate data after an API call.

// Manual HTTP caching logic
const data = await fetchAPI();
cache.set('api-result', data);

cache-manager stores any data but often pairs with HTTP logic.

// Wrap function to cache result
const user = await cache.wrap('user-1', fetchUser);
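Under the hood, wrap is a get-or-compute helper: return the cached value on a hit, otherwise run the function and store its result. A dependency-free sketch (the Map stands in for a cache-manager store; the real wrap also applies TTL and, in recent versions, coalesces concurrent calls for the same key):

```javascript
// Conceptual sketch of the wrap() pattern (illustration only).
async function wrap(store, key, fn) {
  if (store.has(key)) return store.get(key); // cache hit: skip fn entirely
  const value = await fn();                  // cache miss: compute once
  store.set(key, value);
  return value;
}
```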

cacheable-request and axios-cache-adapter handle the network layer directly. They skip the request if a valid cache exists.

// cacheable-request: automatic, based on URL and response headers
const cacheReq = cr('http://api.com/data', res => { /* ... */ });

// axios-cache-adapter: Automatic based on config
const response = await api.get('/data');

⚠️ Maintenance and Risks

Choosing a library involves checking if it is still safe to use. Some packages are legacy or have newer successors.

memory-cache is simple but lacks recent updates. It does not support events or advanced TTL features found in node-cache. Use it only for quick scripts.

// Limited feature set
const val = cache.get('key');

axios-cache-adapter has seen less activity compared to axios-cache-interceptor. For new projects, evaluate if the interceptor fits better.

// Legacy adapter pattern
axios.create({ adapter: cache.adapter });

lru-cache, node-cache, and cache-manager are actively maintained. They receive security patches and feature updates regularly.

// Safe for production
const cache = new LRUCache({ max: 100 });

cacheable-request is stable and follows HTTP standards closely.

// Standard compliant
const cr = new CacheableRequest(http.request);

📊 Summary Table

Package             | Type        | Async | TTL Support  | Max Items | HTTP Aware
lru-cache           | Storage     | No    | Yes          | Yes       | No
node-cache          | Storage     | No    | Yes          | No        | No
memory-cache        | Storage     | No    | Yes          | No        | No
cache-manager       | Abstraction | Yes   | Yes          | Depends   | No
cacheable-request   | HTTP        | Yes   | Header-based | No        | Yes
axios-cache-adapter | HTTP        | Yes   | Config-based | No        | Yes

💡 Final Recommendation

For general server-side caching where you control the data, node-cache offers the best balance of features and stability. If you need strict memory limits to prevent crashes, lru-cache is the industry standard.

For HTTP requests, cacheable-request is ideal for code built on the native http module or got, while axios-cache-adapter serves axios users (check whether axios-cache-interceptor suits your needs better). Use cache-manager if you plan to switch to Redis later. Avoid memory-cache for critical production systems due to its limited features and stalled maintenance.

How to Choose: memory-cache vs axios-cache-adapter vs cache-manager vs cacheable-request vs lru-cache vs node-cache

  • memory-cache:

    Choose this only for simple prototypes or scripts where you need basic key-value storage with minimal setup. For production systems, prefer node-cache for better event support and maintenance.

  • axios-cache-adapter:

    Choose this if you are already using axios and need a quick way to cache HTTP responses without changing your request logic. It bridges axios with storage engines like cache-manager. Note that newer projects might prefer axios-cache-interceptor for active maintenance.

  • cache-manager:

    Choose this if you need a unified interface that allows switching between memory, Redis, or file stores later. It is ideal for applications that might scale from single-server memory to distributed caching without rewriting core logic.

  • cacheable-request:

    Choose this if you are using native http or got modules and want transparent HTTP caching based on standard headers. It works well for low-level network tools where you need strict adherence to HTTP caching rules.

  • lru-cache:

    Choose this if you need a fast, dependency-free Least Recently Used cache with precise control over memory limits. It is the standard choice for internal data structures where you must prevent memory leaks.

  • node-cache:

    Choose this if you need a robust in-memory cache with event emitters, flexible TTL per key, and a stable API. It is the go-to for general-purpose server-side caching without external dependencies.

README for memory-cache

memory-cache

A simple in-memory cache for node.js

Installation

npm install memory-cache --save

Usage

var cache = require('memory-cache');

// now just use the cache

cache.put('foo', 'bar');
console.log(cache.get('foo'));

// that wasn't too interesting, here's the good part

cache.put('houdini', 'disappear', 100, function(key, value) {
    console.log(key + ' did ' + value);
}); // Time in ms

console.log('Houdini will now ' + cache.get('houdini'));

setTimeout(function() {
    console.log('Houdini is ' + cache.get('houdini'));
}, 200);


// create new cache instance
var newCache = new cache.Cache();

newCache.put('foo', 'newbaz');

setTimeout(function() {
  console.log('foo in old cache is ' + cache.get('foo'));
  console.log('foo in new cache is ' + newCache.get('foo'));
}, 200);

which should print

bar
Houdini will now disappear
houdini did disappear
Houdini is null
foo in old cache is bar
foo in new cache is newbaz

API

put = function(key, value, time, timeoutCallback)

  • Simply stores a value
  • If time isn't passed in, it is stored forever
  • Will actually remove the value in the specified time in ms (via setTimeout)
  • timeoutCallback is optional function fired after entry has expired with key and value passed (function(key, value) {})
  • Returns the cached value

get = function(key)

  • Retrieves a value for a given key
  • If value isn't cached, returns null

del = function(key)

  • Deletes a key, returns a boolean specifying whether or not the key was deleted

clear = function()

  • Deletes all keys

size = function()

  • Returns the current number of entries in the cache

memsize = function()

  • Returns the number of entries taking up space in the cache
  • Will usually == size() unless a setTimeout removal went wrong

debug = function(bool)

  • Turns on or off debugging

hits = function()

  • Returns the number of cache hits (only monitored in debug mode)

misses = function()

  • Returns the number of cache misses (only monitored in debug mode)

keys = function()

  • Returns all the cache keys

exportJson = function()

  • Returns a JSON string representing all the cache data
  • Any timeoutCallbacks will be ignored

importJson = function(json: string, options: { skipDuplicates: boolean })

  • Merges all the data from a previous call to export into the cache
  • Any existing entries before an import will remain in the cache
  • Any duplicate keys will be overwritten, unless skipDuplicates is true
  • Any entries that would have expired since being exported will expire upon being imported (but their callbacks will not be invoked)
  • Available options:
    • skipDuplicates: If true, any duplicate keys will be ignored when importing them. Defaults to false.
  • Returns the new size of the cache

Cache = function()

  • Cache constructor
  • note that require('memory-cache') returns the default instance of Cache
  • while require('memory-cache').Cache is the actual class

Note on Patches/Pull Requests

  • Fork the project.
  • Make your feature addition or bug fix.
  • Send me a pull request.