A cache object that deletes the least-recently-used items.
The lru-cache package implements a Least Recently Used (LRU) cache data structure for Node.js applications. It automatically evicts the least recently accessed items when the cache reaches capacity, making it ideal for storing frequently accessed data in memory without unbounded growth. With over 300 million weekly downloads, it's one of the most widely used caching solutions in the Node.js ecosystem.
The package uses a doubly-linked list combined with a Map to achieve O(1) time complexity for get, set, and delete operations. This efficient implementation means you can cache thousands of items with minimal performance overhead. The API is straightforward: configure maximum size, optional TTL (time-to-live), and size calculation functions, then use standard get/set/delete methods.
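The core idea can be sketched with a plain `Map`, which preserves insertion order in JavaScript. This is an illustrative toy only, not the library's actual implementation — lru-cache pairs the Map with its own index-based linked-list bookkeeping, TTL tracking, and size accounting:

```js
// Toy LRU using Map's insertion order: the first key in iteration
// order is always the least recently used. Illustrative only.
class TinyLRU {
  constructor(max) {
    this.max = max;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    // Delete and re-insert to mark this key as most recently used.
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // Evict the least recently used entry (first in iteration order).
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}

const lru = new TinyLRU(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');     // touch 'a', so 'b' becomes least recently used
lru.set('c', 3);  // evicts 'b'
console.log([...lru.map.keys()]); // ['a', 'c']
```

The delete-and-reinsert trick makes both `get` and `set` O(1), which is the same complexity guarantee the real package provides.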
lru-cache is designed for single-process applications where you need fast, in-memory caching. It supports both count-based limits (maximum number of items) and size-based limits (maximum total size with custom calculation). Time-based expiration can be set globally or per-item, with options to serve stale data or refresh TTL on access.
This package is production-ready and stable, used extensively in build tools, web frameworks, and API servers. It's not suitable for distributed systems or multi-process coordination—for those scenarios, you'd need Redis or similar. However, for in-process caching of expensive computations, database query results, or API responses, lru-cache provides an optimal balance of simplicity and performance.
```js
import { LRUCache } from 'lru-cache';

const cache = new LRUCache({
  max: 500,               // maximum number of items
  ttl: 1000 * 60 * 5,     // items expire after 5 minutes
  allowStale: false,      // never return expired items
  updateAgeOnGet: false,  // reading an item does not reset its TTL
  updateAgeOnHas: false,  // has() checks do not reset TTL either
});
```
```js
async function fetchUser(userId) {
  const cached = cache.get(userId);
  // Compare against undefined explicitly so falsy cached values
  // (0, '', false, null) still count as hits.
  if (cached !== undefined) {
    console.log('Cache hit for user:', userId);
    return cached;
  }
  console.log('Cache miss, fetching user:', userId);
  const user = await fetch(`https://api.example.com/users/${userId}`).then((r) => r.json());
  // Per-item TTL overrides the cache-wide default (10 minutes instead of 5).
  cache.set(userId, user, { ttl: 1000 * 60 * 10 });
  return user;
}
```
```js
const sizeCache = new LRUCache({
  max: 100,      // hard cap on item count
  maxSize: 5000, // cap on total calculated size
  sizeCalculation: (value) => {
    return JSON.stringify(value).length;
  },
});

sizeCache.set('large-object', { data: new Array(1000).fill('x') });
console.log('Cache size:', sizeCache.size);
console.log('Cache calculated size:', sizeCache.calculatedSize);
```
```js
// The first call misses and fetches; a second call within the TTL is a hit.
await fetchUser(123);
await fetchUser(123);
```

- API response caching: Cache expensive API calls to third-party services with TTL expiration. For example, cache weather data for 5 minutes to reduce external API hits while keeping data reasonably fresh.
- Database query results: Store frequently accessed database queries in memory. A product catalog query that runs thousands of times per minute can be cached to reduce database load, with automatic eviction of rarely accessed products.
- Computed values: Cache results of expensive calculations like image transformations, markdown-to-HTML conversions, or complex data aggregations. The cache automatically manages memory by evicting old results.
- Session or token storage: Store user sessions or JWT tokens in memory for fast validation, with TTL-based expiration matching token lifetimes. Suitable for single-server deployments or development environments.
- Build tool optimization: Cache compiled assets, transpiled code, or resolved module paths during development. Many bundlers and transpilers use lru-cache internally to speed up incremental builds by avoiding redundant processing.
```sh
npm install lru-cache
pnpm add lru-cache
bun add lru-cache
```