Simple “Least Recently Used” (LRU) cache
quick-lru is a minimal LRU (Least Recently Used) cache implementation that automatically evicts the least recently accessed items when a size limit is reached. Built on JavaScript's native Map, it provides O(1) lookups and updates while maintaining insertion order, making it significantly faster than alternatives for straightforward caching scenarios.
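The core trick can be sketched with a plain Map: because a Map iterates in insertion order, deleting and re-inserting a key on access marks it as most recently used, and the first key in iteration order is always the eviction victim. This is a simplified illustration of the idea, not quick-lru's actual implementation, whose internals are more optimized.

```js
// Minimal Map-based LRU sketch (illustrative only):
function createLRU(maxSize) {
  const map = new Map();
  return {
    get(key) {
      if (!map.has(key)) return undefined;
      const value = map.get(key);
      map.delete(key);
      map.set(key, value); // re-insert to mark as most recently used
      return value;
    },
    set(key, value) {
      map.delete(key);
      map.set(key, value);
      if (map.size > maxSize) {
        // First key in iteration order is the least recently used
        map.delete(map.keys().next().value);
      }
    },
    get size() { return map.size; }
  };
}

const lru = createLRU(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');              // touch 'a', so 'b' is now least recent
lru.set('c', 3);           // exceeds maxSize, evicts 'b'
console.log(lru.get('b')); // undefined
console.log(lru.get('a')); // 1
```

Every operation here is a constant number of Map operations, which is where the O(1) lookup and update cost comes from.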
The package solves a common problem in JavaScript applications: preventing unbounded memory growth when caching computed values, API responses, or expensive function results. Unlike feature-rich alternatives like lru-cache, quick-lru deliberately omits time-to-live (TTL), byte-based limits, and event hooks to maximize performance and minimize bundle size—clocking in at roughly 1KB versus 20KB+ for full-featured options.
With over 30 million weekly downloads, quick-lru is widely adopted in performance-critical libraries and build tools where caching overhead must be negligible. Maintained by Sindre Sorhus, it follows modern JavaScript standards with full TypeScript support and works in Node.js 18+ and modern browsers. The API surface is intentionally small: set, get, has, delete, clear, and a size property—exactly what most developers need without configuration complexity.
Developers choose quick-lru when they need predictable memory usage with fixed entry limits and don't require expiration policies or cache statistics. Its Map-based foundation supports any key type (objects, symbols, etc.), unlike older object-based caches limited to string keys, making it suitable for modern codebases that leverage ES6+ features.
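The practical difference between object-based and Map-based key handling is easy to demonstrate. In this minimal illustration, a plain object coerces every key to a string, so distinct object keys collide, while a Map (quick-lru's foundation) keeps them distinct:

```js
const a = { id: 1 };
const b = { id: 2 };

// Plain object: both keys stringify to "[object Object]" and collide
const objCache = {};
objCache[a] = 'first';
objCache[b] = 'second';
console.log(objCache[a]); // 'second' — the first entry was clobbered

// Map: object keys are compared by identity and stay distinct
const mapCache = new Map();
mapCache.set(a, 'first');
mapCache.set(b, 'second');
console.log(mapCache.get(a)); // 'first'
```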
```js
import QuickLRU from 'quick-lru';

// Cache expensive database queries with max 500 entries
const queryCache = new QuickLRU({ maxSize: 500 });

async function fetchUser(userId) {
  if (queryCache.has(userId)) {
    return queryCache.get(userId);
  }

  const user = await db.query('SELECT * FROM users WHERE id = ?', [userId]);
  queryCache.set(userId, user);
  return user;
}

// Memoize function results
const fibCache = new QuickLRU({ maxSize: 100 });

function fibonacci(n) {
  if (n <= 1) return n;
  if (fibCache.has(n)) return fibCache.get(n);
  const result = fibonacci(n - 1) + fibonacci(n - 2);
  fibCache.set(n, result);
  return result;
}

console.log(fibonacci(50)); // 12586269025 — instant after the first calculation
console.log(fibCache.size); // 49 (entries for n = 2..50, well under maxSize)

// Manual cache management
queryCache.delete('user:123');
queryCache.clear();
console.log(queryCache.size); // 0
```

- Function memoization: Cache expensive computation results (e.g., complex regex parsing, recursive calculations) where you want to limit memory to the N most recent calls rather than growing indefinitely.
- API response caching: Store HTTP responses or database query results in server-side applications, keeping only the most frequently accessed data in memory while automatically discarding stale entries.
- Build tool optimization: Cache compiled templates, transformed AST nodes, or resolved module paths in bundlers and transpilers where nanoseconds matter and memory budgets are strict.
- Token/session storage: Maintain recent authentication tokens or user sessions in memory with automatic eviction of inactive users, avoiding manual cleanup logic.
- Deduplication buffers: Track recently processed IDs or hashes to prevent duplicate work in stream processing or event handlers, with automatic forgetting of old entries.
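The deduplication pattern from the last item can be sketched with a self-contained Map-based buffer. `SeenBuffer` is a hypothetical helper used here so the example runs standalone; in practice quick-lru's `has()` and `set()` with a `maxSize` fill the same role.

```js
// Hypothetical bounded "seen" buffer for deduplication (illustrative only)
class SeenBuffer {
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.map = new Map();
  }

  // Returns true if id was seen recently; records it either way
  seen(id) {
    if (this.map.has(id)) {
      this.map.delete(id);
      this.map.set(id, true); // refresh recency
      return true;
    }
    this.map.set(id, true);
    if (this.map.size > this.maxSize) {
      this.map.delete(this.map.keys().next().value); // forget oldest entry
    }
    return false;
  }
}

const buf = new SeenBuffer(3);
const events = ['a', 'b', 'a', 'c', 'd', 'b'];
const processed = events.filter(id => !buf.seen(id));
// The second 'a' is skipped as a duplicate; the second 'b' is processed
// again because 'b' was evicted once the buffer exceeded 3 entries.
console.log(processed); // ['a', 'b', 'c', 'd', 'b']
```

Bounding the buffer means old IDs are eventually forgotten, which is usually acceptable for stream deduplication and avoids unbounded memory growth.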
```sh
npm install quick-lru
# or
pnpm add quick-lru
# or
bun add quick-lru
```