Map over promises concurrently
p-map is a utility package for running a promise-returning or async function over every item in an iterable, concurrently. Unlike the usual Promise.all(items.map(fn)) pattern, which kicks off every operation at once, p-map gives you fine-grained control over concurrency, letting you limit how many operations run simultaneously. This prevents resource exhaustion when dealing with large datasets or rate-limited APIs.
The package addresses a common problem in JavaScript: native array methods like map() don't handle async operations well, and Promise.all() lacks concurrency control. When you need to process thousands of items through async operations—like API calls, file operations, or database queries—running them all at once can overwhelm your system or hit rate limits. p-map solves this by managing a pool of concurrent operations.
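As a minimal sketch of the difference, assuming a placeholder endpoint: the commented-out Promise.all line would start all 1,000 requests immediately, while p-map keeps at most ten in flight.
import pMap from 'p-map';

const ids = Array.from({ length: 1000 }, (_, i) => i + 1);
const fetchItem = id => fetch(`https://example.com/items/${id}`).then(res => res.json());

// const items = await Promise.all(ids.map(fetchItem)); // all 1,000 requests start at once
const items = await pMap(ids, fetchItem, { concurrency: 10 }); // at most 10 in flight at any moment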
With over 61 million weekly downloads, p-map has become a fundamental building block in the Node.js ecosystem. It's used extensively in CLI tools, build systems, data processing pipelines, and web scrapers. The package is lightweight, has no dependencies, and provides a simple Promise-based API that integrates seamlessly with modern async/await syntax.
p-map is maintained by Sindre Sorhus, a prolific open-source contributor known for high-quality, well-maintained packages. The library is written in plain JavaScript, ships its own TypeScript type definitions, and is published as a pure ES module. Its reliability and stability make it a safe choice for production applications requiring controlled concurrent promise execution.
import pMap from 'p-map';
// Fetch user data from an API with concurrency control
const userIds = Array.from({ length: 100 }, (_, i) => i + 1);
async function fetchUser(userId) {
  const response = await fetch(`https://jsonplaceholder.typicode.com/users/${userId}`);
  if (!response.ok) throw new Error(`Failed to fetch user ${userId}`);
  return response.json();
}
// Process 100 users but only 5 concurrent requests at a time
const users = await pMap(userIds, fetchUser, { concurrency: 5 });
console.log(`Fetched ${users.length} users`);
// With error handling and custom mapper
const results = await pMap(
  ['file1.txt', 'file2.txt', 'file3.txt'],
  async (filename, index) => {
    console.log(`Processing ${filename} (${index + 1}/3)`);
    // Simulate file processing
    await new Promise(resolve => setTimeout(resolve, 1000));
    return { filename, processed: true, timestamp: Date.now() };
  },
  {
    concurrency: 2,
    stopOnError: false // Don't reject on the first failure; wait for all items, then throw an AggregateError if any failed
  }
);
console.log('Results:', results);

API Rate Limiting: When fetching data from multiple endpoints with rate limits, p-map lets you control concurrency to stay within API quotas. Note that concurrency caps how many requests are in flight at once, not how many are issued per second; for a per-second budget, pair the concurrency limit with a short pause in the mapper, as in the sketch below.
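A rough sketch of that approach, with a placeholder endpoint: ten slots run concurrently, and each mapper call pauses for a second after its request completes, so the overall rate stays at roughly ten requests per second.
import pMap from 'p-map';
import {setTimeout as delay} from 'node:timers/promises';

const endpoints = Array.from({ length: 50 }, (_, i) => `https://api.example.com/items/${i}`);

const payloads = await pMap(endpoints, async url => {
  const response = await fetch(url);
  if (!response.ok) throw new Error(`Request failed: ${url}`);
  const data = await response.json();
  await delay(1000); // each of the 10 slots issues at most ~1 request per second
  return data;
}, { concurrency: 10 }); // never more than 10 requests in flight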
File Processing: Processing large numbers of files (image resizing, video transcoding, PDF generation) requires limiting concurrent operations to prevent memory exhaustion. p-map ensures only N files are processed simultaneously while queuing the rest.
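A sketch of bounded file work, assuming a placeholder ./uploads directory and using a simple hash so the example stays self-contained; only four files are read and processed at any one time while the rest wait in the queue.
import pMap from 'p-map';
import {readdir, readFile} from 'node:fs/promises';
import {createHash} from 'node:crypto';
import path from 'node:path';

const dir = './uploads'; // placeholder directory
const filenames = await readdir(dir);

// At most 4 files are held in memory and hashed at the same time.
const hashes = await pMap(filenames, async filename => {
  const contents = await readFile(path.join(dir, filename));
  return { filename, sha256: createHash('sha256').update(contents).digest('hex') };
}, { concurrency: 4 });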
Database Operations: When performing bulk database operations like batch inserts, updates, or parallel queries, p-map prevents connection pool exhaustion by controlling how many database operations run concurrently.
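A sketch using node-postgres, with a hypothetical users table and generated placeholder records; keeping concurrency at or below the pool's max means inserts never queue behind an exhausted connection pool.
import pMap from 'p-map';
import pg from 'pg';

const pool = new pg.Pool({ max: 10 }); // connection pool of 10
const users = Array.from({ length: 500 }, (_, i) => ({ name: `user-${i}` })); // placeholder records

// Never run more concurrent inserts than there are pooled connections.
await pMap(users, user =>
  pool.query('INSERT INTO users (name) VALUES ($1)', [user.name]),
  { concurrency: 10 }
);

await pool.end();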
Web Scraping: Crawling multiple pages requires respecting server resources and avoiding IP bans. p-map allows controlled parallel requests while maintaining politeness by limiting concurrent connections to any single domain.
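A sketch of a polite crawl over placeholder URLs: two connections at a time, with stopOnError: false so one broken page doesn't abort the rest (all failures are reported together in an AggregateError once the crawl finishes).
import pMap from 'p-map';

const pages = [
  'https://example.com/',
  'https://example.com/about',
  'https://example.com/blog',
];

const titles = await pMap(pages, async url => {
  const html = await (await fetch(url)).text();
  return { url, title: /<title>(.*?)<\/title>/is.exec(html)?.[1] ?? '(no title)' };
}, { concurrency: 2, stopOnError: false });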
Cloud Resource Management: When interacting with cloud services (AWS S3, Azure Blob Storage), p-map helps manage concurrent uploads/downloads efficiently, optimizing bandwidth usage while respecting service quotas and avoiding throttling.
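A sketch using the AWS SDK v3 S3 client, with a placeholder bucket and source directory; eight uploads run at a time while the remaining files wait their turn.
import pMap from 'p-map';
import {readdir, readFile} from 'node:fs/promises';
import path from 'node:path';
import {S3Client, PutObjectCommand} from '@aws-sdk/client-s3';

const s3 = new S3Client({});
const dir = './exports'; // placeholder source directory
const files = await readdir(dir);

// Upload at most 8 objects concurrently to balance bandwidth against S3 throttling.
await pMap(files, async filename =>
  s3.send(new PutObjectCommand({
    Bucket: 'my-example-bucket', // placeholder bucket name
    Key: filename,
    Body: await readFile(path.join(dir, filename)),
  })),
  { concurrency: 8 }
);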
npm install p-map
pnpm add p-map
bun add p-map