Async generator for pagination
A reusable TypeScript async generator function that handles API pagination automatically, fetching pages sequentially and yielding items until all data is retrieved.
```typescript
interface PaginatedResponse<T> {
  data: T[];
  nextCursor: string | null;
  hasMore: boolean;
}

interface FetchOptions {
  pageSize?: number;
  headers?: Record<string, string>;
  maxPages?: number;
}

async function* fetchAllPages<T>(
  baseUrl: string,
  options: FetchOptions = {}
): AsyncGenerator<T[], void, unknown> {
  const { pageSize = 20, headers = {}, maxPages = Infinity } = options;
  let cursor: string | null = null;
  let pageCount = 0;

  while (pageCount < maxPages) {
    const url = new URL(baseUrl);
    url.searchParams.set("limit", String(pageSize));
    if (cursor) {
      url.searchParams.set("cursor", cursor);
    }

    const response = await fetch(url.toString(), {
      headers: {
        "Content-Type": "application/json",
        ...headers,
      },
    });

    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    const result: PaginatedResponse<T> = await response.json();
    yield result.data;
    pageCount++;

    if (!result.hasMore || !result.nextCursor) {
      break;
    }
    cursor = result.nextCursor;
  }
}

async function* flattenPages<T>(
  baseUrl: string,
  options: FetchOptions = {}
): AsyncGenerator<T, void, unknown> {
  for await (const page of fetchAllPages<T>(baseUrl, options)) {
    for (const item of page) {
      yield item;
    }
  }
}

// Usage examples
interface User {
  id: number;
  name: string;
  email: string;
}

async function processAllUsers(): Promise<void> {
  const apiUrl = "https://api.example.com/users";
  console.log("Processing users page by page:");
  let totalUsers = 0;
  for await (const userPage of fetchAllPages<User>(apiUrl, { pageSize: 50 })) {
    console.log(`Received page with ${userPage.length} users`);
    totalUsers += userPage.length;
    for (const user of userPage) {
      console.log(`  - ${user.name} (${user.email})`);
    }
  }
  console.log(`Total users processed: ${totalUsers}`);
}

async function processUsersIndividually(): Promise<void> {
  const apiUrl = "https://api.example.com/users";
  console.log("Processing users one by one:");
  for await (const user of flattenPages<User>(apiUrl, { maxPages: 5 })) {
    console.log(`Processing: ${user.name}`);
  }
}

async function collectAllUsers(): Promise<User[]> {
  const apiUrl = "https://api.example.com/users";
  const allUsers: User[] = [];
  for await (const page of fetchAllPages<User>(apiUrl)) {
    allUsers.push(...page);
  }
  return allUsers;
}

// Run the examples
processAllUsers().catch(console.error);
```

How It Works
This implementation leverages JavaScript's async generator syntax, which TypeScript supports with full typing, to create a clean, memory-efficient pagination solution. The fetchAllPages function is declared with async function*, which allows it to both await promises and yield values. Each iteration fetches a single page, yields the data array, and then checks whether more pages exist before continuing. This lazy evaluation means pages are only fetched when the consumer requests them through a for-await-of loop.
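To see that laziness in isolation, here is a small self-contained sketch (not part of the pagination code above) in which a toy generator stands in for the network calls; the page numbers and logging are purely illustrative:

```typescript
// Toy stand-in for fetchAllPages: each "page" is produced only on demand.
async function* countingPages(): AsyncGenerator<number[], void, unknown> {
  for (let page = 1; page <= 100; page++) {
    console.log(`Producing page ${page}...`); // would be a network request in fetchAllPages
    yield [page * 10, page * 10 + 1];
  }
}

async function demoLaziness(): Promise<void> {
  for await (const page of countingPages()) {
    console.log("Consumer received:", page);
    if (page[0] >= 30) {
      break; // only pages 1-3 are ever produced; the generator is closed here
    }
  }
}

demoLaziness().catch(console.error);
```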
The generic type parameter <T> makes this function fully reusable across different API endpoints and data types. The PaginatedResponse<T> interface defines a common pagination contract using cursor-based pagination, which is more reliable than offset-based pagination when data changes between requests. The function accepts configuration options for page size, custom headers (useful for authentication), and a maximum page limit to prevent runaway requests.
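As a sketch of that reusability, the same helpers could be pointed at a different resource with a different element type. The Order shape, the orders endpoint, and the bearer token below are illustrative assumptions, not part of the API described above:

```typescript
// Hypothetical reuse for a different endpoint and element type.
interface Order {
  id: string;
  total: number;
}

async function sumOrderTotals(token: string): Promise<number> {
  let sum = 0;
  const options: FetchOptions = {
    pageSize: 100,
    headers: { Authorization: `Bearer ${token}` }, // custom headers for authentication
    maxPages: 50,                                  // cap to prevent runaway requests
  };
  for await (const order of flattenPages<Order>("https://api.example.com/orders", options)) {
    sum += order.total;
  }
  return sum;
}
```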
Error handling is built into the generator—if any fetch fails, the error propagates naturally to the consumer's for-await-of loop where it can be caught with a standard try-catch block. The generator also properly terminates when hasMore is false or when nextCursor is null, ensuring clean resource cleanup.
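A minimal consumer showing that pattern, reusing the User type and the example URL from above, might look like this:

```typescript
// A failed fetch or a non-OK status on any page propagates out of the
// for-await-of loop and is caught by the surrounding try/catch.
async function processUsersSafely(): Promise<void> {
  try {
    for await (const page of fetchAllPages<User>("https://api.example.com/users")) {
      console.log(`Got ${page.length} users`);
    }
  } catch (error) {
    console.error("Pagination aborted:", error);
  }
}
```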
The flattenPages helper generator demonstrates composition—it wraps fetchAllPages to yield individual items instead of arrays, which is convenient when you don't need batch processing. Both generators maintain backpressure naturally: the next page isn't fetched until the consumer finishes processing the current one, preventing memory issues with large datasets.
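One practical consequence is cheap early termination: returning (or breaking) out of the loop stops iteration, so no further pages are requested. A small sketch, again against the hypothetical users endpoint:

```typescript
// Because iteration drives fetching, returning out of the loop means no
// further pages are ever requested.
async function findUserByEmail(email: string): Promise<User | undefined> {
  for await (const user of flattenPages<User>("https://api.example.com/users")) {
    if (user.email === email) {
      return user; // exits the loop and closes both underlying generators
    }
  }
  return undefined;
}
```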
Use this pattern when dealing with APIs that return paginated data and you need to process all results without loading everything into memory at once. Avoid it when you need random access to pages or when the API uses a different pagination scheme (like offset/limit without cursors)—in those cases, you'll need to adapt the URL construction logic accordingly.
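For reference, an offset/limit adaptation might look roughly like the sketch below. The response shape (an items array plus a total count) is an assumption; adjust the termination check to whatever your API actually returns:

```typescript
// Sketch of an offset/limit variant of the same generator pattern.
interface OffsetResponse<T> {
  items: T[];
  total: number;
}

async function* fetchAllOffsetPages<T>(
  baseUrl: string,
  pageSize = 20
): AsyncGenerator<T[], void, unknown> {
  let offset = 0;
  while (true) {
    const url = new URL(baseUrl);
    url.searchParams.set("limit", String(pageSize));
    url.searchParams.set("offset", String(offset));

    const response = await fetch(url.toString());
    if (!response.ok) {
      throw new Error(`HTTP error! status: ${response.status}`);
    }

    const result: OffsetResponse<T> = await response.json();
    yield result.items;

    offset += result.items.length;
    if (result.items.length === 0 || offset >= result.total) {
      break; // no more rows to fetch
    }
  }
}
```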