The Node.js Event Loop — A Practical Guide to Avoiding Starvation and Jank
- Author: Sanjeev Sharma (@webcoderspeed1)
Introduction
The event loop is Node.js's engine. Understanding its phases, priorities, and blocking mechanics separates developers who ship responsive systems from those who ship jank. This post maps the event loop's internals and shows how to detect and fix starvation in production.
- The 6 libuv Phases and Their Purpose
- setTimeout vs setImmediate Ordering
- process.nextTick() Priority and Starvation Risk
- Promise Microtask Queue
- Blocking the Event Loop with Sync Code
- --prof Flag and tick-processor for CPU Profiling
- queueMicrotask() Use Cases
- Checklist
- Conclusion
The 6 libuv Phases and Their Purpose
Each event loop iteration proceeds through 6 phases in order. A callback that blocks in one phase delays every later phase.
// Phase 1: Timers
// Execute callbacks registered with setTimeout/setInterval
setTimeout(() => {
console.log('1. Timers phase');
}, 0);
// Phase 2: Pending callbacks
// Execute I/O callbacks deferred from previous iteration
import fs from 'fs';
fs.readFile('file.txt', () => {
console.log('2. Pending callbacks phase');
});
// Phase 3: Idle/Prepare
// Internal use (rarely relevant)
// Phase 4: Poll
// Retrieve new I/O events (network, disk) and run their callbacks
// With nothing else pending, the loop blocks here, bounded by the nearest timer
// Phase 5: Check
// Execute setImmediate callbacks
setImmediate(() => {
console.log('5. Check phase');
});
// Phase 6: Close callbacks
// Execute close callbacks (stream.destroy, socket.close)
import net from 'net';
const server = net.createServer();
server.on('close', () => {
console.log('6. Close callbacks phase');
});
// Execution order in one iteration (coarse view — since Node 11 the
// microtask queues actually drain after every individual callback):
// 1. Microtask queues (process.nextTick first, then Promises)
// 2. Timers
// 3. Microtask queue
// 4. Pending callbacks
// 5. Microtask queue
// 6. Idle/Prepare
// 7. Poll (may block here)
// 8. Microtask queue
// 9. Check (setImmediate)
// 10. Microtask queue
// 11. Close callbacks
// 12. Microtask queue
setTimeout vs setImmediate Ordering
The ordering is counter-intuitive, and from the main module it is not even deterministic.
// Executed in the check phase
setImmediate(() => {
console.log('Immediate 1');
});
// Executed in the timers phase (setTimeout(0) is clamped to 1ms)
setTimeout(() => {
console.log('Timeout 1');
}, 0);
// Order: usually Timeout 1 → Immediate 1, but NOT guaranteed from the
// main module — it depends on whether the 1ms timer threshold has
// elapsed by the time the loop starts its first timers phase
// BUT if both are in same phase:
setTimeout(() => {
setImmediate(() => {
console.log('Immediate from timeout');
});
setTimeout(() => {
console.log('Timeout from timeout');
}, 0);
}, 0);
// Now inside timers phase:
// - setTimeout callback executes
// - It schedules setImmediate (for check phase)
// - It schedules setTimeout (for next timers phase)
// Event loop moves to check phase → Immediate from timeout prints
// Next iteration, timers phase → Timeout from timeout prints
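One useful corollary (sketched here with fs.stat standing in for any I/O operation): inside an I/O callback the loop is already in the poll phase, so the check phase always comes before the next timers phase, and the ordering becomes deterministic:

```typescript
import fs from 'fs';

const order: string[] = [];

// I/O callbacks run in the poll phase; from there, check (setImmediate)
// always precedes the next timers phase.
fs.stat('.', () => {
  setTimeout(() => {
    order.push('timeout');
    console.log(order.join(' then ')); // immediate then timeout
  }, 0);
  setImmediate(() => order.push('immediate'));
});
```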
// Practical example: sequential operations
function readFilesSequentially(files: string[]): void {
let index = 0;
function readNext(): void {
if (index >= files.length) return;
const file = files[index++];
console.log(`Reading ${file}...`);
setTimeout(() => {
// Simulate file read
console.log(`Done with ${file}`);
readNext(); // Schedule next read for timers phase
}, 0);
}
readNext();
}
// Using setImmediate for check phase scheduling
function processQueue(queue: any[]): void {
if (queue.length === 0) return;
const item = queue.shift();
console.log('Processing:', item);
setImmediate(() => {
processQueue(queue);
});
}
processQueue(['a', 'b', 'c']);
process.nextTick() Priority and Starvation Risk
process.nextTick() callbacks run as soon as the current operation completes, before Promise microtasks and before the event loop continues to any phase. Use it carefully: it can starve the event loop.
// process.nextTick runs BEFORE Promise microtasks and timers
Promise.resolve().then(() => {
console.log('Promise 1');
});
process.nextTick(() => {
console.log('nextTick 1');
});
setTimeout(() => {
console.log('Timeout 1');
}, 0);
// Order:
// nextTick 1
// Promise 1
// Timeout 1
// DANGEROUS: process.nextTick recursion starves event loop
function dangerousRecursion(): void {
const queue: any[] = Array(1000).fill(0);
function processQueue(): void {
if (queue.length === 0) return;
queue.shift();
process.nextTick(processQueue); // Chains 1000 nextTicks back-to-back
}
processQueue();
// The timer cannot fire until the whole nextTick chain drains;
// with an unbounded queue it would never print
setTimeout(() => {
console.log('This might never print');
}, 0);
}
// SAFE: use setImmediate instead
function safeRecursion(): void {
const queue: any[] = Array(1000).fill(0);
function processQueue(): void {
if (queue.length === 0) return;
queue.shift();
setImmediate(processQueue); // Releases event loop control
}
processQueue();
// This reliably runs
setTimeout(() => {
console.log('This always prints');
}, 0);
}
// Real example: batching with nextTick
class BatchProcessor {
private queue: any[] = [];
private scheduled = false;
add(item: any): void {
this.queue.push(item);
// Schedule exactly once per event loop iteration
if (!this.scheduled) {
this.scheduled = true;
process.nextTick(() => {
this.flush();
this.scheduled = false;
});
}
}
private flush(): void {
const batch = this.queue.splice(0);
console.log('Processing batch:', batch);
}
}
const processor = new BatchProcessor();
processor.add(1);
processor.add(2);
processor.add(3);
// All three batched together in nextTick
Promise Microtask Queue
The microtask queue (Promises, queueMicrotask) drains after each callback completes, so every phase is followed by a microtask checkpoint.
// Microtasks run after every phase
setTimeout(() => {
console.log('Timers phase');
Promise.resolve().then(() => {
console.log('Microtask after timers');
});
console.log('Still in timers phase');
}, 0);
setImmediate(() => {
console.log('Check phase');
Promise.resolve().then(() => {
console.log('Microtask after check');
});
console.log('Still in check phase');
});
// Order:
// Timers phase
// Still in timers phase
// Microtask after timers
// Check phase
// Still in check phase
// Microtask after check
// Starving the loop with microtasks: each await resolves in the microtask
// queue, which drains completely before the loop moves on, so timers and
// I/O wait for all 1,000,000 iterations
async function blockMicrotasks(): Promise<void> {
console.log('Start');
for (let i = 0; i < 1000000; i++) {
await Promise.resolve(); // Each adds to microtask queue
}
console.log('Done'); // Timers and I/O were starved until here
}
// Better: batch the microtasks and yield to the event loop between batches
async function batchMicrotasks(): Promise<void> {
console.log('Start');
for (let i = 0; i < 1000; i++) {
const promises: Promise<void>[] = [];
for (let j = 0; j < 1000; j++) {
promises.push(Promise.resolve());
}
await Promise.all(promises); // One continuation per 1000 promises
// Yield so timers and I/O can run between batches
await new Promise<void>((resolve) => setImmediate(resolve));
}
console.log('Done');
}
Blocking the Event Loop with Sync Code
CPU-bound synchronous code blocks the entire event loop. Learn to detect it and design it out.
// BLOCKING: synchronous crypto
import crypto from 'crypto';
function insecurePassword(password: string): boolean {
// This blocks for ~100ms on each call
const hash = crypto.pbkdf2Sync(password, 'salt', 100000, 64, 'sha512');
return hash.toString('hex') === 'expected';
}
// Every login request blocks all others
import express from 'express';
const app = express();
app.post('/login', (req, res) => {
const authenticated = insecurePassword(req.body.password);
res.json({ authenticated });
// During verification, no other request can be processed
});
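The effect is easy to reproduce without a server; a synchronous busy-wait (the durations below are arbitrary) delays every pending timer:

```typescript
const scheduled = Date.now();
let timerDelay = 0;

setTimeout(() => {
  timerDelay = Date.now() - scheduled;
  console.log(`10ms timer actually fired after ${timerDelay}ms`);
}, 10);

// Busy-wait for ~200ms: the timers phase cannot run until this returns
const start = Date.now();
while (Date.now() - start < 200) {
  // blocking the event loop
}
```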
// BETTER: async version
async function securePassword(
password: string
): Promise<boolean> {
return new Promise((resolve, reject) => {
crypto.pbkdf2(
password,
'salt',
100000,
64,
'sha512',
(err, hash) => {
if (err) reject(err);
else resolve(hash.toString('hex') === 'expected');
}
);
});
}
app.post('/login', async (req, res) => {
const authenticated = await securePassword(req.body.password);
res.json({ authenticated });
// Event loop remains responsive to other requests
});
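The manual Promise wrapper can also be written with util.promisify. A sketch (the 'salt' and 'expected' digest are placeholders, as in the original):

```typescript
import crypto from 'crypto';
import { promisify } from 'util';

const pbkdf2 = promisify(crypto.pbkdf2);

async function securePassword(password: string): Promise<boolean> {
  // Runs on libuv's threadpool; the event loop stays free
  const hash = await pbkdf2(password, 'salt', 100000, 64, 'sha512');
  return hash.toString('hex') === 'expected';
}
```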
// Detecting blocking: measure event loop lag
let lastCheck = Date.now();
let checks = 0;
let maxLag = 0;
setInterval(() => {
const now = Date.now();
const lag = now - lastCheck - 1000; // Should be ~0
if (lag > 100) {
console.warn(`Event loop lag: ${lag}ms`);
maxLag = Math.max(maxLag, lag);
}
lastCheck = now;
checks++;
}, 1000);
// Expose lag metrics
app.get('/metrics/lag', (req, res) => {
res.json({ maxLag, checks });
});
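Node also ships a purpose-built API for this in perf_hooks (available since Node 11.10): monitorEventLoopDelay() samples loop delay into a histogram, with values in nanoseconds:

```typescript
import { monitorEventLoopDelay } from 'perf_hooks';

const histogram = monitorEventLoopDelay({ resolution: 20 }); // sample every 20ms
histogram.enable();

// Report once here; in production, do this on an interval and feed your metrics
setTimeout(() => {
  histogram.disable();
  console.log({
    mean_ms: histogram.mean / 1e6,        // values are nanoseconds
    p99_ms: histogram.percentile(99) / 1e6,
    max_ms: histogram.max / 1e6,
  });
}, 1000);
```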
// Production: use Node.js perf hooks
import perf_hooks from 'perf_hooks';
const obs = new perf_hooks.PerformanceObserver((items) => {
for (const item of items.getEntries()) {
if (item.duration > 100) {
console.warn('Long task detected:', {
name: item.name,
duration: item.duration,
});
}
}
});
// 'measure' entries come from performance.measure() calls;
// 'function' entries appear only for performance.timerify()-wrapped functions
obs.observe({ entryTypes: ['measure', 'function'] });
--prof Flag and tick-processor for CPU Profiling
Profile which functions consume CPU time and block the event loop.
# Generate V8 profile (compile TS first, e.g. with tsc, then profile the emitted JS)
node --prof server.js
# Make requests, then stop (Ctrl+C)
# This generates an isolate-*.log file
# Process the profile
node --prof-process isolate-*.log > profile.txt
cat profile.txt | head -50
In the output, look for:
- High "self time" = function is slow
- High "total time" = function and its callees are slow
// Example: identify slow function
function slowSort(arr: number[]): number[] {
// Hand-rolled O(n^2) exchange sort: blocks the loop for the whole run
for (let i = 0; i < arr.length; i++) {
for (let j = i + 1; j < arr.length; j++) {
if (arr[i] > arr[j]) {
[arr[i], arr[j]] = [arr[j], arr[i]];
}
}
}
return arr;
}
// In express handler
app.get('/api/sort', (req, res) => {
const arr = Array(10000)
.fill(0)
.map(() => Math.random() * 1000);
const sorted = slowSort(arr); // This shows up in profile
res.json({ size: sorted.length });
});
// Fix: use native sort (O(n log n))
app.get('/api/sort', (req, res) => {
const arr = Array(10000)
.fill(0)
.map(() => Math.random() * 1000);
const sorted = arr.sort((a, b) => a - b);
res.json({ size: sorted.length });
});
For flame graphs of real workloads, use a Node-specific profiler such as 0x:
# npm install -g 0x
# Profiles the process and writes an interactive HTML flame graph
0x server.js
queueMicrotask() Use Cases
queueMicrotask() schedules a function in the microtask queue without creating a Promise.
// Microtask before next phase
queueMicrotask(() => {
console.log('In microtask queue');
});
setTimeout(() => {
console.log('In timers phase');
}, 0);
// Order: Microtask queue → Timers phase
// Use case 1: batching DOM updates (browser), or state changes
class StateManager {
private pending = false;
private updates: any[] = [];
setState(value: any): void {
this.updates.push(value);
if (!this.pending) {
this.pending = true;
queueMicrotask(() => {
this.flush();
this.pending = false;
});
}
}
private flush(): void {
const combined = Object.assign({}, ...this.updates);
this.updates = [];
console.log('Flushed state:', combined);
}
}
// Use case 2: error handling before next phase
function safeExecute(fn: () => void): void {
try {
fn();
} catch (err) {
queueMicrotask(() => {
throw err; // Rethrow once the current stack unwinds (surfaces as uncaughtException)
});
}
}
// Use case 3: deferred computation
function deferWork(work: () => void): void {
queueMicrotask(work);
// Cheaper to schedule than setImmediate; runs before any setTimeout
}
// Benchmark (measures scheduling cost only, not callback execution)
const iterations = 100000;
console.time('queueMicrotask');
for (let i = 0; i < iterations; i++) {
queueMicrotask(() => {});
}
console.timeEnd('queueMicrotask');
console.time('setImmediate');
for (let i = 0; i < iterations; i++) {
setImmediate(() => {});
}
console.timeEnd('setImmediate');
// queueMicrotask scheduling is typically much cheaper: no libuv handle is created
Checklist
- ✓ Understand the 6 phases and which callbacks run in each
- ✓ Avoid process.nextTick recursion; use setImmediate instead
- ✓ Never block the event loop with synchronous CPU work
- ✓ Use --prof and tick-processor to identify slow functions
- ✓ Monitor event loop lag with setInterval; alert if > 100ms
- ✓ Batch operations with nextTick to reduce overhead
- ✓ Use queueMicrotask for immediate deferred work
- ✓ Use async versions of crypto and other CPU-heavy built-ins; prefer native methods (e.g. Array.prototype.sort) over hand-rolled loops
Conclusion
The event loop is predictable once you internalize its 6 phases and microtask priorities. Blocking is the enemy—identify and eliminate CPU-bound work. Combined with systematic profiling, you'll debug jank and latency spikes that other teams mysteriously fail to diagnose.