What is the event loop in JavaScript runtimes?
TL;DR
The event loop is a concept within the JavaScript runtime environment regarding how asynchronous operations are executed within JavaScript engines. It works as such:
- The JavaScript engine starts executing scripts, placing synchronous operations on the call stack.
- When an asynchronous operation is encountered (e.g., `setTimeout()`, HTTP request), it is offloaded to the respective Web API or Node.js API to handle the operation in the background.
- Once the asynchronous operation completes, its callback function is placed in the respective queue – the task queue (also known as the macrotask queue / callback queue) or the microtask queue. We will refer to the "task queue" as the "macrotask queue" from here on to better differentiate it from the microtask queue.
- The event loop continuously monitors the call stack and executes items on the call stack. When the call stack is empty:
  - The microtask queue is processed first. Microtasks include promise callbacks (`then`, `catch`, `finally`), `await` continuations, `MutationObserver` callbacks, and calls to `queueMicrotask()`. The event loop takes the first callback from the microtask queue and pushes it onto the call stack for execution. This repeats until the microtask queue is empty.
  - The macrotask queue is processed next. Macrotasks include web APIs like `setTimeout()`, HTTP requests, and user interface event handlers like clicks and scrolls. The event loop dequeues the first callback from the macrotask queue and pushes it onto the call stack for execution. However, after a macrotask callback is processed, the event loop does not proceed with the next macrotask yet! It first checks the microtask queue, because microtasks have higher priority and the macrotask callback that was just executed could have added more microtasks.
    - If the microtask queue is non-empty, process it as per the previous step.
    - If the microtask queue is empty, the next macrotask callback is processed. This repeats until the macrotask queue is empty.
- This process continues indefinitely, allowing the JavaScript engine to handle both synchronous and asynchronous operations efficiently without blocking the call stack.
Event loop in JavaScript
The event loop is the mechanism that lets JavaScript handle asynchronous operations without blocking its single-threaded execution.
Parts of the event loop
To understand it better, we need to understand all the parts of the system. These components are part of the event loop:
Call stack
The call stack keeps track of the functions being executed in a program. When a function is called, it is added to the top of the call stack. When the function completes, it is removed from the call stack. This allows the program to keep track of where it is in the execution of a function and return to the correct location when the function completes. As the name suggests, it is a stack data structure which follows last-in-first-out.
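This last-in-first-out behavior can be observed directly with nested synchronous calls (a minimal sketch; the function names are illustrative):

```javascript
function second() {
  console.log('second: pushed on top of first');
  // Returning pops `second` off the stack, exposing `first` again
}

function first() {
  console.log('first: pushed onto the stack');
  second();
  console.log('first: running again after second() was popped');
}

first();
// Output:
// first: pushed onto the stack
// second: pushed on top of first
// first: running again after second() was popped
```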
Web APIs/Node.js APIs
Asynchronous operations like setTimeout(), HTTP requests, file I/O, etc., are handled by Web APIs (in the browser) or C++ APIs (in Node.js). These APIs are not part of the JavaScript engine and run on separate threads, allowing them to execute concurrently without blocking the call stack.
Task queue / Macrotask queue / Callback queue
The macrotask queue (also called the task queue, callback queue, or event queue) holds callbacks waiting to run when the call stack and microtask queue are empty.
Microtasks queue
The microtask queue holds higher-priority callbacks that drain after the call stack empties and between every macrotask.
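The two queues' relative priority can be shown with one line per queue (a minimal, self-contained sketch):

```javascript
setTimeout(() => console.log('macrotask'), 0); // macrotask queue
queueMicrotask(() => console.log('microtask')); // microtask queue
console.log('sync'); // runs immediately on the call stack

// Output: sync, microtask, macrotask
```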
Event loop order
- The JavaScript engine starts executing scripts, placing synchronous operations on the call stack.
- When an asynchronous operation is encountered (e.g., `setTimeout()`, HTTP request), it is offloaded to the respective Web API or Node.js API to handle the operation in the background.
- Once the asynchronous operation completes, its callback function is placed in the respective queue – the task queue (also known as the macrotask queue / callback queue) or the microtask queue. We will refer to the "task queue" as the "macrotask queue" from here on to better differentiate it from the microtask queue.
- The event loop continuously monitors the call stack and executes items on the call stack. When the call stack is empty:
  - The microtask queue is processed. The event loop takes the first callback from the microtask queue and pushes it onto the call stack for execution. This repeats until the microtask queue is empty.
  - The macrotask queue is processed. The event loop dequeues the first callback from the macrotask queue and pushes it onto the call stack for execution. However, after a macrotask queue callback is processed, the event loop does not proceed with the next macrotask yet! It first checks the microtask queue, because microtasks have higher priority and the macrotask callback that was just executed could have added more microtasks.
    - If the microtask queue is non-empty, process it as per the previous step.
    - If the microtask queue is empty, the next macrotask callback is processed. This repeats until the macrotask queue is empty.
- This process continues indefinitely, allowing the JavaScript engine to handle both synchronous and asynchronous operations efficiently without blocking the call stack.
Example
The example below mixes synchronous logs with two timer callbacks and two promise callbacks. The first timer's callback enqueues a microtask, and the first promise callback enqueues another timer — small additions that exercise every ordering rule the event loop applies, while keeping each line individually trivial to read.
console.log('Start');
setTimeout(() => {
console.log('Timeout 1');
Promise.resolve().then(() => console.log('Promise 2'));
}, 0);
Promise.resolve().then(() => {
console.log('Promise 1');
setTimeout(() => console.log('Timeout 3'), 0);
});
setTimeout(() => console.log('Timeout 2'), 0);
console.log('End');
// Console output:
// Start
// End
// Promise 1
// Timeout 1
// Promise 2
// Timeout 2
// Timeout 3

Queue entries in the trace below are labeled by the message their callback will log (so [Promise 1] means "the queued callback that will log Promise 1"). Names match registration order: Timeout 1 is the first timer registered, Promise 2 is the microtask scheduled later by Timeout 1's callback, and so on.
| Step | What just happened | Call stack | Microtask queue | Macrotask queue | Output |
|---|---|---|---|---|---|
| 1 | console.log('Start') runs | empty | empty | empty | Start |
| 2 | The first setTimeout registers a timer with the Web API | empty | empty | empty | Start |
| 3 | Promise.resolve().then(...) enqueues its callback as a microtask | empty | [Promise 1] | empty | Start |
| 4 | The second setTimeout registers another timer | empty | [Promise 1] | empty | Start |
| 5 | console.log('End') runs; sync script finishes. Both 0 ms timers have elapsed and their callbacks have moved from the Web API into the macrotask queue, in registration order | empty | [Promise 1] | [Timeout 1, Timeout 2] | Start, End |
| 6 | Stack empty → microtask queue drains: Promise 1 runs and logs, then schedules a new timer whose callback will log Timeout 3. The new macrotask is appended to the end of the macrotask queue | empty | empty | [Timeout 1, Timeout 2, Timeout 3] | …, Promise 1 |
| 7 | Microtask queue empty → one macrotask runs: Timeout 1 logs, then enqueues a new microtask that will log Promise 2 | empty | [Promise 2] | [Timeout 2, Timeout 3] | …, Timeout 1 |
| 8 | Microtask queue is re-checked before the next macrotask (non-empty → drain): Promise 2 runs and logs | empty | empty | [Timeout 2, Timeout 3] | …, Promise 2 |
| 9 | Microtask queue empty → next macrotask: Timeout 2 runs and logs | empty | empty | [Timeout 3] | …, Timeout 2 |
| 10 | Microtask queue re-checked (empty) → next macrotask: Timeout 3 runs and logs | empty | empty | empty | …, Timeout 3 |
Three rules the trace makes explicit:
- Microtasks drain before any macrotask. Step 6 runs `Promise 1` before either timer, even though both timers were scheduled before the promise callback ran.
- A macrotask that schedules a microtask interleaves. Step 7 runs `Timeout 1` and enqueues `Promise 2`; step 8 runs `Promise 2` before the next macrotask, not after. The event loop re-checks the microtask queue between every macrotask, which is why a single drain at the end of synchronous code is not enough to model behavior correctly.
- A microtask that schedules a macrotask appends to the queue. Step 6 runs `Promise 1` and schedules `Timeout 3`; `Timeout 3` then runs last, after both timers that were already in the macrotask queue. Microtasks cannot promote a macrotask to the front of the line.
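The FIFO rule also governs promise chains: each `.then` link is a separate microtask, enqueued only when the previous link settles, so two independent chains interleave. A small illustrative sketch:

```javascript
Promise.resolve()
  .then(() => console.log('A1'))
  .then(() => console.log('A2'));

Promise.resolve()
  .then(() => console.log('B1'))
  .then(() => console.log('B2'));

// Output: A1, B1, A2, B2
// The chains alternate: each continuation is enqueued behind whatever is
// already waiting in the microtask queue when the previous link settles.
```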
Advanced examples
The examples below demonstrate event loop behaviors that commonly appear in production code and more advanced interview questions.
async/await scheduling
async/await is specified in terms of promise chaining. When execution reaches an await, the function is paused, its continuation is scheduled as a microtask on resolution of the awaited value, and control returns to the caller.
console.log('1');
async function run() {
console.log('2');
await Promise.resolve();
console.log('3');
}
run();
setTimeout(() => console.log('4'), 0);
Promise.resolve().then(() => console.log('5'));
console.log('6');
// Output: 1, 2, 6, 3, 5, 4

Explanation:

- `1` is logged from the first synchronous statement.
- `run()` is invoked. Synchronous code in the function runs up to the `await`, logging `2`.
- The continuation of `run()` (everything after the `await`) is scheduled as a microtask. Control returns to the top-level script.
- `setTimeout` schedules a macrotask. `Promise.resolve().then(...)` schedules a microtask. `6` is logged from the last synchronous statement.
- The script completes and the microtask queue drains in FIFO order: `run()`'s continuation logs `3`, then the `.then` callback logs `5`.
- The macrotask queue is then processed, logging `4`.
The common misconception is that await blocks execution. It does not — the function is paused, but control returns immediately to the caller, and the continuation runs as a microtask once the awaited value settles.
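One way to see this (a minimal sketch): awaiting even a plain, non-promise value still defers the continuation rather than blocking.

```javascript
async function demo() {
  console.log('before await');
  await 42; // not a promise, but the continuation is still deferred
  console.log('after await');
}

demo();
console.log('caller continues');

// Output: before await, caller continues, after await
```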
Microtask starvation
Macrotasks run only once the microtask queue has fully drained. If microtasks continually schedule more microtasks, the macrotask queue never advances, which prevents rendering, user input handling, and timer callbacks from running.
let count = 0;
function scheduleMicrotask() {
Promise.resolve().then(() => {
count++;
if (count < 5) scheduleMicrotask();
console.log('microtask', count);
});
}
setTimeout(() => console.log('macrotask fired'), 0);
scheduleMicrotask();
// Output: microtask 1, microtask 2, microtask 3, microtask 4, microtask 5, macrotask fired

With a bounded recursion depth, the macrotask eventually runs. An unbounded chain (for example `if (true)` instead of `if (count < 5)`) would prevent any macrotask from running and would block rendering in the browser.
To yield to the browser for rendering or input handling, a macrotask is required — for example setTimeout(fn, 0), MessageChannel, or scheduler.yield() in environments that support it. A microtask such as queueMicrotask or Promise.resolve().then does not yield.
Yielding the main thread to split long tasks
A synchronous block that runs longer than 50 ms is classified as a long task and blocks the browser from rendering, handling input, and processing timers for that duration. The fix is to break the work into chunks and yield to the event loop between chunks so that rendering and other macrotasks can run.
A loop that runs as one task — the entire computation blocks until it finishes:
function heavyWork() {
let sum = 0;
for (let i = 0; i < 1e8; i++) sum += i;
return sum;
}
heavyWork(); // ~hundreds of ms; the page is unresponsive for the duration

The same work split across macrotasks via setTimeout:
function chunked(total, chunkSize, onDone) {
let i = 0;
let sum = 0;
function tick() {
const end = Math.min(i + chunkSize, total);
while (i < end) {
sum += i;
i++;
}
if (i < total) {
setTimeout(tick, 0);
} else {
onDone(sum);
}
}
tick();
}
chunked(1e7, 1e6, (sum) => console.log('done', sum));

Between every chunk, the browser can paint a frame, dispatch input events, and run other macrotasks. The drawback is that the HTML specification clamps nested setTimeout delays to a minimum of 4 ms after 5 levels of recursion, which adds noticeable latency to long chunked computations.
MessageChannel schedules a macrotask without that clamp:
function yieldToMain() {
return new Promise((resolve) => {
const channel = new MessageChannel();
channel.port1.onmessage = () => resolve();
channel.port2.postMessage(null);
});
}
async function chunked(total, chunkSize) {
let i = 0;
let sum = 0;
while (i < total) {
const end = Math.min(i + chunkSize, total);
while (i < end) {
sum += i;
i++;
}
if (i < total) await yieldToMain();
}
return sum;
}
chunked(1e7, 1e6).then((sum) => console.log('done', sum));

postMessage enqueues a macrotask immediately without delay clamping, so the next chunk runs as soon as the browser has finished its render and any earlier pending tasks. React's scheduler used this pattern before scheduler.postTask was widely available. Production code should reuse a single MessageChannel instance instead of creating one per yield.
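The reuse advice can be sketched as follows (illustrative, assuming at most one yield is pending at a time, i.e. sequential awaits):

```javascript
// Illustrative sketch: one shared MessageChannel instead of one per yield.
const yieldChannel = new MessageChannel();
let pendingResolve = null;

yieldChannel.port1.onmessage = () => {
  const resolve = pendingResolve;
  pendingResolve = null;
  if (resolve) resolve();
};

function yieldToMain() {
  return new Promise((resolve) => {
    pendingResolve = resolve;
    yieldChannel.port2.postMessage(null);
  });
}
```

If yields could overlap, `pendingResolve` would need to be a queue of resolvers rather than a single slot.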
A comparison of the available yielding mechanisms:
| Mechanism | Schedule type | Yields to render? | Notes |
|---|---|---|---|
| `queueMicrotask` / `Promise.then` | Microtask | No | Drains before render – used for sequencing, not yielding |
| `setTimeout(fn, 0)` | Macrotask | Yes | Clamped to ≥ 4 ms after 5 nested calls per the HTML specification |
| `MessageChannel.postMessage` | Macrotask | Yes | No clamp; ~0 ms in practice |
| `scheduler.postTask(fn, { priority })` | Macrotask | Yes | Built-in priority levels (user-blocking, user-visible, background); Chromium-only |
| `scheduler.yield()` | Macrotask | Yes | Returns a promise that resolves on the next yield; preserves task continuation priority; Chromium-only |
Microtasks cannot be used to yield. They drain before rendering, which is the behavior the microtask-starvation example demonstrates.
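The table above can be folded into a small helper that feature-detects `scheduler.yield()` and falls back to a timer-based macrotask (a sketch; the names `yieldToEventLoop` and `processItems` are illustrative, not standard APIs):

```javascript
// Illustrative helper: use scheduler.yield() where it exists (Chromium),
// otherwise fall back to a setTimeout-based macrotask.
function yieldToEventLoop() {
  if (typeof scheduler !== 'undefined' && typeof scheduler.yield === 'function') {
    return scheduler.yield();
  }
  return new Promise((resolve) => setTimeout(resolve, 0));
}

// Usage sketch: process a list in pieces, yielding between items so
// rendering and input handling can run.
async function processItems(items, handle) {
  for (const item of items) {
    handle(item);
    await yieldToEventLoop();
  }
}
```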
queueMicrotask compared to Promise.resolve().then
Both schedule a microtask in the same FIFO queue and run at the same point in the event loop. They differ in how exceptions thrown inside the callback are surfaced.
queueMicrotask(() => {
throw new Error('from queueMicrotask');
});
Promise.resolve().then(() => {
throw new Error('from promise.then');
});
setTimeout(() => console.log('timeout ran'), 0);

- An exception thrown from a `queueMicrotask` callback is reported as an uncaught error and reaches `window.onerror` (in browsers) or `uncaughtException` (in Node).
- An exception thrown from a `.then` callback causes the resulting promise to reject, surfacing through `unhandledrejection` if no downstream `.catch` handles it.
queueMicrotask is appropriate when the callback is conceptually standalone and its errors should behave like any other thrown exception. .then is appropriate when the callback is part of a promise chain where errors are expected to be caught downstream.
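A sketch of the promise-chain case, where the error is expected to be handled downstream:

```javascript
Promise.resolve()
  .then(() => {
    throw new Error('handled downstream');
  })
  .catch((err) => {
    // The rejection is consumed here; no unhandledrejection event fires.
    console.log('caught:', err.message);
  });

// Output: caught: handled downstream
```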
Differences across runtimes
The event loop is specified differently in browsers, Node.js, and Web Workers. Code that relies on precise scheduling may behave differently across these environments.
Browsers
Specified in the HTML Living Standard:
- Each agent has its own event loop.
- The macrotask queue is partitioned into multiple task sources (timers, network I/O, UI events, `postMessage`, and others). FIFO order is guaranteed within a source but not across sources – the user agent may choose any non-empty source each turn.
- `requestAnimationFrame` callbacks run in a separate phase of the event loop, before the render step, rather than on the macrotask queue.
- Rendering (style, layout, paint) occurs between macrotasks, not between microtasks. This is why a long microtask chain can freeze the UI.
Where requestAnimationFrame and requestIdleCallback fit
Within a single iteration of the event loop, the browser visits these phases in order:
1. Run one task from a macrotask queue source.
2. Drain the microtask queue (including any microtasks scheduled by step 1).
3. If a render is due this turn, run all `requestAnimationFrame` callbacks queued for the next frame.
4. Style, layout, and paint.
5. During any remaining idle time before the next frame deadline, run `requestIdleCallback` callbacks.
requestAnimationFrame schedules work for the next paint, making it the right tool for visual updates synchronized with the display refresh rate (~ 16.7 ms per frame at 60 Hz). requestIdleCallback schedules work for the period after rendering and only if the browser has idle time, making it suitable for non-urgent background work.
console.log('1: sync');
queueMicrotask(() => console.log('2: microtask'));
setTimeout(() => console.log('3: macrotask'), 0);
requestAnimationFrame(() => console.log('4: rAF'));
typeof requestIdleCallback === 'function' &&
requestIdleCallback(() => console.log('5: rIC'));
console.log('6: sync');
// Typical output: 1, 6, 2, 3, 4, 5
// `5: rIC` may run later or be deferred under load

`setTimeout(fn, 0)` typically logs before the `requestAnimationFrame` callback because the timer's macrotask is dispatched on the next event loop turn, while rAF waits for the next paint (often a few milliseconds later at typical refresh rates). `requestIdleCallback` runs only after the browser finishes rendering, which is why it appears last and is the only callback in this example that may be deferred.
Node.js
Built on libuv, with additional phases beyond what the HTML spec describes:
- `process.nextTick()` has a higher-priority queue that drains before the promise microtask queue on every phase transition.
- Macrotasks are divided into named phases: timers, pending callbacks, idle/prepare, poll (I/O), check (`setImmediate`), and close callbacks. Phases run in order, and microtasks together with `nextTick` drain between each.
- At the top level of a script, the execution order of `setImmediate(fn)` and `setTimeout(fn, 0)` is not deterministic and depends on loop timing. Inside an I/O callback, `setImmediate` is guaranteed to run before `setTimeout(fn, 0)`.
A comparison of the Node-specific scheduling primitives:
| Mechanism | Queue | Runs at | Notes |
|---|---|---|---|
| `process.nextTick(fn)` | nextTick queue | Every phase transition, before the promise microtask queue | Highest priority – recursive use can starve I/O |
| `queueMicrotask(fn)` / `Promise.then` | Microtask queue | Every phase transition, after the nextTick queue drains | Same semantics as in browsers |
| `setImmediate(fn)` | Check phase | Once per loop iteration, after the poll (I/O) phase | Use to defer work until after the current I/O cycle |
| `setTimeout(fn, 0)` | Timers phase | At the top of the next loop iteration once the delay elapses | Minimum delay clamped to 1 ms |
Observed ordering at the top of a script:
setImmediate(() => console.log('setImmediate'));
setTimeout(() => console.log('setTimeout'), 0);
Promise.resolve().then(() => console.log('promise'));
process.nextTick(() => console.log('nextTick'));
// Output:
// nextTick
// promise
// setTimeout (or setImmediate — order between these two is not guaranteed at the top level)
// setImmediate (or setTimeout)

Web Workers
- Each Worker has an independent event loop, with its own microtask and macrotask queues.
- Messages posted via `postMessage` are enqueued as macrotasks on the receiving Worker's event loop.
- No access to `requestAnimationFrame` or the DOM.
Common misconceptions
Several statements about the event loop appear frequently in explanations and in responses generated by large language models but are inaccurate:
- "`setTimeout(fn, 0)` runs immediately after the current synchronous code." Microtasks drain first. A `Promise.resolve().then(fn)` scheduled after a `setTimeout(fn, 0)` still runs before the timer callback.
- "`await` blocks the event loop." The `await` expression pauses the containing async function and returns control to the caller. The continuation is scheduled as a microtask and does not block other tasks.
- "Microtasks run on a separate thread." JavaScript execution is single-threaded. Microtasks run on the main thread, interleaved with macrotasks under the event loop's scheduling rules.
- "`Promise.resolve()` is synchronous when the promise is already resolved." The resolution is synchronous, but `.then` callbacks are always scheduled asynchronously as microtasks. This is a Promises/A+ requirement intended to guarantee consistent execution ordering.
- "`process.nextTick` is a microtask." In Node.js, `nextTick` has its own queue that drains before the promise microtask queue.
- "`setTimeout(fn, 0)` fires after 0 milliseconds." Both the HTML specification and Node.js clamp the minimum delay to a small non-zero value (4 ms for nested timers in browsers; 1 ms in Node). A delay of `0` is a lower bound, not a guarantee.