Section: General
Priority: High
Difficulty: Hard
Duration: 5 min

What is the event loop in JavaScript runtimes?

TL;DR

The event loop is a concept within the JavaScript runtime environment regarding how asynchronous operations are executed within JavaScript engines. It works as such:

  1. The JavaScript engine starts executing scripts, placing synchronous operations on the call stack.
  2. When an asynchronous operation is encountered (e.g., setTimeout(), HTTP request), it is offloaded to the respective Web API or Node.js API to handle the operation in the background.
  3. Once the asynchronous operation completes, its callback function is placed in the respective queues – task queues (also known as macrotask queues / callback queues) or microtask queues. We will refer to "task queue" as "macrotask queue" from here on to better differentiate from the microtask queue.
  4. The event loop continuously monitors the call stack and executes items on the call stack. If/when the call stack is empty:
    1. Microtask queue is processed. Microtasks include promise callbacks (then, catch, finally), await continuations, MutationObserver callbacks, and calls to queueMicrotask(). The event loop takes the first callback from the microtask queue and pushes it to the call stack for execution. This repeats until the microtask queue is empty.
    2. Macrotask queue is processed. Macrotasks include web APIs like setTimeout(), HTTP requests, user interface event handlers like clicks, scrolls, etc. The event loop dequeues the first callback from the macrotask queue and pushes it onto the call stack for execution. However, after a macrotask queue callback is processed, the event loop does not proceed with the next macrotask yet! The event loop first checks the microtask queue. Checking the microtask queue is necessary as microtasks have higher priority than macrotask queue callbacks. The macrotask queue callback that was just executed could have added more microtasks!
      1. If the microtask queue is non-empty, process them as per the previous step.
      2. If the microtask queue is empty, the next macrotask queue callback is processed. This repeats until the macrotask queue is empty.
  5. This process continues indefinitely, allowing the JavaScript engine to handle both synchronous and asynchronous operations efficiently without blocking the call stack.

Event loop in JavaScript

The event loop is the mechanism that lets JavaScript handle asynchronous operations without blocking its single-threaded execution.

Parts of the event loop

To understand the event loop, we first need to understand the components involved:

Call stack

The call stack keeps track of the functions being executed in a program. When a function is called, it is added to the top of the call stack. When the function completes, it is removed from the call stack. This allows the program to keep track of where it is in the execution of a function and return to the correct location when the function completes. As the name suggests, it is a stack data structure which follows last-in-first-out.
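A minimal illustration (generic code, not from any particular library): each call pushes a frame onto the stack, and each return pops it.

```javascript
function multiply(a, b) {
  // While this line runs, the stack is: printSquare → square → multiply
  return a * b;
}

function square(n) {
  return multiply(n, n); // pushes multiply() on top of square()
}

function printSquare(n) {
  const result = square(n); // pushes square() on top of printSquare()
  console.log(result);
}

printSquare(4); // logs 16; the stack unwinds in last-in-first-out order
```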

Web APIs/Node.js APIs

Asynchronous operations like setTimeout(), HTTP requests, file I/O, etc., are handled by Web APIs (in the browser) or C++ APIs (in Node.js). These APIs are not part of the JavaScript engine and run on separate threads, allowing them to execute concurrently without blocking the call stack.

Task queue / Macrotask queue / Callback queue

The macrotask queue (also called the task queue, callback queue, or event queue) holds callbacks waiting to run when the call stack and microtask queue are empty.

Microtask queue

The microtask queue holds higher-priority callbacks that drain after the call stack empties and between every macrotask.
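The priority difference can be sketched in two lines: even a 0 ms timer waits for the microtask queue to drain.

```javascript
const order = [];

setTimeout(() => order.push('macrotask'), 0); // goes to the macrotask queue
Promise.resolve().then(() => order.push('microtask')); // goes to the microtask queue

// After both run, order is ['microtask', 'macrotask']: the microtask
// drains first even though the timer was scheduled earlier.
```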

Event loop order

  1. The JavaScript engine starts executing scripts, placing synchronous operations on the call stack.
  2. When an asynchronous operation is encountered (e.g., setTimeout(), HTTP request), it is offloaded to the respective Web API or Node.js API to handle the operation in the background.
  3. Once the asynchronous operation completes, its callback function is placed in the respective queues – task queues (also known as macrotask queues / callback queues) or microtask queues. We will refer to "task queue" as "macrotask queue" from here on to better differentiate from the microtask queue.
  4. The event loop continuously monitors the call stack and executes items on the call stack. If/when the call stack is empty:
    1. Microtask queue is processed. The event loop takes the first callback from the microtask queue and pushes it to the call stack for execution. This repeats until the microtask queue is empty.
    2. Macrotask queue is processed. The event loop dequeues the first callback from the macrotask queue and pushes it onto the call stack for execution. However, after a macrotask queue callback is processed, the event loop does not proceed with the next macrotask yet! The event loop first checks the microtask queue. Checking the microtask queue is necessary as microtasks have higher priority than macrotask queue callbacks. The macrotask queue callback that was just executed could have added more microtasks!
      1. If the microtask queue is non-empty, process them as per the previous step.
      2. If the microtask queue is empty, the next macrotask queue callback is processed. This repeats until the macrotask queue is empty.
  5. This process continues indefinitely, allowing the JavaScript engine to handle both synchronous and asynchronous operations efficiently without blocking the call stack.

Example

The example below mixes synchronous logs with two timer callbacks and two promise callbacks. The first timer's callback enqueues a microtask, and the first promise callback enqueues another timer — small additions that exercise every ordering rule the event loop applies, while keeping each line individually trivial to read.

console.log('Start');

setTimeout(() => {
  console.log('Timeout 1');
  Promise.resolve().then(() => console.log('Promise 2'));
}, 0);

Promise.resolve().then(() => {
  console.log('Promise 1');
  setTimeout(() => console.log('Timeout 3'), 0);
});

setTimeout(() => console.log('Timeout 2'), 0);

console.log('End');

// Console output:
// Start
// End
// Promise 1
// Timeout 1
// Promise 2
// Timeout 2
// Timeout 3

Queue entries in the trace below are labeled by the message their callback will log (so [Promise 1] means "the queued callback that will log Promise 1"). Names match registration order: Timeout 1 is the first timer registered, Promise 2 is the microtask scheduled later by Timeout 1's callback, and so on.

Step | What just happened | Call stack | Microtask queue | Macrotask queue | Output
1 | console.log('Start') runs | empty | empty | empty | Start
2 | The first setTimeout registers a timer with the Web API | empty | empty | empty | Start
3 | Promise.resolve().then(...) enqueues its callback as a microtask | empty | [Promise 1] | empty | Start
4 | The second setTimeout registers another timer | empty | [Promise 1] | empty | Start
5 | console.log('End') runs; sync script finishes. Both 0 ms timers have elapsed and their callbacks have moved from the Web API into the macrotask queue, in registration order | empty | [Promise 1] | [Timeout 1, Timeout 2] | Start, End
6 | Stack empty → microtask queue drains: Promise 1 runs and logs, then schedules a new timer whose callback will log Timeout 3. The new macrotask is appended to the end of the macrotask queue | empty | empty | [Timeout 1, Timeout 2, Timeout 3] | …, Promise 1
7 | Microtask queue empty → one macrotask runs: Timeout 1 logs, then enqueues a new microtask that will log Promise 2 | empty | [Promise 2] | [Timeout 2, Timeout 3] | …, Timeout 1
8 | Microtask queue is re-checked before the next macrotask (non-empty → drain): Promise 2 runs and logs | empty | empty | [Timeout 2, Timeout 3] | …, Promise 2
9 | Microtask queue empty → next macrotask: Timeout 2 runs and logs | empty | empty | [Timeout 3] | …, Timeout 2
10 | Microtask queue re-checked (empty) → next macrotask: Timeout 3 runs and logs | empty | empty | empty | …, Timeout 3

Three rules the trace makes explicit:

  • Microtasks drain before any macrotask. Step 6 runs Promise 1 before either timer, even though both timers were scheduled before the promise callback ran.
  • A macrotask that schedules a microtask interleaves. Step 7 runs Timeout 1 and enqueues Promise 2; step 8 runs Promise 2 before the next macrotask, not after. The event loop re-checks the microtask queue between every macrotask, which is why a single drain at the end of synchronous code is not enough to model behavior correctly.
  • A microtask that schedules a macrotask appends to the queue. Step 6 runs Promise 1 and schedules Timeout 3; Timeout 3 then runs last, after both timers that were already in the macrotask queue. Microtasks cannot promote a macrotask to the front of the line.

Advanced examples

The examples below demonstrate event loop behaviors that commonly appear in production code and more advanced interview questions.

async/await scheduling

async/await is specified in terms of promise chaining. When execution reaches an await, the function is paused, its continuation is scheduled as a microtask on resolution of the awaited value, and control returns to the caller.

console.log('1');

async function run() {
  console.log('2');
  await Promise.resolve();
  console.log('3');
}

run();

setTimeout(() => console.log('4'), 0);

Promise.resolve().then(() => console.log('5'));

console.log('6');

// Output: 1, 2, 6, 3, 5, 4

Explanation:

  1. 1 is logged from the first synchronous statement.
  2. run() is invoked. Synchronous code in the function runs up to the await, logging 2.
  3. The continuation of run() (everything after the await) is scheduled as a microtask. Control returns to the top-level script.
  4. setTimeout schedules a macrotask.
  5. Promise.resolve().then(...) schedules a microtask.
  6. 6 is logged from the last synchronous statement.
  7. The script completes and the microtask queue drains in FIFO order: run()'s continuation logs 3, then the .then callback logs 5.
  8. The macrotask queue is then processed, logging 4.

The common misconception is that await blocks execution. It does not — the function is paused, but control returns immediately to the caller, and the continuation runs as a microtask once the awaited value settles.

Microtask starvation

Macrotasks run only once the microtask queue has fully drained. If microtasks continually schedule more microtasks, the macrotask queue never advances, which prevents rendering, user input handling, and timer callbacks from running.

let count = 0;

function scheduleMicrotask() {
  Promise.resolve().then(() => {
    count++;
    if (count < 5) scheduleMicrotask();
    console.log('microtask', count);
  });
}

setTimeout(() => console.log('macrotask fired'), 0);

scheduleMicrotask();

// Output: microtask 1, microtask 2, microtask 3, microtask 4, microtask 5, macrotask fired

With a bounded recursion depth, the macrotask eventually runs. An unbounded chain (for example if (true) instead of if (count < 5)) would prevent any macrotask from running and would block rendering in the browser.

To yield to the browser for rendering or input handling, a macrotask is required — for example setTimeout(fn, 0), MessageChannel, or scheduler.yield() in environments that support it. A microtask such as queueMicrotask or Promise.resolve().then does not yield.

Yielding the main thread to split long tasks

A synchronous block that runs longer than 50 ms is classified as a long task and blocks the browser from rendering, handling input, and processing timers for that duration. The fix is to break the work into chunks and yield to the event loop between chunks so that rendering and other macrotasks can run.

A loop that runs as one task — the entire computation blocks until it finishes:

function heavyWork() {
  let sum = 0;
  for (let i = 0; i < 1e8; i++) sum += i;
  return sum;
}

heavyWork(); // ~hundreds of ms; the page is unresponsive for the duration

The same work split across macrotasks via setTimeout:

function chunked(total, chunkSize, onDone) {
  let i = 0;
  let sum = 0;
  function tick() {
    const end = Math.min(i + chunkSize, total);
    while (i < end) {
      sum += i;
      i++;
    }
    if (i < total) {
      setTimeout(tick, 0);
    } else {
      onDone(sum);
    }
  }
  tick();
}

chunked(1e7, 1e6, (sum) => console.log('done', sum));

Between every chunk, the browser can paint a frame, dispatch input events, and run other macrotasks. The drawback is that the HTML specification clamps nested setTimeout delays to a minimum of 4 ms after 5 levels of recursion, which adds noticeable latency to long chunked computations.

MessageChannel schedules a macrotask without that clamp:

function yieldToMain() {
  return new Promise((resolve) => {
    const channel = new MessageChannel();
    channel.port1.onmessage = () => resolve();
    channel.port2.postMessage(null);
  });
}

async function chunked(total, chunkSize) {
  let i = 0;
  let sum = 0;
  while (i < total) {
    const end = Math.min(i + chunkSize, total);
    while (i < end) {
      sum += i;
      i++;
    }
    if (i < total) await yieldToMain();
  }
  return sum;
}

chunked(1e7, 1e6).then((sum) => console.log('done', sum));

postMessage enqueues a macrotask immediately without delay clamping, so the next chunk runs as soon as the browser has finished its render and any earlier pending tasks. React's scheduler used this pattern before scheduler.postTask was widely available. Production code should reuse a single MessageChannel instance instead of creating one per yield.
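One way to reuse a single channel (a sketch only; the names are illustrative, not taken from React's scheduler or any library): keep a module-level MessageChannel and a FIFO list of pending resolvers, resolving exactly one waiter per message.

```javascript
// Reusable yield helper sharing one MessageChannel across all calls.
// Sketch under stated assumptions — identifiers are illustrative.
const yieldChannel = new MessageChannel();
const pendingResolvers = [];

// Each incoming message resolves exactly one waiter, in FIFO order.
yieldChannel.port1.onmessage = () => {
  const resolve = pendingResolvers.shift();
  if (resolve) resolve();
};

function yieldToMain() {
  return new Promise((resolve) => {
    pendingResolvers.push(resolve);
    yieldChannel.port2.postMessage(null);
  });
}
```

Because each postMessage produces exactly one message event, the FIFO list pairs every call to yieldToMain() with its own resolution, so concurrent callers resume in the order they yielded.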

A comparison of the available yielding mechanisms:

Mechanism | Schedule type | Yields to render? | Notes
queueMicrotask / Promise.then | Microtask | No | Drains before render — used for sequencing, not yielding
setTimeout(fn, 0) | Macrotask | Yes | Clamped to ≥ 4 ms after 5 nested calls per the HTML specification
MessageChannel.postMessage | Macrotask | Yes | No clamp; ~ 0 ms in practice
scheduler.postTask(fn, { priority }) | Macrotask | Yes | Built-in priority levels (user-blocking, user-visible, background); Chromium-only
scheduler.yield() | Macrotask | Yes | Returns a promise that resolves on the next yield; preserves task continuation priority; Chromium-only

Microtasks cannot be used to yield. They drain before rendering, which is the behavior the microtask-starvation example demonstrates.

queueMicrotask compared to Promise.resolve().then

Both schedule a microtask in the same FIFO queue and run at the same point in the event loop. They differ in how exceptions thrown inside the callback are surfaced.

queueMicrotask(() => {
  throw new Error('from queueMicrotask');
});

Promise.resolve().then(() => {
  throw new Error('from promise.then');
});

setTimeout(() => console.log('timeout ran'), 0);

  • An exception thrown from a queueMicrotask callback is reported as an uncaught error and reaches window.onerror (in browsers) or uncaughtException (in Node).
  • An exception thrown from a .then callback causes the resulting promise to reject, surfacing through unhandledrejection if no downstream .catch handles it.

queueMicrotask is appropriate when the callback is conceptually standalone and its errors should behave like any other thrown exception. .then is appropriate when the callback is part of a promise chain where errors are expected to be caught downstream.

Differences across runtimes

The event loop is specified differently in browsers, Node.js, and Web Workers. Code that relies on precise scheduling may behave differently across these environments.

Browsers

Specified in the HTML Living Standard:

  • Each agent has its own event loop.
  • The macrotask queue is partitioned into multiple task sources (timers, network I/O, UI events, postMessage, and others). FIFO order is guaranteed within a source but not across sources — the user agent may choose any non-empty source each turn.
  • requestAnimationFrame callbacks run in a separate phase of the event loop, before the render step, rather than on the macrotask queue.
  • Rendering (style, layout, paint) occurs between macrotasks, not between microtasks. This is why a long microtask chain can freeze the UI.

Where requestAnimationFrame and requestIdleCallback fit

Within a single iteration of the event loop, the browser visits these phases in order:

  1. Run one task from a macrotask queue source.
  2. Drain the microtask queue (including any microtasks scheduled by step 1).
  3. If a render is due this turn, run all requestAnimationFrame callbacks queued for the next frame.
  4. Style, layout, and paint.
  5. During any remaining idle time before the next frame deadline, run requestIdleCallback callbacks.

requestAnimationFrame schedules work for the next paint, making it the right tool for visual updates synchronized with the display refresh rate (~ 16.7 ms per frame at 60 Hz). requestIdleCallback schedules work for the period after rendering and only if the browser has idle time, making it suitable for non-urgent background work.

console.log('1: sync');
queueMicrotask(() => console.log('2: microtask'));
setTimeout(() => console.log('3: macrotask'), 0);
requestAnimationFrame(() => console.log('4: rAF'));
typeof requestIdleCallback === 'function' &&
  requestIdleCallback(() => console.log('5: rIC'));
console.log('6: sync');

// Typical output: 1, 6, 2, 3, 4, 5
// `5: rIC` may run later or be deferred under load

setTimeout(fn, 0) typically logs before the requestAnimationFrame callback because the timer's macrotask is dispatched on the next event loop turn, while rAF waits for the next paint (often a few milliseconds later at typical refresh rates). requestIdleCallback runs only after the browser finishes rendering, which is why it appears last and is the only callback in this example that may be deferred.

Node.js

Built on libuv, with additional phases beyond what the HTML spec describes:

  • process.nextTick() has a higher-priority queue that drains before the promise microtask queue on every phase transition.
  • Macrotasks are divided into named phases: timers, pending callbacks, idle/prepare, poll (I/O), check (setImmediate), and close callbacks. Phases run in order, and microtasks together with nextTick drain between each.
  • At the top level of a script, the execution order of setImmediate(fn) and setTimeout(fn, 0) is not deterministic and depends on loop timing. Inside an I/O callback, setImmediate is guaranteed to run before setTimeout(fn, 0).

A comparison of the Node-specific scheduling primitives:

Mechanism | Queue | Runs at | Notes
process.nextTick(fn) | nextTick queue | Every phase transition, before the promise microtask queue | Highest priority — recursive use can starve I/O
queueMicrotask(fn) / Promise.then | Microtask queue | Every phase transition, after the nextTick queue drains | Same semantics as in browsers
setImmediate(fn) | Check phase | Once per loop iteration, after the poll (I/O) phase | Use to defer work until after the current I/O cycle
setTimeout(fn, 0) | Timers phase | At the top of the next loop iteration once the delay elapses | Minimum delay clamped to 1 ms

Observed ordering at the top of a script:

setImmediate(() => console.log('setImmediate'));
setTimeout(() => console.log('setTimeout'), 0);
Promise.resolve().then(() => console.log('promise'));
process.nextTick(() => console.log('nextTick'));

// Output:
// nextTick
// promise
// setTimeout (or setImmediate — order between these two is not guaranteed at the top level)
// setImmediate (or setTimeout)

Web Workers

  • Each Worker has an independent event loop, with its own microtask and macrotask queues.
  • Messages posted via postMessage are enqueued as macrotasks on the receiving Worker's event loop.
  • No access to requestAnimationFrame or the DOM.

Common misconceptions

Several statements about the event loop appear frequently in explanations and in responses generated by large language models but are inaccurate:

  1. "setTimeout(fn, 0) runs immediately after the current synchronous code." Microtasks drain first. A Promise.resolve().then(fn) scheduled after a setTimeout(fn, 0) still runs before the timer callback.
  2. "await blocks the event loop." The await expression pauses the containing async function and returns control to the caller. The continuation is scheduled as a microtask and does not block other tasks.
  3. "Microtasks run on a separate thread." JavaScript execution is single-threaded. Microtasks run on the main thread, interleaved with macrotasks under the event loop's scheduling rules.
  4. "Promise.resolve() is synchronous when the promise is already resolved." The resolution is synchronous, but .then callbacks are always scheduled asynchronously as microtasks. This is a Promises/A+ requirement intended to guarantee consistent execution ordering.
  5. "process.nextTick is a microtask." In Node.js, nextTick has its own queue that drains before the promise microtask queue.
  6. "setTimeout(fn, 0) fires after 0 milliseconds." Both the HTML specification and Node.js clamp the minimum to a small non-zero value (4ms for nested timers in browsers; 1ms in Node). A delay of 0 is a lower bound, not a guarantee.

Further reading and resources