Monday, September 8, 2025

JavaScript Module 62

  Module 62 : Event Loop and Concurrency.

What “concurrency” means in JS
JavaScript runs one thing at a time on a single call stack.

Concurrency arises because I/O and timers are handled by the host (the browser or Node). When those complete, callbacks are queued for JS to run later.

The event loop picks “what to run next” from queues.

Key actors (keep this mental picture):

Call Stack – where JS executes now.

Heap – memory.

Host APIs – timers, DOM, fetch, fs (Node), etc.

Task (macrotask) queue – e.g., setTimeout, I/O callbacks.

Microtask (job) queue – e.g., Promise.then, queueMicrotask, MutationObserver.

Renderer (browser) – layout/paint; loop phases (Node) – timers, poll, check, etc.

Rule of thumb for ordering:

Run one task → drain all microtasks → maybe render (browser) → pick next task.


Task vs Microtask: the card

Bucket | Typical APIs | When they run
Microtasks | Promise.then/catch/finally, queueMicrotask, MutationObserver, process.nextTick (Node; even higher priority) | Immediately after the current JS finishes, before the next task, and before rendering
Tasks (macrotasks) | setTimeout, setInterval, DOM events, network & fs callbacks, setImmediate (Node), MessageChannel | Next turn of the event loop


Quick model (browser)

1) Run current JS (top of stack).

2) Microtask checkpoint: run every microtask until none remain.

3) Possibly render a frame (layout/paint).

4) Take the next task and repeat.

Node.js has phases in a loop (simplified):

Timers → Pending Callbacks → Idle/Prepare → Poll → Check → Close

setTimeout/setInterval fire in Timers.

I/O callbacks are processed in Poll.

setImmediate fires in Check.

process.nextTick runs immediately after the current JS, even before promise microtasks.


Foundational examples (predict the output)

Browser ordering

console.clear();
console.log('A');
setTimeout(() => console.log('B (timeout)'), 0);
Promise.resolve().then(() => console.log('C (promise microtask)'));
queueMicrotask(() => console.log('D (queueMicrotask)'));
console.log('E');

Output: A, E, C (promise microtask), D (queueMicrotask), B (timeout)

Reason: sync first (A, E), then microtasks in the order they were queued (C, D), then the next task (B).

Node ordering: setImmediate vs setTimeout(0)

const fs = require('node:fs');

fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0);
  setImmediate(() => console.log('immediate'));
});

Typical output: immediate then timeout.

Reason: callbacks queued from I/O tend to reach Check (setImmediate) before the next Timers phase. (If you schedule both at top-level without I/O, the order can flip—don’t rely on it.)

Node ordering: process.nextTick vs Promise microtask

Promise.resolve().then(() => console.log('promise microtask'));
process.nextTick(() => console.log('nextTick'));
console.log('sync');

Output: sync, nextTick, promise microtask.

Reason: nextTick runs before promise microtasks in Node.


Async/Await ≈ sugar for promises (microtasks)

async function getUserName() {
  const res = await fetch('/api/user'); // pauses this function; a Promise is returned to the caller
  return (await res.json()).name;
}

await splits the function: before await and after await.

The “after” part is scheduled as a microtask once the awaited Promise settles.
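That split is observable without any network. A minimal runnable sketch (the awaited value here is just an already-resolved promise):

```javascript
const order = [];

async function demo() {
  order.push('before await');            // runs synchronously with the caller
  const value = await Promise.resolve(42);
  order.push('after await: ' + value);   // resumes as a microtask
}

demo();
order.push('caller continues');          // runs before the "after" part

queueMicrotask(() => console.log(order.join(' | ')));
// → before await | caller continues | after await: 42
```

Everything up to the first `await` runs synchronously; the continuation is queued behind the current turn, which is why the caller's line prints first.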


Scheduling APIs (when to use which)

setTimeout(fn, ms) / setInterval(fn, ms): schedule tasks. Prefer recursive setTimeout over setInterval to avoid reentrancy.

queueMicrotask(fn): run right after current JS (like Promise.then), useful for “post-state-update” hooks.

requestAnimationFrame(fn) (browser): run before next paint; ideal for animation steps.

requestIdleCallback(fn) (browser): run when the browser is idle; best-effort.

setImmediate(fn) (Node): after I/O poll; good to yield.

process.nextTick(fn) (Node): very soon (even before promises); use sparingly to avoid starving I/O.
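As one illustration of the queueMicrotask "post-state-update hook" use case above, here is a toy store that coalesces several synchronous updates into a single notification. The store shape and names are invented for this sketch, not taken from any library:

```javascript
// Toy store: many synchronous set() calls in one turn trigger ONE
// listener notification, flushed at the microtask checkpoint.
function createStore(initial) {
  let state = initial;
  let flushScheduled = false;
  const listeners = new Set();
  return {
    get: () => state,
    subscribe: (fn) => listeners.add(fn),
    set(patch) {
      state = { ...state, ...patch };
      if (!flushScheduled) {
        flushScheduled = true;
        queueMicrotask(() => {        // runs right after the current JS
          flushScheduled = false;
          listeners.forEach((fn) => fn(state));
        });
      }
    },
  };
}

const store = createStore({ a: 0 });
store.subscribe((s) => console.log('notified once:', s));
store.set({ a: 1 });
store.set({ b: 2 }); // still the same turn: no second notification
```

Because the flush is a microtask, it runs before any timer or rendering, but after all synchronous `set()` calls in the current turn.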


Concurrency patterns

1) Parallel I/O with Promise.all

const urls = ['/a.json', '/b.json', '/c.json'];
// All fire "in parallel" (I/O overlaps). If any fails, the whole thing rejects.
const [a, b, c] = await Promise.all(urls.map(u => fetch(u).then(r => r.json())));

2) Don’t fail the whole batch: allSettled

const results = await Promise.allSettled(urls.map(u => fetch(u)));
for (const r of results) {
  if (r.status === 'fulfilled') { /* r.value */ }
  else { /* r.reason */ }
}

3) Race & timeout

function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise((_, rej) => setTimeout(() => rej(new Error('Timeout')), ms))
  ]);
}
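A quick way to see the race in action, restating withTimeout and standing in a plain timer for the slow operation (no real request involved):

```javascript
function withTimeout(promise, ms) {
  return Promise.race([
    promise,
    new Promise((_, rej) => setTimeout(() => rej(new Error('Timeout')), ms)),
  ]);
}

const delay = (ms) => new Promise((r) => setTimeout(r, ms));

// A 300ms "operation" raced against a 30ms budget: the timeout wins.
withTimeout(delay(300).then(() => 'slow result'), 30)
  .then((v) => console.log('value:', v))
  .catch((e) => console.log('rejected:', e.message)); // rejected: Timeout
```

Note that losing the race does not cancel the slow operation; pair this with AbortController (pattern 5) when you need real cancellation.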

4) Limit concurrency (e.g., only 3 at a time)

function pLimit(concurrency) {
  const queue = [];
  let active = 0;
  const next = () => {
    if (active >= concurrency || queue.length === 0) return;
    active++;
    const { fn, resolve, reject } = queue.shift();
    (async () => {
      try { resolve(await fn()); }
      catch (e) { reject(e); }
      finally { active--; next(); }
    })();
  };
  return (fn) => new Promise((resolve, reject) => {
    queue.push({ fn, resolve, reject });
    queueMicrotask(next); // kick the scheduler soon
  });
}

// Usage
const limit = pLimit(3);
const tasks = urls.map(u => limit(() => fetch(u).then(r => r.text())));
const pages = await Promise.all(tasks);

5) Cancellation with AbortController

function fetchWithTimeout(resource, { timeout = 8000, signal } = {}) {
  const controller = new AbortController();
  const id = setTimeout(() => controller.abort(), timeout);
  if (signal) signal.addEventListener('abort', () => controller.abort(), { once: true });
  return fetch(resource, { signal: controller.signal })
    .finally(() => clearTimeout(id));
}

// Usage:
const controller = new AbortController();
const promise = fetchWithTimeout('/big.json', { timeout: 3000, signal: controller.signal });
// controller.abort(); // to cancel manually


Avoiding pitfalls

Blocking the event loop: long while loops, JSON.parse on huge payloads, image decoding. These stall the UI (browser) or throughput (server). Offload CPU-bound work (see the next section).
Microtask starvation: endlessly queueing microtasks prevents tasks/rendering. Yield with setTimeout(fn, 0) / setImmediate / await new Promise(r => setTimeout(r)).

Timer clamping: setTimeout(fn, 0) is not guaranteed immediate; browsers clamp nested timers (commonly to ~4ms; much higher in background tabs).
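You can measure the clamping yourself. The sketch below records the gap between nested setTimeout(fn, 0) calls; the exact numbers depend on the environment (browsers commonly clamp nested timers to ~4ms, Node's floor is about 1ms), so no fixed output is expected:

```javascript
// Measure the real gap between nested setTimeout(fn, 0) calls.
const gaps = [];
let prev = Date.now();

function tick(depth) {
  const now = Date.now();
  gaps.push(now - prev);
  prev = now;
  if (depth < 8) setTimeout(() => tick(depth + 1), 0);
  else console.log('observed gaps (ms):', gaps.join(', '));
}

setTimeout(() => tick(0), 0);
```

Run it in a foreground tab, then in a backgrounded one, and compare the gaps.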

setInterval drift & reentrancy: an interval can fire again while the previous callback is still running (or callbacks can bunch up). Prefer:

function schedule(fn, ms) {
  const tick = async () => {
    await fn();
    setTimeout(tick, ms); // re-arm only after the previous run finishes
  };
  setTimeout(tick, ms);
}

Assuming order among different queues: Don’t rely on setTimeout vs setImmediate order across environments.


Offloading CPU-bound work

Browser: Web Worker (separate thread)

worker.js

self.onmessage = (e) => {
  const n = e.data;
  function fib(x) { return x <= 1 ? x : fib(x - 1) + fib(x - 2); }
  postMessage(fib(n));
};

main.js

const w = new Worker('worker.js');
w.onmessage = (e) => console.log('fib =', e.data);
w.postMessage(40); // main thread stays responsive

Node: Worker Threads

worker.mjs

import { parentPort, workerData } from 'node:worker_threads';

function fib(n) { return n <= 1 ? n : fib(n - 1) + fib(n - 2); }
parentPort.postMessage(fib(workerData));

main.mjs

import { Worker } from 'node:worker_threads';

const w = new Worker(new URL('./worker.mjs', import.meta.url), { workerData: 40 });
w.on('message', v => console.log('fib =', v));
w.on('error', console.error);


Rendering-aware scheduling (browser)

Animation: requestAnimationFrame(step) → do DOM reads/writes per frame.

Idle housekeeping: requestIdleCallback(work) → run when UI thread is free (not time-critical).

Yield in long tasks to keep 60fps:

async function chunkedWork(items) {
  while (items.length) {
    // do a small batch
    for (let i = 0; i < 100 && items.length; i++) process(items.pop());
    // Yield to render
    await new Promise(requestAnimationFrame);
  }
}


Exercises

Each exercise includes the goal, the prompt, and the expected result so you can self-check.

Exercise 1 — Microtask vs Task (browser)

Prompt

<script>
  console.log('start');
  setTimeout(() => console.log('timeout'), 0);
  Promise.resolve().then(() => console.log('then'));
  queueMicrotask(() => console.log('qm'));
  console.log('end');
</script>

Expected: start, end, then, qm, timeout.


Exercise 2 — requestAnimationFrame timing (browser)

Prompt

console.log('t0');
requestAnimationFrame(() => console.log('rAF'));
Promise.resolve().then(() => console.log('microtask'));
console.log('t1');

Expected: t0, t1, microtask, rAF.

Why: rAF runs before the next paint; microtasks flush before that.


Exercise 3 — setImmediate vs setTimeout (Node)

Prompt

setTimeout(() => console.log('timeout'), 0);
setImmediate(() => console.log('immediate'));

Expected: Order is not guaranteed at top level; observe both orders across runs.

Teaching point: Do not rely on this without I/O. See next exercise.


Exercise 4 — I/O context ordering (Node)

Prompt

const fs = require('node:fs');

fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0);
  setImmediate(() => console.log('immediate'));
});

Expected: immediate then timeout (typical in practice).


Exercise 5 — process.nextTick priority (Node)

Prompt

Promise.resolve().then(() => console.log('promise'));
process.nextTick(() => console.log('nextTick'));
console.log('sync');

Expected: sync, nextTick, promise.


Exercise 6 — Microtask starvation demo (browser or Node)

Prompt

let count = 0;
function spin() {
  if (count++ > 100000) return; // stop eventually
  Promise.resolve().then(spin); // keep queueing microtasks
}
spin();
console.log('scheduled');
setTimeout(() => console.log('timeout fired (late)'), 0);

Observation: The timeout is delayed until the microtask storm ends.

Fix: yield periodically:

let i = 0;
async function fairSpin() {
  if (i++ > 100000) return; // stop eventually
  if (i % 1000 === 0) await new Promise(r => setTimeout(r)); // yield a task turn
  queueMicrotask(fairSpin);
}
fairSpin();


Exercise 7 — Implement a concurrency limiter

Prompt

Use the pLimit implementation above to fetch 20 URLs with concurrency 4; log how many are inflight.

Expected: At most 4 simultaneous network requests; total time ≈ 5× a single request (if all durations are similar).
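One possible solution sketch, restating pLimit from the patterns section and simulating the requests with timers so the in-flight count is observable (the URLs and timings are invented):

```javascript
function pLimit(concurrency) {
  const queue = [];
  let active = 0;
  const next = () => {
    if (active >= concurrency || queue.length === 0) return;
    active++;
    const { fn, resolve, reject } = queue.shift();
    (async () => {
      try { resolve(await fn()); }
      catch (e) { reject(e); }
      finally { active--; next(); }
    })();
  };
  return (fn) => new Promise((resolve, reject) => {
    queue.push({ fn, resolve, reject });
    queueMicrotask(next);
  });
}

const limit = pLimit(4);
let inflight = 0;
let maxInflight = 0;

// Stand-in for fetch: resolves after 20ms and tracks concurrency.
const fakeFetch = (url) => {
  inflight++;
  maxInflight = Math.max(maxInflight, inflight);
  return new Promise((r) => setTimeout(() => { inflight--; r(url); }, 20));
};

const urls = Array.from({ length: 20 }, (_, i) => `/item/${i}`);

const done = Promise.all(urls.map((u) => limit(() => fakeFetch(u))));
done.then((results) => {
  console.log('fetched:', results.length, 'max inflight:', maxInflight); // max inflight ≤ 4
});
```

Swap fakeFetch for a real fetch(u) to see the same ceiling on live network requests.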


Exercise 8 — Cancellable fetch with timeout (browser)

Prompt

const controller = new AbortController();
fetchWithTimeout('/slow', { timeout: 2000, signal: controller.signal })
  .then(() => console.log('done'))
  .catch(e => console.log('error:', e.name));
// controller.abort(); // try manual abort too

Expected: Abort after ~2s with AbortError.


Exercise 9 — Web Worker offload (browser)

Compute a large Fibonacci in a worker; confirm the main thread stays responsive (try clicking a button during compute).


Exercise 10 — Async iterators over streams (Node)

Prompt

const fs = require('node:fs');

(async () => {
  const stream = fs.createReadStream(__filename, { encoding: 'utf8' });
  let total = 0;
  for await (const chunk of stream) {
    total += chunk.length;
  }
  console.log('bytes:', total);
})();

Expected: The total length is printed (with utf8 encoding the chunks are strings, so this counts characters, not bytes); illustrates backpressure-aware, event-loop-friendly iteration.


Patterns you’ll reuse



Debounce (UI friendly)

function debounce(fn, ms = 300) {
  let t;
  return (...args) => {
    clearTimeout(t);
    t = setTimeout(() => fn(...args), ms);
  };
}

Throttle (rate limit)

function throttle(fn, ms = 200) {
  let last = 0, timer;
  return (...args) => {
    const now = Date.now();
    const remaining = ms - (now - last);
    if (remaining <= 0) {
      last = now;
      fn(...args);
    } else if (!timer) {
      timer = setTimeout(() => {
        last = Date.now();
        timer = null;
        fn(...args);
      }, remaining);
    }
  };
}

Timeout as a Promise (handy with await)

const delay = (ms) => new Promise(r => setTimeout(r, ms));
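Building on delay, a sketch of retry with exponential backoff; the attempt count and base delay are arbitrary defaults, not taken from any library:

```javascript
const delay = (ms) => new Promise((r) => setTimeout(r, ms));

// Retry a flaky async operation, waiting base, 2*base, 4*base... between tries.
async function retry(fn, attempts = 3, baseMs = 100) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (e) {
      lastError = e;
      if (i < attempts - 1) await delay(baseMs * 2 ** i);
    }
  }
  throw lastError;
}

// Fails twice, then succeeds on the third try.
let tries = 0;
const flaky = async () => {
  if (++tries < 3) throw new Error('transient');
  return 'ok';
};

retry(flaky, 5, 10).then((v) => console.log(v, 'after', tries, 'tries'));
// → ok after 3 tries
```

Because delay is just a promise-wrapped timer, each backoff pause yields the event loop instead of blocking it.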


Instructor notes

"Single-threaded but concurrent?": open with a quick poll.

Walk the model: Stack → Microtasks → Tasks → Render; then Node's phases.

Ask students to predict each example's output before running it.

Async/await: show how it desugars into promises + microtasks.

Patterns: debounce, throttle, Promise.all, the concurrency limiter.

CPU offload: Web Worker / Worker Threads.

Pitfalls: starvation, timer clamping, intervals.

Capstone: build a cancellable search box with debounce + AbortController.

Assessment ideas

Code reading: find bug where microtasks starve rendering.

Practical: implement concurrency limiter (provide tests).


“Why it works this way” (research-backed context for specialists)

The event loop, tasks, and microtasks are defined by the HTML Living Standard (WHATWG). Browsers conform to this model, with rAF and rendering steps woven around microtask checkpoints.



Node.js builds atop libuv and specifies loop phases (Timers → … → Check). It also adds process.nextTick, which runs before promise microtasks—useful but potentially starvation-prone.

Microtasks exist to let promise continuations and similar callbacks run promptly after the current JS, enabling consistent async composition without frame-to-frame latency.

Performance and responsiveness hinge on not monopolizing a single turn; yielding and chunking work keep apps fluid.


Quick check

What order prints in the browser?

setTimeout(() => console.log(1), 0);
Promise.resolve().then(() => console.log(2));
console.log(3);

In Node, which prints first inside an I/O callback: setImmediate or setTimeout(0)?

Why can process.nextTick be dangerous if used in a loop?

When do you choose requestAnimationFrame vs requestIdleCallback?

Answers

3, 2, 1

Typically setImmediate (inside I/O).

It can starve the event loop, preventing I/O and timers from running.

rAF for visual updates per frame; idle for non-urgent background work.


Takeaway sheet

Promises & queueMicrotask = microtasks (very soon).


setTimeout/I/O = tasks (next turn).

Node: process.nextTick > Promise microtasks > phases (setImmediate vs setTimeout).

Yield often for responsiveness; offload CPU when needed.

Control concurrency and support cancellation.




