Introduction
Asynchronous programming is the bedrock of modern JavaScript development, enabling non-blocking operations crucial for responsive user interfaces, efficient server-side applications (Node.js), and seamless data handling. From fetching data over a network to processing large files, understanding how JavaScript manages tasks outside the main execution thread is paramount. This chapter dives deep into the core concepts, patterns, and intricacies of asynchronous JavaScript, specifically focusing on Promises, async/await, the Event Loop, and Streams.
Mastery of these topics differentiates a junior developer from a senior or architect. Interviewers use these questions to gauge your understanding of JavaScript’s concurrency model, your ability to write robust, performant, and error-resilient code, and your capacity to debug complex asynchronous flows. This guide will challenge your understanding with tricky questions, real-world scenarios, and code puzzles, ensuring you’re prepared for interviews targeting all levels, from entry-level to architect, as of January 2026.
Core Interview Questions
1. Understanding the Event Loop and Microtask Queue
Q: Explain the JavaScript Event Loop. How do setTimeout(..., 0) and Promise.resolve().then(...) interact with it, specifically regarding the microtask and macrotask queues?
A: The JavaScript Event Loop is a fundamental concurrency model that allows JavaScript to perform non-blocking I/O operations despite being single-threaded. It continuously checks two main queues: the macrotask queue (or task queue) and the microtask queue.
- Call Stack: Where synchronous code executes.
- Web APIs/Node.js APIs: Where asynchronous operations (like setTimeout, fetch, readFile) are handled by the browser/Node.js environment, not directly by JavaScript.
- Macrotask Queue (Task Queue): Holds tasks like setTimeout callbacks, setInterval callbacks, I/O operations (e.g., network requests, file system operations), and UI rendering events.
- Microtask Queue: Holds tasks like Promise.then(), Promise.catch(), Promise.finally() callbacks, queueMicrotask(), and MutationObserver callbacks.
The Event Loop works as follows:
- It first executes all synchronous code in the Call Stack.
- When the Call Stack is empty, it then processes all tasks in the Microtask Queue until it’s empty. This is crucial: microtasks have higher priority and are executed before the next macrotask.
- After the Microtask Queue is empty, the Event Loop picks one macrotask from the Macrotask Queue and pushes it onto the Call Stack for execution.
- Once that macrotask completes and the Call Stack is empty, the Event Loop again checks and drains the Microtask Queue.
- This cycle repeats indefinitely.
Interaction of setTimeout(..., 0) and Promise.resolve().then(...):
- setTimeout(callback, 0): Even with a 0ms delay, its callback is placed in the Macrotask Queue. It will execute only after the current synchronous code finishes and all microtasks are processed.
- Promise.resolve().then(callback): The then callback is placed in the Microtask Queue. It will execute immediately after the current synchronous code finishes, and before any macrotasks (including setTimeout(..., 0)).
Key Points:
- JavaScript is single-threaded, but the Event Loop enables concurrency.
- Microtasks have priority over macrotasks.
- The Event Loop processes all microtasks before moving to the next single macrotask.
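These ordering rules can be confirmed with a short, self-contained snippet (the results are collected in an array and printed once both queues have drained):

```javascript
const log = [];

log.push('sync');                                     // 1. runs on the call stack
setTimeout(() => log.push('macrotask'), 0);           // 3. macrotask queue
Promise.resolve().then(() => log.push('microtask'));  // 2. microtask queue, drained first

// Print once both queues have drained.
setTimeout(() => console.log(log.join(' → ')), 10);
// sync → microtask → macrotask
```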
Common Mistakes:
- Believing setTimeout(..., 0) executes immediately after synchronous code.
- Confusing the order of microtasks and macrotasks.
- Not understanding that the Event Loop continuously cycles.
Follow-up:
- What happens if a promise’s then callback itself returns another promise?
- How does async/await fit into the Event Loop model?
- Can you write a code snippet to demonstrate the priority of microtasks over macrotasks?
2. Deep Dive into Promises: Promise.allSettled vs. Promise.all
Q: You need to make several API calls concurrently. Describe the differences between Promise.all() and Promise.allSettled() for handling these calls, especially regarding error scenarios. When would you choose one over the other?
A: Both Promise.all() and Promise.allSettled() are static methods on the Promise object that take an iterable (e.g., an array) of Promises and return a single Promise. They are designed for concurrent execution of multiple asynchronous operations.
Promise.all(iterable):
- Behavior: Returns a single Promise that resolves when all of the input Promises have resolved. The resolved value is an array of the resolved values from the input Promises, in the same order as the input.
- Error Handling: If any of the input Promises reject, Promise.all() immediately rejects with the reason of the first Promise that rejected. It “fails fast.” The other Promises may still be running, but their results will not be returned.
- Use Case: Ideal when you need all operations to succeed to proceed, and failure of any one operation means the entire process should fail. For example, fetching critical data from multiple sources where all pieces are required for display.

Promise.allSettled(iterable) (ES2020):
- Behavior: Returns a single Promise that resolves when all of the input Promises have settled (either resolved or rejected). The resolved value is an array of objects, each describing the outcome of the corresponding Promise in the input array. Each object has a status (either "fulfilled" or "rejected") and either a value (if fulfilled) or a reason (if rejected). The order is preserved.
- Error Handling: It does not short-circuit. It waits for all Promises to complete, regardless of individual success or failure. It never rejects itself; it always resolves with an array describing each Promise’s outcome.
- Use Case: Ideal when you want to know the outcome of all concurrent operations, regardless of whether some fail. This is useful for scenarios like sending multiple analytics events, batch processing where you want to report on all successes and failures, or displaying partial results even if some data fetches fail.
When to choose:
- Choose Promise.all() when all tasks are critical, and you need to abort if any one fails.
- Choose Promise.allSettled() when you want to execute all tasks to completion and collect all results/errors, even if some fail.
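A minimal sketch contrasting the two behaviors (the three tasks are illustrative stand-ins for API calls; the second rejects intentionally):

```javascript
// Build a fresh set of tasks each time, since promises settle only once.
const tasks = () => [
  Promise.resolve('a'),
  Promise.reject(new Error('b failed')),
  Promise.resolve('c'),
];

async function compare() {
  // Promise.all fails fast: the first rejection rejects the whole call.
  let allResult;
  try {
    allResult = await Promise.all(tasks());
  } catch (err) {
    allResult = `rejected: ${err.message}`;
  }

  // Promise.allSettled never rejects: every outcome is reported.
  const settled = await Promise.allSettled(tasks());
  const summary = settled.map(r =>
    r.status === 'fulfilled' ? r.value : `error(${r.reason.message})`
  );

  return { allResult, summary };
}

compare().then(console.log);
// { allResult: 'rejected: b failed', summary: [ 'a', 'error(b failed)', 'c' ] }
```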
Key Points:
- Promise.all fails fast; Promise.allSettled waits for all.
- Promise.allSettled provides detailed status for each promise.
- Promise.allSettled never rejects.
Common Mistakes:
- Using Promise.all when you need to know the outcome of all promises, even rejected ones.
- Not properly handling the structure of the Promise.allSettled result array.
Follow-up:
- How would you implement Promise.allSettled manually using Promise.all and Promise.catch?
- What are Promise.race() and Promise.any() (ES2021), and when would you use them?
- How do you ensure proper error handling with Promise.all?
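For the first follow-up, here is a sketch of one common answer (allSettledPolyfill is a hypothetical helper name): map every input promise to one that always fulfills with a status object, then combine with Promise.all.

```javascript
function allSettledPolyfill(promises) {
  return Promise.all(
    Array.from(promises, p =>
      Promise.resolve(p).then(
        value => ({ status: 'fulfilled', value }),   // success → status object
        reason => ({ status: 'rejected', reason })   // failure → status object
      )
    )
  );
}

// Usage: mirrors the native method's result shape.
allSettledPolyfill([Promise.resolve(1), Promise.reject(new Error('nope'))])
  .then(results => console.log(results.map(r => r.status)));
// [ 'fulfilled', 'rejected' ]
```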
3. Asynchronous Operations with async/await
Q: Explain how async/await works under the hood. Provide an example where await might lead to performance issues if not used carefully, and how to mitigate it.
A: async/await (introduced in ES2017) is syntactic sugar built on top of Promises, designed to make asynchronous code look and behave more like synchronous code, improving readability and maintainability.
How it works under the hood:
- An async function always returns a Promise. If the function returns a non-Promise value, it’s implicitly wrapped in Promise.resolve(). If it throws an error, the returned Promise rejects with that error.
- The await keyword can only be used inside an async function (or at a module’s top level, as of ES2022). When await is encountered, it pauses the execution of the async function until the Promise it’s “awaiting” settles (resolves or rejects).
- Crucially, await does not block the main thread. When an await pauses an async function, the rest of the async function’s body (after the await) is scheduled as a microtask to be executed once the awaited Promise settles. The Event Loop is then free to process other tasks.
- If the awaited Promise resolves, the async function resumes execution from where it left off with the resolved value. If it rejects, an error is thrown, which can be caught using a try...catch block.
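The microtask scheduling described above can be observed directly (the ordering is collected in an array and printed once the queues have drained):

```javascript
const order = [];

async function demo() {
  order.push('before await');   // synchronous part of the async function
  await null;                   // suspends; the continuation becomes a microtask
  order.push('after await');    // resumes before any macrotask runs
}

order.push('script start');
setTimeout(() => order.push('timeout'), 0); // macrotask
demo();
order.push('script end');

setTimeout(() => console.log(order.join(' → ')), 10);
// script start → before await → script end → after await → timeout
```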
Performance Issue Example: Consider fetching data from multiple independent API endpoints:
async function fetchSequentialData() {
console.log("Starting sequential fetches...");
const user = await fetch('/api/user').then(res => res.json()); // Awaits user data
const posts = await fetch(`/api/user/${user.id}/posts`).then(res => res.json()); // Awaits posts data, dependent on user
const comments = await fetch(`/api/posts/${posts[0].id}/comments`).then(res => res.json()); // Awaits comments data, dependent on first post
console.log("Sequential fetches complete.");
return { user, posts, comments };
}
async function fetchPotentiallyProblematicData() {
console.log("Starting problematic parallel fetches...");
// Problematic: These two fetches are independent but awaited sequentially
const data1 = await fetch('/api/data1').then(res => res.json()); // Waits for data1
const data2 = await fetch('/api/data2').then(res => res.json()); // Waits for data2 *after* data1 is complete
console.log("Problematic parallel fetches complete.");
return { data1, data2 };
}
In fetchPotentiallyProblematicData, fetch('/api/data2') only starts after fetch('/api/data1') has fully completed and its Promise has resolved. If data1 and data2 fetches are independent, this sequential await introduces unnecessary latency, making the total execution time roughly the sum of the individual fetch times.
Mitigation:
To run independent asynchronous operations in parallel with async/await, you should initiate all Promises first, then await their results using Promise.all() (or Promise.allSettled() if individual failures are acceptable).
async function fetchParallelData() {
console.log("Starting efficient parallel fetches...");
const promise1 = fetch('/api/data1').then(res => res.json()); // Initiates fetch 1
const promise2 = fetch('/api/data2').then(res => res.json()); // Initiates fetch 2 concurrently
// Await both promises simultaneously using Promise.all
const [data1, data2] = await Promise.all([promise1, promise2]);
console.log("Efficient parallel fetches complete.");
return { data1, data2 };
}
In fetchParallelData, both fetch operations start almost simultaneously. Promise.all then waits for both to complete. The total execution time will be closer to the longest individual fetch time, rather than the sum.
Key Points:
- async/await is syntactic sugar over Promises.
- await pauses the async function, but not the main thread.
- Execution after await is scheduled as a microtask.
- Use Promise.all (or race/any/allSettled) with await to achieve parallelism.
Common Mistakes:
- Using await on independent operations sequentially, leading to performance bottlenecks.
- Forgetting that async functions always return a Promise.
- Not wrapping await calls in try...catch for error handling within async functions.
Follow-up:
- How do you handle errors in async/await?
- Can you use await at the top level of a module (outside an async function)? (As of ES2022, yes, with “Top-Level Await”.)
- Discuss the difference in error handling between Promise.all and Promise.allSettled when used with async/await.
4. Tricky Event Loop Scenario
Q: What will be the output of the following code snippet? Explain your reasoning in detail, focusing on the Event Loop’s behavior.
console.log('Start');
setTimeout(() => {
console.log('setTimeout 1');
Promise.resolve().then(() => {
console.log('Promise 3 (from setTimeout)');
});
}, 0);
Promise.resolve().then(() => {
console.log('Promise 1');
}).then(() => {
console.log('Promise 2');
});
setTimeout(() => {
console.log('setTimeout 2');
}, 0);
console.log('End');
A: The output will be:
Start
End
Promise 1
Promise 2
setTimeout 1
Promise 3 (from setTimeout)
setTimeout 2
Reasoning:
Synchronous Code Execution:
- console.log('Start'); executes first.
- The first setTimeout(() => { ... }, 0); call is encountered. Its callback is placed in the Macrotask Queue.
- The Promise.resolve().then(() => { ... }); call is encountered. The first .then() callback (console.log('Promise 1')) is placed in the Microtask Queue.
- The second setTimeout(() => { ... }, 0); call is encountered. Its callback is placed in the Macrotask Queue (after the first setTimeout callback).
- console.log('End'); executes.
- At this point, the Call Stack is empty.
Draining the Microtask Queue:
- The Event Loop checks the Microtask Queue. It finds the callback that prints 'Promise 1'.
- This microtask executes, printing 'Promise 1'.
- Crucially, .then(() => { console.log('Promise 2'); }); is chained to the first promise. When the first then callback completes, the second then’s callback (console.log('Promise 2')) is added to the Microtask Queue.
- The Event Loop continues to drain the Microtask Queue. It finds the callback that prints 'Promise 2'.
- This microtask executes, printing 'Promise 2'.
- Now the Microtask Queue is empty.
Processing Macrotasks:
- The Event Loop checks the Macrotask Queue. It picks the first macrotask: the callback from the first setTimeout.
- This macrotask executes: console.log('setTimeout 1'); prints 'setTimeout 1'. Then Promise.resolve().then(() => { console.log('Promise 3 (from setTimeout)'); }); is encountered, and its callback is placed in the Microtask Queue.
- The current macrotask finishes. The Call Stack is empty.
Draining Microtask Queue (again):
- Before picking the next macrotask, the Event Loop again drains the Microtask Queue. It finds the callback that prints 'Promise 3 (from setTimeout)'.
- This microtask executes, printing 'Promise 3 (from setTimeout)'.
- Now the Microtask Queue is empty.
Processing Next Macrotask:
- The Event Loop checks the Macrotask Queue. It picks the next macrotask: the callback from the second setTimeout.
- This macrotask executes, printing 'setTimeout 2'.
- The Macrotask Queue is now empty. The cycle continues, but no more tasks are pending.
Key Points:
- Synchronous code runs first.
- Microtasks are prioritized and fully drained after each macrotask and after the initial synchronous execution.
- Chained .then() calls schedule subsequent callbacks as new microtasks.
Common Mistakes:
- Assuming setTimeout callbacks run before promise callbacks because their delay is 0.
- Not realizing that microtasks scheduled during a macrotask are drained immediately after that macrotask completes, before the next macrotask runs.
Follow-up:
- What if a queueMicrotask() call was added? Where would it fit?
- How would this output change if the setTimeout calls had a delay of 100ms?
5. Memory Management in Asynchronous Contexts
Q: Discuss potential memory leaks in asynchronous JavaScript, especially concerning closures and long-running operations. How can you prevent them?
A: Memory leaks in JavaScript occur when objects are no longer needed but are still referenced, preventing the garbage collector from reclaiming their memory. Asynchronous operations, especially when combined with closures, can inadvertently create these persistent references.
Common Scenarios for Async Memory Leaks:
Event Listeners on Elements that are Removed:
- If you attach an event listener (e.g., click, scroll) to a DOM element and then remove that element from the DOM without removing the listener, the callback function (and its closure scope) may still hold a reference to the element, preventing its garbage collection.
- Prevention: Always remove event listeners using removeEventListener when the element or component is unmounted/destroyed. Use AbortController for easier cleanup of multiple listeners or fetch requests.
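The AbortController approach can be sketched as a small helper (createListenerGroup is a hypothetical name; the signal option to addEventListener is standard in modern browsers and Node.js):

```javascript
// One AbortController detaches every listener registered through the group.
function createListenerGroup() {
  const controller = new AbortController();
  return {
    add(target, type, handler) {
      // Listeners registered with a signal are removed when it aborts.
      target.addEventListener(type, handler, { signal: controller.signal });
    },
    dispose() {
      controller.abort(); // removes all listeners in one call, no bookkeeping
    },
  };
}

// Usage sketch: a component registers listeners on mount...
//   const group = createListenerGroup();
//   group.add(button, 'click', onClick);
//   group.add(window, 'resize', onResize);
// ...and calls group.dispose() on unmount, releasing every reference at once.
```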
Closures Capturing Large Scopes:
- A common pattern in async JS is callbacks that form closures, capturing variables from their outer scope. If a callback is scheduled for a long-running or indefinite period (e.g., setInterval, a WebSocket listener, an unresolved Promise chain) and it captures a large object or a reference to a parent component instance, that object/component might never be garbage collected, even if it’s logically “out of use.”
- Prevention:
  - Be mindful of captured variables: Only capture what’s strictly necessary in closures.
  - Nullify references: Explicitly set references to null when they are no longer needed within the closure, especially for large objects.
  - Clear timers/listeners: Ensure clearInterval, clearTimeout, socket.close(), or similar cleanup functions are called when the component or scope that set them up is destroyed.
Unresolved Promises/Async Operations:
- If you kick off a fetch request or any Promise-based async operation, but the Promise never settles (e.g., network issues, a hung server), the Promise object itself and its associated state (including any closures it holds) will remain in memory. While not a “leak” in the traditional sense if it eventually resolves or rejects, a large number of such perpetually pending operations can consume significant memory.
- Prevention:
  - Timeouts: Implement timeouts for network requests using AbortController or custom Promise race logic to ensure operations eventually settle.
  - Error Handling: Always include catch blocks or try...catch with async/await to handle rejections and prevent unhandled promise rejections (which can also signal memory issues if not addressed).
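The “custom Promise race logic” mentioned above can be sketched as a small wrapper (withTimeout is a hypothetical helper name):

```javascript
// Force any promise to settle within a deadline, so pending operations
// (and the closures they hold) cannot accumulate indefinitely.
function withTimeout(promise, ms) {
  let timer;
  const deadline = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`Timed out after ${ms}ms`)), ms);
  });
  // Whichever settles first wins; clearing the timer afterwards lets the
  // timeout callback be garbage collected.
  return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}

// Usage: an operation that never settles is forced to reject.
withTimeout(new Promise(() => {}), 100)
  .catch(err => console.log(err.message)); // Timed out after 100ms
```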
Global Variables/Caches:
- Asynchronous operations might populate global caches or variables. If these caches grow indefinitely without a clear eviction strategy or are not cleared when their data becomes stale, they can lead to memory bloat.
- Prevention: Implement clear caching strategies (LRU, time-based expiration) and ensure global references are managed.
General Prevention Strategies:
- Weak References (e.g., WeakMap, WeakSet): These allow you to store objects without preventing them from being garbage collected if no other strong references exist. Useful for caching or metadata associated with objects.
- Module-level cleanup: For library or framework components, provide explicit destroy() or cleanup() methods.
- Profiling Tools: Use browser developer tools (Memory tab, Performance tab) to identify and analyze memory usage patterns and pinpoint leaks.
Key Points:
- Memory leaks in async JS often stem from persistent references within closures or uncleaned resources.
- Event listeners, long-running timers, and unresolved promises are common culprits.
- Proactive cleanup (removing listeners, clearing timers, handling promise rejections) is essential.
Common Mistakes:
- Forgetting to call removeEventListener.
- Not implementing timeouts for network requests.
- Allowing closures to capture unnecessarily large scopes.
Follow-up:
- How can AbortController help with memory management in fetch requests?
- When would you use WeakMap or WeakSet in an asynchronous application?
- Describe a scenario where a setInterval could cause a memory leak, and how to fix it.
6. JavaScript Streams (Readable/Writable)
Q: What are JavaScript Streams, and why are they important in modern web development and Node.js? Provide a basic example of using a ReadableStream.
A: JavaScript Streams (specifically ReadableStream, WritableStream, TransformStream as per the Web Streams API and similar concepts in Node.js) are interfaces for handling data in chunks, rather than waiting for the entire data to be available. This allows for efficient, non-blocking processing of large amounts of data, improving performance and memory usage.
Why they are important:
- Efficiency: Process data as it arrives, reducing memory footprint by not loading entire files/responses into memory at once. This is crucial for large files (e.g., video, large JSON, CSV) or continuous data feeds.
- Performance: Start processing earlier, leading to faster perceived load times and responsiveness, especially in network operations.
- Backpressure Handling: Streams can manage the flow of data between a source and a destination, preventing the producer from overwhelming the consumer (and vice-versa).
- Composability: Streams are highly composable. You can pipe the output of one stream directly into the input of another, creating complex data pipelines with ease.
- Universal API: The Web Streams API is designed to be consistent across various environments (browsers, Service Workers, Node.js via polyfills or native implementations).
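The composability point can be illustrated with a TransformStream that upper-cases text chunks (a sketch assuming Web Streams globals, i.e., a browser or Node.js 18+):

```javascript
// A TransformStream receives chunks, transforms them, and re-emits them.
const upperCaser = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  },
});

// A small in-memory source to pipe through the transform.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('hello ');
    controller.enqueue('streams');
    controller.close();
  },
});

// Drain a readable stream into a single string.
async function collect(stream) {
  const reader = stream.getReader();
  let out = '';
  let result;
  while (!(result = await reader.read()).done) out += result.value;
  return out;
}

collect(source.pipeThrough(upperCaser)).then(console.log); // HELLO STREAMS
```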
Basic Example of using a ReadableStream (Web Streams API):
Imagine fetching a large text file. Instead of waiting for the entire file to download, we can read and process it chunk by chunk:
async function processLargeTextFile(url) {
try {
const response = await fetch(url);
if (!response.ok) {
throw new Error(`HTTP error! status: ${response.status}`);
}
const reader = response.body.getReader(); // Get a ReadableStreamDefaultReader
let result;
let receivedChunks = [];
console.log('Starting to read stream...');
// Read until the stream is done
while (!(result = await reader.read()).done) {
const chunk = result.value; // chunk is a Uint8Array
receivedChunks.push(chunk);
console.log(`Received chunk of ${chunk.length} bytes. Total received: ${receivedChunks.reduce((acc, c) => acc + c.length, 0)} bytes`);
// In a real application, you might process the chunk here, e.g., display partial data, save to disk
}
console.log('Stream finished. All chunks received.');
const totalContent = new TextDecoder().decode(
receivedChunks.reduce((acc, chunk) => {
const temp = new Uint8Array(acc.length + chunk.length);
temp.set(acc);
temp.set(chunk, acc.length);
return temp;
}, new Uint8Array(0))
);
// console.log('Full content:', totalContent.substring(0, 100) + '...'); // Log first 100 chars
return totalContent;
} catch (error) {
console.error('Error processing stream:', error);
}
}
// Example usage (assuming '/large-text-file.txt' exists and serves text)
// processLargeTextFile('/large-text-file.txt');
In this example, response.body is a ReadableStream. We get a reader from it and then repeatedly call reader.read() to get chunks of data. The while loop continues until result.done is true, indicating the end of the stream. Each chunk is a Uint8Array.
Key Points:
- Streams process data in chunks, not all at once.
- Crucial for memory and performance with large datasets.
- ReadableStream, WritableStream, and TransformStream are the core types.
- The Web Streams API is becoming a standard across browser and Node.js environments.
Common Mistakes:
- Trying to access response.text() or response.json() after getting a reader, as the stream can only be consumed once.
- Not handling potential errors during stream processing.
Follow-up:
- What is backpressure in the context of streams and how is it handled?
- When would you use a TransformStream?
- How do Node.js streams differ from Web Streams, and what are common patterns like pipe()?
7. Event Loop and UI Rendering
Q: How does the Event Loop interact with browser UI rendering? Consider a scenario where a long-running synchronous script blocks the UI. How can you mitigate this using asynchronous patterns?
A: The browser’s Event Loop is responsible for more than just JavaScript execution; it also handles UI rendering, user input events, network events, and more. UI rendering is typically scheduled as a macrotask.
Interaction: After the JavaScript engine’s Call Stack is empty and the Microtask Queue is drained, the Event Loop checks the Macrotask Queue. Among the macrotasks, there’s often a “rendering task” that the browser needs to perform (e.g., updating the layout, painting pixels, handling animations). If the Call Stack is continuously busy with JavaScript execution, the Event Loop cannot pick up this rendering task, leading to a frozen UI.
Scenario: Long-running Synchronous Script Blocking UI:
function performHeavyCalculation() {
let result = 0;
for (let i = 0; i < 10_000_000_000; i++) { // 10 billion iterations
result += Math.sqrt(i);
}
return result;
}
document.getElementById('startBtn').addEventListener('click', () => {
document.getElementById('status').textContent = 'Calculating...';
const startTime = Date.now();
const calculationResult = performHeavyCalculation(); // This blocks the main thread
const endTime = Date.now();
document.getElementById('status').textContent = `Calculation complete: ${calculationResult} in ${endTime - startTime}ms`;
});
In this scenario, when startBtn is clicked, the performHeavyCalculation() function runs synchronously. While it’s running, the JavaScript engine is fully occupied. The browser cannot update the status text to “Calculating…” until performHeavyCalculation() finishes, because the UI rendering task cannot be picked from the Macrotask Queue. The UI will appear frozen, unresponsive to clicks, and no visual updates will occur until the loop completes.
Mitigation using Asynchronous Patterns:
1. setTimeout(..., 0) to yield to the Event Loop: Break up the long-running task into smaller chunks and schedule them with setTimeout. This allows the Event Loop to process other tasks (like UI rendering) between chunks.

function performHeavyCalculationAsync(iterations, callback) {
  let result = 0;
  let i = 0;
  const chunkSize = 1_000_000; // Process 1 million iterations at a time

  function processChunk() {
    const end = Math.min(i + chunkSize, iterations);
    for (; i < end; i++) {
      result += Math.sqrt(i);
    }
    if (i < iterations) {
      // Yield to the Event Loop, allowing UI updates
      setTimeout(processChunk, 0);
    } else {
      callback(result);
    }
  }

  processChunk();
}

document.getElementById('startBtn').addEventListener('click', () => {
  document.getElementById('status').textContent = 'Calculating...';
  const startTime = Date.now();
  performHeavyCalculationAsync(10_000_000_000, (calculationResult) => {
    const endTime = Date.now();
    document.getElementById('status').textContent = `Calculation complete: ${calculationResult} in ${endTime - startTime}ms`;
  });
});

This version updates the UI to “Calculating…” almost immediately and allows the browser to remain responsive during the calculation.
2. Web Workers (for truly CPU-intensive tasks): For very heavy computations, the best approach is to offload them to a Web Worker. Web Workers run in a separate thread, completely isolated from the main UI thread. They communicate with the main thread via message passing.

// worker.js
onmessage = function(e) {
  const iterations = e.data;
  let result = 0;
  for (let i = 0; i < iterations; i++) {
    result += Math.sqrt(i);
  }
  postMessage(result);
};

// main.js
document.getElementById('startBtn').addEventListener('click', () => {
  document.getElementById('status').textContent = 'Calculating via Web Worker...';
  const startTime = Date.now();
  const worker = new Worker('worker.js');
  worker.postMessage(10_000_000_000); // Send data to worker
  worker.onmessage = function(e) {
    const calculationResult = e.data;
    const endTime = Date.now();
    document.getElementById('status').textContent = `Calculation complete: ${calculationResult} in ${endTime - startTime}ms`;
    worker.terminate(); // Clean up the worker
  };
  worker.onerror = function(error) {
    console.error('Worker error:', error);
    document.getElementById('status').textContent = 'Calculation failed!';
  };
});

Web Workers are the preferred solution for heavy computations, as they ensure the main thread remains fully responsive.
Key Points:
- UI rendering is a macrotask.
- Long synchronous JS blocks the Event Loop, preventing UI updates.
- Use setTimeout(..., 0) to break up tasks and yield to the Event Loop.
- Use Web Workers for truly CPU-intensive computations to keep the main thread free.
Common Mistakes:
- Forgetting that setTimeout(..., 0) still adds to the macrotask queue; if the chunks are too large, the page can still jank.
- Trying to directly manipulate the DOM from a Web Worker (which is not allowed).
Follow-up:
- What are the limitations of Web Workers?
- When would you choose requestAnimationFrame for asynchronous UI updates versus setTimeout?
- How can IntersectionObserver or ResizeObserver contribute to non-blocking UI interactions?
8. async/await Error Handling and Pitfalls
Q: How do you properly handle errors in async/await code? Illustrate a common pitfall where errors might be missed and how to correct it.
A: Error handling in async/await is primarily done using try...catch blocks, similar to synchronous code. An await expression will “unwrap” a rejected Promise into a thrown exception, which can then be caught.
Basic Error Handling:
async function fetchData(url) {
try {
const response = await fetch(url);
if (!response.ok) {
throw new Error(`HTTP error! Status: ${response.status}`);
}
const data = await response.json();
return data;
} catch (error) {
console.error("Failed to fetch data:", error.message);
// Re-throw or return a default value, depending on requirements
throw error;
}
}
// Usage
fetchData('/api/data').then(data => {
console.log('Data received:', data);
}).catch(err => {
console.error('Caught error outside async function:', err.message);
});
Common Pitfall: Unhandled Promise Rejections in Parallel await with Promise.all
When using Promise.all with async/await to run multiple promises in parallel, if one of the promises rejects, Promise.all will immediately reject. If this rejection is not caught, it can lead to an unhandled promise rejection.
async function fetchMultipleResourcesProblematic() {
try {
const promise1 = fetch('/api/resource1').then(res => res.json());
const promise2 = Promise.reject(new Error('Resource 2 failed intentionally')); // This promise will reject
const promise3 = fetch('/api/resource3').then(res => res.json());
// Pitfall: If promise2 rejects, Promise.all will reject, and the catch block below
// might not be sufficient if you want to know individual failures and still process successes.
const [data1, data2, data3] = await Promise.all([promise1, promise2, promise3]);
console.log('All data:', { data1, data2, data3 });
return { data1, data2, data3 };
} catch (error) {
console.error('Caught error in fetchMultipleResourcesProblematic:', error.message);
// This only catches the *first* rejection from Promise.all
throw error;
}
}
// Calling the problematic function:
fetchMultipleResourcesProblematic()
.catch(err => console.error('Outer catch for problematic:', err.message));
// Output: "Caught error in fetchMultipleResourcesProblematic: Resource 2 failed intentionally"
The pitfall here is that Promise.all “fails fast.” If promise2 rejects, the catch block in fetchMultipleResourcesProblematic will catch that specific rejection. However, if promise1 and promise3 also eventually resolve or reject, their outcomes are lost. If the goal was to get all outcomes, even failures, Promise.all is the wrong tool.
Correction: Using Promise.allSettled for Independent Error Handling
To get the outcome of all promises, regardless of their success or failure, Promise.allSettled is the correct choice. It never rejects itself, always resolving with an array of status objects.
async function fetchMultipleResourcesCorrect() {
const promise1 = fetch('/api/resource1').then(res => res.json());
const promise2 = Promise.reject(new Error('Resource 2 failed intentionally'));
const promise3 = fetch('/api/resource3').then(res => res.json());
// Use Promise.allSettled to get outcomes of all promises
const results = await Promise.allSettled([promise1, promise2, promise3]);
console.log('All results:', results);
const successfulData = results
.filter(result => result.status === 'fulfilled')
.map(result => result.value);
const failedReasons = results
.filter(result => result.status === 'rejected')
.map(result => result.reason.message); // Access reason.message if it's an Error object
if (failedReasons.length > 0) {
console.warn('Some resources failed:', failedReasons);
}
return { successfulData, failedReasons };
}
// Calling the corrected function:
fetchMultipleResourcesCorrect()
.then(data => console.log('Processed results:', data))
.catch(err => console.error('Outer catch for corrected (should not happen if allSettled is used correctly):', err.message));
In this corrected version, Promise.allSettled ensures that all promises run to completion and reports each outcome. Since Promise.allSettled never rejects, the outer catch block for fetchMultipleResourcesCorrect would only be triggered by an error occurring outside the Promise.allSettled call. One caveat: do not attach a `.catch()` to the individual promises before passing them to Promise.allSettled. A `.catch()` that returns a value converts the rejection into a fulfillment, so the result's status would read `'fulfilled'` and the failure would never show up in the `'rejected'` filter.
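The same pattern can be exercised without a network by substituting timer-based stand-ins for the fetch calls. A minimal runnable sketch; `delay` and the resource values are illustrative:

```javascript
// Stand-ins for the fetch calls above; delays and payloads are arbitrary.
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));

async function fetchAllOutcomes() {
  const results = await Promise.allSettled([
    delay(10, { id: 1 }),                           // fulfills
    Promise.reject(new Error('Resource 2 failed')), // rejects
    delay(20, { id: 3 })                            // fulfills
  ]);

  const successful = results
    .filter(r => r.status === 'fulfilled')
    .map(r => r.value);
  const failed = results
    .filter(r => r.status === 'rejected')
    .map(r => r.reason.message);

  return { successful, failed };
}

fetchAllOutcomes().then(({ successful, failed }) => {
  console.log(successful); // the two fulfilled values, in input order
  console.log(failed);     // ['Resource 2 failed']
});
```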
Key Points:
- Use `try...catch` for error handling within `async` functions.
- `await` unwraps rejected Promises into thrown exceptions.
- `Promise.all` "fails fast"; use `Promise.allSettled` when you need all outcomes.
- With `Promise.allSettled`, inspect each result's `status` rather than attaching `.catch()` to the individual promises, which would mask their rejections.
Common Mistakes:
- Not wrapping `await` calls in `try...catch` blocks.
- Using `Promise.all` when `Promise.allSettled` is more appropriate for scenarios where partial success is acceptable.
- Not understanding how `Promise.all`'s "fail-fast" behavior impacts error reporting.
Follow-up:
- How would you handle an `async` function that doesn't explicitly `return Promise.reject()` or throw an error, but implicitly rejects?
- Discuss global unhandled promise rejection handlers (the `unhandledrejection` event).
- Can you mix `async`/`await` with traditional `.then()`/`.catch()`? When would you?
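On the second follow-up: in browsers the hook is the window's `unhandledrejection` event; in Node.js it is the `unhandledRejection` event on `process`. A minimal Node-flavored sketch (the error message is illustrative):

```javascript
// Last-resort logging for Promise rejections that no .catch()/try...catch handles.
// In production you would typically log the reason and decide whether to shut down gracefully.
process.on('unhandledRejection', (reason) => {
  console.error('Unhandled rejection:', reason instanceof Error ? reason.message : reason);
});

// A rejection with no handler attached; without the listener above,
// modern Node would terminate the process with a non-zero exit code.
Promise.reject(new Error('nobody caught me'));
```

The handler should be a safety net, not a substitute for local error handling; each async workflow should still catch its own failures.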
9. Deferred Resolution with Promise Constructor Anti-Pattern
Q: You are asked to wrap an old callback-based API with a Promise. Consider the following implementation. Identify the anti-pattern and explain why it’s problematic. How would you refactor it?
// Old callback-based API
function oldApiCall(data, callback) {
setTimeout(() => {
if (data === 'error') {
callback(new Error('Simulated API error'));
} else {
callback(null, `Processed: ${data}`);
}
}, 100);
}
// Promise wrapper (Problematic)
function problematicPromiseWrapper(data) {
let resultPromise = new Promise((resolve, reject) => {
oldApiCall(data, (error, response) => {
if (error) {
reject(error);
} else {
resolve(response);
}
});
});
return resultPromise; // This is the anti-pattern
}
A: The problematic anti-pattern here is creating a “deferred” Promise using new Promise() where it’s not strictly necessary, specifically when the resolve and reject functions are exposed or used outside the immediate new Promise constructor callback. While the provided example doesn’t fully expose them, the structure hints at a common mistake where a new Promise is created and then its resolve/reject functions are passed around or stored, rather than being used immediately within the constructor.
Why it’s problematic (the “Deferred Anti-Pattern” / “Explicit Promise Construction Anti-Pattern”):
The core issue is that new Promise() is meant for promisifying callback-based APIs or creating a Promise for an operation that starts immediately within the constructor’s executor function. When you create a Promise and then decide when to resolve/reject it (often by storing resolve/reject in external variables), you’re essentially recreating the “deferred” pattern that Promises were designed to replace.
- Loss of Immutability: Promises are designed to be "settled once." Exposing `resolve`/`reject` makes it possible to accidentally call them multiple times or from unexpected places, violating the Promise contract.
- Harder to Debug: It obscures the flow of control and makes it harder to trace when and where a Promise will settle.
- Unnecessary Complexity: It adds boilerplate and increases the chance of bugs.
- Unpredictable Timing: If `oldApiCall` were synchronous, the Promise would settle synchronously during construction, so callers could observe a mix of synchronous and asynchronous behavior, which is confusing.
In the provided snippet, the anti-pattern is subtle: problematicPromiseWrapper calls oldApiCall immediately inside the `new Promise` executor, which is valid promisification. The label applies much more strongly when `resolve` and `reject` are stored outside the executor and called later, or when the wrapped operation is not truly asynchronous. Even so, for Node-style `(err, data)` callbacks, a dedicated utility such as `util.promisify` is usually cleaner than hand-rolling the wrapper.
How to Refactor it:
The correct way to promisify a callback-based API that follows the Node.js `(err, data)` convention is to call `resolve` and `reject` directly and immediately within the `new Promise` executor, exactly as the example already does. The refactored version is therefore structurally identical; the point is to keep the executor self-contained rather than deferring resolution or rejection to external code.
Refactored Version (most common and idiomatic way to promisify):
// Old callback-based API (remains the same)
function oldApiCall(data, callback) {
setTimeout(() => {
if (data === 'error') {
callback(new Error('Simulated API error'));
} else {
callback(null, `Processed: ${data}`);
}
}, 100);
}
// Correct Promise wrapper
function correctPromiseWrapper(data) {
return new Promise((resolve, reject) => {
oldApiCall(data, (error, response) => {
if (error) {
reject(error);
} else {
resolve(response);
}
});
});
}
// Usage with .then/.catch
correctPromiseWrapper('hello')
.then(result => console.log('Correct wrapper success:', result))
.catch(error => console.error('Correct wrapper error:', error.message));
correctPromiseWrapper('error')
.then(result => console.log('Correct wrapper success (should not happen):', result))
.catch(error => console.error('Correct wrapper error:', error.message));
// Usage with async/await
async function useCorrectWrapper() {
try {
const result1 = await correctPromiseWrapper('world');
console.log('Async/await success:', result1);
const result2 = await correctPromiseWrapper('error'); // This will throw
console.log('Async/await success (should not happen):', result2);
} catch (error) {
console.error('Async/await caught error:', error.message);
}
}
useCorrectWrapper();
The refactored version is essentially the same as the "problematic" one, which underlines the point: for simple promisification of callback APIs, this `new Promise` structure is the correct and widely accepted pattern. The anti-pattern warning applies to situations where `resolve` and `reject` escape the executor's scope, handing external code control over the Promise's state.
Perhaps a better example of the anti-pattern would be:
// True deferred anti-pattern
function trulyProblematicDeferred() {
let res, rej;
const p = new Promise((resolve, reject) => {
res = resolve; // Storing resolve/reject outside
rej = reject;
});
// Now res and rej can be called from anywhere, anytime, breaking Promise principles
setTimeout(() => res('Delayed success'), 100);
return p;
}
const myDeferredPromise = trulyProblematicDeferred();
myDeferredPromise.then(val => console.log('Deferred success:', val));
// This is problematic because `res` and `rej` are captured in a wider scope than the executor,
// so any code holding a reference to them can settle the promise at any time.
The original question's "problematic" label on `return resultPromise;` is misleading: the promise is constructed and settled entirely within the function's scope, which is idiomatic promisification. The anti-pattern is specifically about deferring resolution or rejection to code outside the immediate `new Promise` executor.
Key Points:
- `new Promise()` is for promisifying callback APIs or operations that start immediately.
- The "deferred anti-pattern" involves exposing `resolve`/`reject` outside the `new Promise` constructor.
- Avoid creating Promises whose resolution or rejection is controlled externally to the constructor's executor.
Common Mistakes:
- Misunderstanding the role of `new Promise()` and using it when `Promise.resolve()` or `Promise.reject()` would suffice.
- Accidentally creating a deferred pattern by passing `resolve`/`reject` outside the executor.
Follow-up:
- When is it appropriate to use `new Promise()`?
- How does `util.promisify` (in Node.js) help with this task?
- Can you describe another scenario where `new Promise()` might be misused?
MCQ Section
1. Event Loop Priority
**Q: Consider the following code:**

```javascript
console.log('A');
setTimeout(() => console.log('B'), 0);
Promise.resolve().then(() => console.log('C'));
console.log('D');
```

**What is the correct output order?**

A. A, B, C, D
B. A, D, B, C
C. A, D, C, B
D. A, C, D, B
Correct Answer: C
Explanation:
- `console.log('A')` and `console.log('D')` are synchronous and execute immediately.
- The `setTimeout` callback goes to the Macrotask Queue.
- The `Promise.resolve().then()` callback goes to the Microtask Queue.
- The Event Loop prioritizes the Microtask Queue over the Macrotask Queue after synchronous code completes, so 'C' runs before 'B'.
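The claimed order can be checked mechanically by recording each step into an array instead of logging it (the labels and delays are illustrative):

```javascript
const order = [];

order.push('A');
setTimeout(() => order.push('B'), 0);          // macrotask
Promise.resolve().then(() => order.push('C')); // microtask
order.push('D');

// By the time the macrotask runs, the microtask queue has already drained.
setTimeout(() => console.log(order.join(', ')), 10); // "A, D, C, B"
```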
2. async/await Parallelism
Q: You have two independent async operations, op1() and op2(), that each return a Promise. You want to execute them in parallel and wait for both to complete. Which of the following is the most efficient way using async/await?
A.
async function execute() {
const result1 = await op1();
const result2 = await op2();
return [result1, result2];
}
B.
async function execute() {
const promise1 = op1();
const promise2 = op2();
const result1 = await promise1;
const result2 = await promise2;
return [result1, result2];
}
C.
async function execute() {
const [result1, result2] = await Promise.all([op1(), op2()]);
return [result1, result2];
}
D.
async function execute() {
return Promise.all([await op1(), await op2()]);
}
Correct Answer: C
Explanation:
- A is sequential: `op2()` only starts after `op1()` completes, so the total wait is the sum of both durations.
- B does start `op1()` and `op2()` concurrently before awaiting, so the total wait is roughly the maximum of the two. Its drawback is subtler: if `promise2` rejects while `promise1` is still pending, that rejection has no handler yet and can surface as an unhandled promise rejection.
- C correctly initiates both operations concurrently and uses `Promise.all()` to wait for both, surfacing the first rejection immediately; it is the idiomatic and safest choice.
- D is incorrect because `await op1()` and `await op2()` execute sequentially before the array passed to `Promise.all` is even formed, defeating the purpose of parallelism.
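The timing difference between pattern A and pattern C can be observed directly with two timer-backed operations (the 50 ms delays are arbitrary):

```javascript
const delay = (ms, value) => new Promise(resolve => setTimeout(() => resolve(value), ms));
const op1 = () => delay(50, 'one');
const op2 = () => delay(50, 'two');

async function sequential() {       // pattern A: roughly the sum (~100 ms)
  const start = Date.now();
  await op1();
  await op2();
  return Date.now() - start;
}

async function parallel() {         // pattern C: roughly the max (~50 ms)
  const start = Date.now();
  await Promise.all([op1(), op2()]);
  return Date.now() - start;
}

(async () => {
  console.log('sequential:', await sequential(), 'ms');
  console.log('parallel:  ', await parallel(), 'ms');
})();
```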
3. Promise.allSettled Behavior
**Q: Given the following array of promises:**

```javascript
const promises = [
  Promise.resolve(1),
  Promise.reject('Error occurred'),
  new Promise(resolve => setTimeout(() => resolve(3), 50))
];
```

**What will be the result of `await Promise.allSettled(promises)`?**
A. The promise will reject with 'Error occurred'.
B. [ { status: 'fulfilled', value: 1 }, { status: 'rejected', reason: 'Error occurred' }, { status: 'fulfilled', value: 3 } ]
C. [ 1, undefined, 3 ]
D. The promise will resolve with [1, 'Error occurred', 3]
Correct Answer: B
Explanation:
- `Promise.allSettled()` waits for all promises to settle (either fulfilled or rejected) and never rejects itself.
- It returns an array of objects, each describing the status and value/reason of the corresponding promise.
- The order of results matches the order of the input promises.
4. this Binding in async/await
Q: Inside an async function, how does this behave?
A. this is always bound to the global object (window in browser, undefined in strict mode).
B. this is lexically scoped, similar to arrow functions, and captures this from the surrounding context.
C. this is dynamically bound based on how the async function is called.
D. this is always undefined within async functions.
Correct Answer: C
Explanation:
- `async` functions are regular functions (or methods), not arrow functions. Therefore, `this` inside an `async` function behaves like `this` in any other regular function: its value is determined by how the function is called.
- If called as a method (`obj.myAsyncMethod()`), `this` refers to `obj`.
- If called as a standalone function (`myAsyncFunction()`), `this` refers to the global object (or `undefined` in strict mode).
- If called with `call()`, `apply()`, or `bind()`, `this` is explicitly set.
- Arrow functions lexically bind `this`, meaning they capture `this` from their enclosing scope, but `async` functions do not inherently do this.
5. Microtasks within Macrotasks
**Q: What will be the output of the following code?**

```javascript
console.log('1');
setTimeout(() => {
  console.log('2');
  Promise.resolve().then(() => console.log('3'));
}, 0);
Promise.resolve().then(() => console.log('4'));
console.log('5');
```
A. 1, 2, 3, 4, 5
B. 1, 5, 2, 4, 3
C. 1, 5, 4, 2, 3
D. 1, 4, 5, 2, 3
**Correct Answer: C**
**Explanation:**
1. **Synchronous:** `console.log('1')` and `console.log('5')` execute.
2. **Queuing:** `setTimeout` callback goes to Macrotask Queue. `Promise.resolve().then(() => console.log('4'))` callback goes to Microtask Queue.
3. **Microtask Drain 1:** Call Stack is empty. Event Loop drains Microtask Queue. `console.log('4')` executes.
4. **Macrotask Execution 1:** Microtask Queue is empty. Event Loop picks the first macrotask (`setTimeout` callback). `console.log('2')` executes.
5. **Microtask Queuing within Macrotask:** Inside the `setTimeout` callback, `Promise.resolve().then(() => console.log('3'))` is encountered. Its callback goes to the Microtask Queue.
6. **Macrotask Completion:** The `setTimeout` callback finishes. Call Stack is empty.
7. **Microtask Drain 2:** Event Loop *again* drains the Microtask Queue. `console.log('3')` executes.
8. **Next Macrotask:** (None left in this example).
Output: `1`, `5`, `4`, `2`, `3`
## Mock Interview Scenario: Building a Real-time Dashboard
**Scenario Setup:**
You are interviewing for a Senior Frontend Engineer role. The interviewer presents a scenario:
"We need to build a real-time analytics dashboard. This dashboard will display various metrics, some of which are fetched from REST APIs, and others are streamed in real-time via a WebSocket connection. The dashboard should remain responsive even when data is heavy or network conditions are poor. You need to design the data fetching and processing architecture."
**Interviewer Questions (Sequential Flow):**
**Interviewer:** "Okay, let's start with the basics. How would you initially fetch the static dashboard configuration and initial data points from a REST API (`/api/config`, `/api/initial-metrics`) when the dashboard loads? Assume these two fetches are independent."
**Candidate (Expected Answer Structure):**
"I would use `async`/`await` combined with `Promise.all` to fetch both resources concurrently. This ensures the dashboard doesn't wait for one fetch to complete before starting the other, optimizing load time. I'd also include `try...catch` for error handling."
```javascript
async function loadDashboardData() {
try {
const [configResponse, metricsResponse] = await Promise.all([
fetch('/api/config'),
fetch('/api/initial-metrics')
]);
if (!configResponse.ok || !metricsResponse.ok) {
throw new Error('Failed to fetch initial dashboard data.');
}
const config = await configResponse.json();
const metrics = await metricsResponse.json();
console.log('Dashboard Config:', config);
console.log('Initial Metrics:', metrics);
return { config, metrics };
} catch (error) {
console.error('Error loading dashboard:', error);
// Display a user-friendly error message or fallback UI
throw error; // Re-throw to propagate the error if needed
}
}
// In your main app component:
// loadDashboardData().then(data => { /* render dashboard */ }).catch(err => { /* show error screen */ });
```
Interviewer: “Good. Now, some metrics come from a WebSocket (wss://api.example.com/metrics). How would you set up a listener for real-time updates, and critically, how would you ensure the dashboard remains responsive while processing a high volume of incoming messages?”
Candidate (Expected Answer Structure):
“I’d use the WebSocket API. To ensure responsiveness, I’d process incoming messages asynchronously and potentially batch updates, or debounce/throttle UI renders if the update rate is extremely high. For very heavy processing per message, I’d consider a Web Worker.”
let ws;
const metricUpdates = []; // Buffer for incoming metrics
let isProcessing = false;
function setupRealtimeMetrics() {
ws = new WebSocket('wss://api.example.com/metrics');
ws.onopen = () => console.log('WebSocket connected for metrics.');
ws.onmessage = (event) => {
metricUpdates.push(JSON.parse(event.data));
// Trigger processing, but don't block the main thread
if (!isProcessing) {
processMetricBuffer();
}
};
ws.onerror = (error) => console.error('WebSocket error:', error);
ws.onclose = () => console.log('WebSocket disconnected for metrics.');
}
async function processMetricBuffer() {
isProcessing = true;
while (metricUpdates.length > 0) {
const metric = metricUpdates.shift(); // Get one metric from the buffer
// Simulate heavy processing (e.g., complex chart update logic)
await new Promise(resolve => setTimeout(resolve, 5)); // Yield to event loop
console.log('Processed real-time metric:', metric);
// Update UI here (e.g., update a chart, a specific metric display)
// Consider using requestAnimationFrame for UI updates if visual
// or a debounced/throttled update function if data-driven.
}
isProcessing = false;
}
// Don't forget to call this to start:
// setupRealtimeMetrics();
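The "debounce UI renders" idea mentioned above can be sketched as a small helper; the `renderChart` function and the 20 ms wait are illustrative:

```javascript
// Collapse a burst of calls into one invocation, fired `wait` ms after the burst ends.
function debounce(fn, wait) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
}

// Hypothetical render function; in the dashboard it would redraw a chart.
let renders = 0;
const renderChart = debounce(() => { renders += 1; }, 20);

// A burst of rapid updates triggers only one render.
renderChart();
renderChart();
renderChart();

setTimeout(() => console.log('renders:', renders), 100); // renders: 1
```

Throttling (at most one call per interval) is the alternative when you need periodic updates even during a continuous stream, rather than waiting for it to pause.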
Interviewer: “Excellent. What if a user navigates away from the dashboard? How would you ensure all network requests and WebSocket connections are properly cleaned up to prevent memory leaks or unnecessary background activity?”
Candidate (Expected Answer Structure):
“For cleanup, I’d implement a lifecycle method (e.g., componentWillUnmount in React, disconnectedCallback in Web Components, or simply an explicit destroy() function).
- WebSocket: Call `ws.close()` to terminate the connection.
- Pending `fetch` requests: Use `AbortController`. When the component unmounts, I'd call `abortController.abort()` to cancel any in-flight `fetch` requests.
- Timers/Intervals: Clear any `setTimeout` or `setInterval` using `clearTimeout`/`clearInterval`.
- Event Listeners: Remove any DOM event listeners added manually using `removeEventListener`."
class DashboardComponent {
constructor() {
this.abortController = new AbortController();
this.signal = this.abortController.signal;
this.ws = null;
this.intervalId = null;
}
async init() {
// Fetch initial data, passing the signal for aborting
try {
const [configResponse, metricsResponse] = await Promise.all([
fetch('/api/config', { signal: this.signal }),
fetch('/api/initial-metrics', { signal: this.signal })
]);
// ... process responses
} catch (error) {
if (error.name === 'AbortError') {
console.log('Fetch aborted.');
} else {
console.error('Error fetching initial data:', error);
}
}
// Setup WebSocket
this.ws = new WebSocket('wss://api.example.com/metrics');
this.ws.onmessage = (event) => { /* process */ };
// ... other ws handlers
// Example: a recurring data poll
this.intervalId = setInterval(() => {
// fetch('/api/live-data', { signal: this.signal })
// ...
}, 5000);
}
destroy() {
console.log('Cleaning up DashboardComponent...');
// 1. Abort any pending fetch requests
this.abortController.abort();
// 2. Close WebSocket connection
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
this.ws.close();
}
// 3. Clear any intervals/timeouts
if (this.intervalId) {
clearInterval(this.intervalId);
}
// 4. (If any manual DOM listeners were added) remove them:
// document.removeEventListener('scroll', this.handleScroll);
}
}
// Usage:
// const dashboard = new DashboardComponent();
// dashboard.init();
// // Later, when navigating away or unmounting:
// dashboard.destroy();
Red Flags to Avoid:
- Blocking UI: Suggesting synchronous processing for high-volume data streams.
- Ignoring Error Handling: Not mentioning `try...catch` or `.catch()` for Promises.
- No Cleanup: Forgetting to close WebSockets or cancel pending requests.
- Misunderstanding `Promise.all` vs. `Promise.allSettled`: Using `Promise.all` when partial success is acceptable and all results are needed.
- Confusing Event Loop concepts: Incorrectly explaining microtask/macrotask priority.
Practical Tips
- Master the Event Loop: This is the absolute foundation. Use interactive tools (e.g., `loupe` by Philip Roberts, or the browser dev tools Performance tab) to visualize how synchronous code, microtasks, and macrotasks interact. Practice predicting outputs of complex snippets.
- Understand Promise States and Methods: Know `pending`, `fulfilled`, `rejected`. Be proficient with `Promise.all`, `race`, `any`, `allSettled`, and their specific use cases and error-handling behaviors.
- `async`/`await` is Syntactic Sugar: Remember it's built on Promises. This understanding helps in debugging and identifying performance pitfalls (like sequential `await` for independent operations).
- Practice Promisification: Be able to convert callback-based APIs into Promise-based ones using `new Promise()`.
- Error Handling is Crucial: Always think about `try...catch` with `async`/`await` and `.catch()` with Promises. Understand unhandled promise rejections and how to deal with them (e.g., the global `unhandledrejection` event).
- Consider Web Workers: For truly CPU-intensive tasks, know when and how to leverage Web Workers to keep the main thread free and the UI responsive.
- Resource Management: Think about cleanup. How do you close WebSockets, cancel `fetch` requests, clear timers, and remove event listeners when a component unmounts or a task is no longer needed? `AbortController` is your friend for `fetch` and other cancellable APIs.
- Stay Current: As of January 2026, ensure you're familiar with features like `Promise.any` (ES2021), Top-Level Await (ES2022), and `Promise.withResolvers` (ES2024) if applicable to your target environment.
Summary
Asynchronous JavaScript is a cornerstone of modern web development, enabling responsive applications and efficient resource utilization. This chapter has equipped you with a deep understanding of Promises, async/await, the Event Loop’s intricate mechanics (microtasks vs. macrotasks), and resource management strategies. From predicting code outputs in tricky scenarios to designing robust, real-time data architectures, the ability to reason about and implement asynchronous patterns is a hallmark of an expert JavaScript developer. Continuous practice, especially with edge cases and performance considerations, will solidify your expertise.
Next Steps:
- Implement the mock interview scenario code yourself.
- Experiment with `AbortController` in `fetch` and other cancellable APIs.
- Try building a simple Web Worker to offload a heavy computation.
- Challenge yourself with more complex Event Loop puzzles found on platforms like JSConf EU talks or specialized blogs.
References:
- MDN Web Docs - Concurrency model and the Event Loop: https://developer.mozilla.org/en-US/docs/Web/JavaScript/EventLoop
- MDN Web Docs - Using Promises: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
- MDN Web Docs - async function: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function
- MDN Web Docs - Web Workers API: https://developer.mozilla.org/en-US/docs/Web/API/Web_Workers_API
- MDN Web Docs - Web Streams API: https://developer.mozilla.org/en-US/docs/Web/API/Streams_API
- JavaScript Visualizer (Loupe by Philip Roberts): http://latentflip.com/loupe/ (Excellent for understanding the Event Loop visually)
This interview preparation guide is AI-assisted and reviewed. It references official documentation and recognized interview preparation resources.