Recreating Async/Await with Generators and Promises

Async/await is such a neat way to handle asynchronous JavaScript functions that it’s easy to ignore what’s going on under the hood. After completing ‘JavaScript: The Hard Parts’ on Frontend Masters (which I highly recommend), I was inspired to recreate async/await using generators and promises. Before we dive into the recreation, there are a few JS concepts we should revise…

The event loop #

JavaScript is a single-threaded language, meaning we can’t have multiple pieces of code running at the same time. So how are we able to do asynchronous things like setTimeout or fetching data from APIs? Luckily, JS can talk to the browser, which is not single-threaded. This allows us to offload asynchronous tasks to the web browser’s APIs. But how do we handle the responses it sends back to us?

That’s where the event loop comes in.

Think about the order in which these console.logs will print:

    const timer = () => {
        setTimeout(() => console.log('Hello from timer'), 0)
        console.log("Hello from after the timer")
    }

    console.log("I'm probably going to be first")
    timer()
    console.log("But where do I print?")

Since the setTimeout calls its callback in 0ms, it should print immediately, right? Wrong.

The order of console logs actually appears like this:

    "I'm probably going to be first"
    "Hello from after the timer"
    "But where do I print?"
    "Hello from timer"

This can all be explained by the way JavaScript interacts with the browser.

The event loop is in charge of managing the call stack. When a browser API is used, the browser is responsible for keeping track of whether the task is complete. However, once a task is complete, the browser can’t just add its callback straight back onto the call stack. It must add it to a queue. This is called the macrotask or callback queue. Only once all the synchronous JavaScript has finished running, and the call stack is empty, will the event loop allow items from the macrotask queue to be added to the call stack. Whenever any browser API task finishes running, such as setTimeout, its callback is pushed to this queue.

So even if we have JS code that takes far longer to run than the actual setTimeout delay, the setTimeout callback cannot be called until all the JS is finished and the call stack is empty. A setTimeout callback will never be called before the time specified, but it could be called after a longer delay than what is specified.
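
For example, the 0ms timer below still has to wait for the blocking loop to finish (a minimal sketch using a synchronous busy-wait):

    setTimeout(() => console.log('Hello from timer'), 0)

    // Block the main thread synchronously for roughly half a second
    const start = Date.now()
    while (Date.now() - start < 500) {}

    console.log('Hello after blocking')
    // Prints 'Hello after blocking' first, then 'Hello from timer' roughly 500ms late, not 0ms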

Fetch & the promise object #

Fetch does work on two fronts: it does something in JavaScript, and it also interacts with the web browser.

What does fetch do in JS?

In JavaScript, fetch returns a promise. A Promise is an object whose value can change in the future. It will always be in one of three states: pending (still waiting), fulfilled (resolved successfully), or rejected (something went wrong).

You can think of onFulfilled as a hidden property on the promise object: an array of callbacks to execute once fulfilment occurs. Promise.prototype.then() adds callback functions to this onFulfilled array, so that these functions are called only when the promise is resolved.

Promise.prototype.catch() allows us to do a similar thing to .then(), but it handles the onRejected case instead.
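
As a minimal sketch (using a hand-made promise rather than fetch, and a made-up 100ms delay):

    const promise = new Promise((resolve, reject) => {
        // The promise starts out pending, then settles one way or the other
        setTimeout(() => resolve('some data'), 100)
    })

    promise
        .then(data => console.log('Fulfilled with:', data)) // runs once the promise is fulfilled
        .catch(error => console.log('Rejected with:', error)) // runs only if the promise is rejected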

What does fetch do in the browser?

The fetch API talks to the browser to set up the XMLHttpRequest (XHR), which is responsible for fetching the data. This is the asynchronous step. The browser is responsible for marking the request as complete. Once the request is complete, the promise object’s value will be reassigned to whatever was returned from the request.

The microtask queue

The microtask queue is where all promise-related tasks are enqueued. When a promise is ready, its then/catch/finally handlers are added to this queue. It is a different queue from the macrotask queue (or callback queue), and it is emptied before anything in the macrotask queue gets dequeued. This means that if both queues have items waiting, the callbacks attached to a fetch’s promise will be handled before callbacks associated with other web APIs like setTimeout.

For example, think about what would occur first in this code:

    setTimeout(() => console.log("From timer"), 0)
    
    fetch('https://jsonplaceholder.typicode.com/todos/1')
      .then(response => response.json())
      .then(jsonData => console.log(jsonData))
    
    console.log("Hello")

Because fetching data takes a bit of time, this prints something like:

    'Hello' // console.log from the main JS file
    'From timer' // setTimeout callback
    { userId: 1, id: 1, title: "delectus aut autem", completed: false } // Promise response

What if there was something that blocked the main JS file for a long time, so that the data had time to return before the JS finished running?

e.g.:

    setTimeout(() => console.log("From timer"), 0)
    
    fetch('https://jsonplaceholder.typicode.com/todos/1')
      .then(response => response.json())
      .then(jsonData => console.log(jsonData))
    
    blockForAWhile() // some long-running synchronous work (sketched below)
    
    console.log("Hello")

Then the output would look more like this:

    'Hello' // console.log from the main JS file
    { userId: 1, id: 1, title: "delectus aut autem", completed: false } // Promise response
    'From timer' // setTimeout callback

setTimeout is still the first thing called, so why is there a different order this time?

It is because of the event loop’s preference for the microtask queue. In the first example, the JS finished executing long before the promise object had been resolved. So the microtask queue was empty, but the callback queue already held the console.log from setTimeout, which meant it was added to the call stack first. But in the second example, our theoretical blocking function delayed the completion of the JS so much that the XHR had completed by the time the JS finished running. That meant both the microtask queue and the callback queue had items in them, and in this scenario the microtask queue must be emptied first.
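
blockForAWhile isn’t a real function, but a hypothetical version of it could be as crude as a synchronous busy-wait that spins long enough for the XHR to finish:

    // Hypothetical blocking function: spins synchronously for ~2 seconds,
    // long enough for the request to complete before the main JS finishes
    function blockForAWhile() {
        const start = Date.now()
        while (Date.now() - start < 2000) {}
    }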

Generators #

A generator function is a function whose execution can be paused and restarted. When a generator function is called, rather than executing the function body immediately, it returns an iterator object instead.

What is an iterator object?

An iterator is any object that follows the iterator protocol, i.e. it has a .next method that returns an object with two properties: value and done. value is the current value, and done is a boolean that is set to true when the iteration is complete. By calling .next repeatedly, you can iterate through a series of values. This is how a for...of loop works under the hood.
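
For instance, a hand-rolled object that follows the iterator protocol (not a generator, just a plain object) might look like this:

    // A hand-rolled iterator that produces 1, 2, 3 and then reports it is done
    const countToThree = {
        current: 1,
        next() {
            return this.current <= 3
                ? { value: this.current++, done: false }
                : { value: undefined, done: true }
        }
    }

    console.log(countToThree.next()) // { value: 1, done: false }
    console.log(countToThree.next()) // { value: 2, done: false }
    console.log(countToThree.next()) // { value: 3, done: false }
    console.log(countToThree.next()) // { value: undefined, done: true }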

So, since calling a generator function returns one of these iterator objects, we can step through the function by calling .next repeatedly, just as we would with any other iterator. But how does it know where to stop and start? That brings us to:

The yield keyword

Yield is similar to the return keyword. It completely pauses a generator function and determines where the function will start again when .next is called.
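
A tiny example of stepping through a generator:

    function* sayTwoThings() {
        yield 'first'
        yield 'second'
    }

    const iterator = sayTwoThings()
    console.log(iterator.next()) // { value: 'first', done: false } - paused at the first yield
    console.log(iterator.next()) // { value: 'second', done: false } - paused at the second yield
    console.log(iterator.next()) // { value: undefined, done: true } - the function has finished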

Imagine yield as a bookmark. It allows you to exit the function without losing your spot, and when you start executing the function again, you begin right where you left off. It is so powerful that you can even bookmark on the right-hand side of a variable assignment. When you do this, you can control what the variable is assigned to externally with the .next method. Not only is .next your control point for restarting the function, it also takes an argument, and that argument is fed back into the function.

So if you’re yielding on a variable assignment then passing an argument into .next when you restart again, you’re causing that variable to be assigned to whatever argument you pass into next.

This is a super powerful way to work with a function interactively, e.g.:

    function* addSomethingTo10() {
        const starting = 10
        const inputNum = yield 'What num to add to 10?'
        return starting + inputNum
    }
    
    const generator = addSomethingTo10()
    const howMuch = generator.next()
    console.log(howMuch) // { value: 'What num to add to 10?', done: false }
    const result = generator.next(5) // 5 gets assigned to inputNum inside the generator
    console.log(result) // { value: 15, done: true }

Whatever you pass into generator.next() will dictate what number is added to 10.

Recreating async/await with generators #

Generators give us the ability to enter and exit code as we please. This sounds like something that could be useful when we are waiting for an API to respond!

The async/await syntax is so neat it looks almost like magic:

    async function usingAsync() {
        const res = await fetch('https://jsonplaceholder.typicode.com/todos/1')
        const json = await res.json()
        console.log("json from async await: ", json)
    }
    
    usingAsync()

But we can recreate what is going on under the hood by using a combination of generators and promises.

Let’s start with a generator function that looks suspiciously like the async await above:

    function* imitateAsync(){
        const res = yield fetch('https://jsonplaceholder.typicode.com/todos/1')
        const json = yield res.json() 
        console.log("Woohoo we got json: ", json)
    }

What is going on here is pretty close to what is going on above, except that instead of the magic await keyword we are using yield.

Like we spoke about before, yield acts as the bookmark of the function allowing us to stop and start it as we need, and those yield statements on the right side of variable declarations let us inject data back into the function. So all we need to do now is handle what data is being injected back in.

Let’s start the function like this:

    const generator = imitateAsync()
    const promise = generator.next() //First yield returns the promise object returned by fetch. 
    promise.value.then(data => doAfterFetchedData(generator, data))

The promise constant here is the result of the first yield, which is the promise object returned by the fetch API. Remember that because you’re using the .next method, it returns this promise wrapped in an iterator result object ({ value, done }). So you will need to access value on that object to get at the promise. Use .then() to add a callback to its onFulfilled array.
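
At this point, logging those values would show something like:

    console.log(promise)       // { value: Promise, done: false } - the iterator result object
    console.log(promise.value) // the (probably still pending) promise returned by fetch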

Now we need to handle what happens when our fetch api promise is resolved:

    function doAfterFetchedData(generator, data){
        const jsonPromise = generator.next(data) // Pass data from fetch back into generator
        jsonPromise.value.then((json) => generator.next(json)) // Once this promise is resolved, pass the json back to the generator
    }

It’s all about handling the promises outside the generator function and allowing the generator to continue only once the promise is resolved.

Altogether, it looks something like this:

    function* imitateAsync(){
        const res = yield fetch('https://jsonplaceholder.typicode.com/todos/1')
        const json = yield res.json() 
        console.log("Woohoo we got json: ", json)
    }
    
    function doAfterFetchedData(generator, data){
        const jsonPromise = generator.next(data)
        jsonPromise.value.then((json) => generator.next(json))
    }
    
    const generator = imitateAsync()
    const promise = generator.next()
    promise.value.then(data => doAfterFetchedData(generator, data))
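
If you don’t want to hand-write a handler per yield, the same idea can be generalised into a small runner that keeps resuming the generator until it is done. This is just a sketch of the pattern (not how async/await is actually implemented under the spec), but it drives imitateAsync the same way:

    function runAsync(generatorFunction) {
        const generator = generatorFunction()

        function step(previousResult) {
            // Resume the generator, feeding in the result of the previous promise
            const { value, done } = generator.next(previousResult)
            if (done) return
            // Wait for the yielded promise to resolve, then take the next step
            Promise.resolve(value).then(result => step(result))
        }

        step()
    }

    runAsync(imitateAsync)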

Try it out on CodePen!

See the Pen Recreating Async/await by Stacey (@quakerface) on CodePen.
