Sequential promise chains and promise chunking in JavaScript


When dealing with a group of promises, we usually want to run them in parallel to improve performance. Promise.all is our friend in this case: we collect our promises in an array and pass it to Promise.all.

const promises = [promise1, promise2, promise3];
Promise.all(promises).then(data => {
  console.log(data); // [result1, result2, result3]
})

But sometimes we can't run the promises concurrently, even if they belong to the same group. For example, if one promise depends on the result of the previous one, we must perform our async operations sequentially, one by one. Let's see how.

💡
Prerequisite: JavaScript Array.prototype.reduce and Promise
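One detail matters before we start: a promise begins executing the moment it is created, so an array of already-created promises cannot be run "one by one". Instead, we store functions that return promises (sometimes called tasks or thunks) and call them only when we are ready. A minimal sketch, using a made-up delay helper to simulate async work:

```javascript
// A stand-in for any real async operation: resolves with `value` after `ms` milliseconds
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

// Each element is a function — the promise is created only when the function is called
const tasks = [
  () => delay(30, 'first'),
  () => delay(20, 'second'),
  () => delay(10, 'third'),
];

tasks[0]().then(result => console.log(result)); // 'first'
```

All the examples below assume this shape: arrays of promise-returning functions rather than arrays of live promises.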

Sequential promises with a for loop

If you prefer simplicity or want more control, use a for loop to create the sequence.

async function executeSequentially(tasks) {
    const results = [];
    for (const task of tasks) {
        const result = await task(); // each task is a function that returns a promise
        results.push(result); // collect the result once the promise resolves
    }
    return results;
}

const tasks = [task1, task2, task3]; // each task returns a promise
executeSequentially(tasks).then(results => {
  console.log(results); // [result1, result2, result3]
})

Sequential promises with Array.prototype.reduce

If you prefer a functional approach, you can use Array.prototype.reduce. It is more declarative and has a built-in accumulator that is passed to each iteration with the fresh value from the previous callback. So if one promise depends on the result of the previous one, the reduce pattern may feel more intuitive.

async function executeSequentially(tasks) {
  
  const reducer = async (prevPromise, currTask) => {
    const prevResults = await prevPromise; // wait for everything before this point
    return currTask().then(result => prevResults.concat(result)); // only now start the next task
  }
  // we start with a resolved promise carrying an empty accumulator array
  const results = await tasks.reduce(reducer, Promise.resolve([]));

  return results;
}

const tasks = [task1, task2, task3]; // each task returns a promise
executeSequentially(tasks).then(results => {
  console.log(results); // [result1, result2, result3]
})
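When each step actually consumes the previous step's result (not just an accumulator of all results), the same reduce pattern works with the accumulator carrying the latest value forward. A sketch with hypothetical steps:

```javascript
// Each step is an async function that receives the previous step's result
const steps = [
  async prev => prev + 1,
  async prev => prev * 10,
  async prev => prev - 5,
];

// The accumulator promise threads the latest value through the chain
const chain = steps.reduce(
  (prevPromise, step) => prevPromise.then(step),
  Promise.resolve(1) // initial value fed to the first step
);

chain.then(result => console.log(result)); // ((1 + 1) * 10) - 5 = 15
```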

Promise chunking with Array.prototype.reduce

So far we've dealt with a single promise at a time: we waited for the previous one to finish before starting the next. Now let's try a slightly different situation. Suppose we have 10 posts that we want to write to a database, and that database allows at most 3 concurrent requests. We could use Promise.all to fire all the requests at once, but that would trigger a rate-limit error. So we need a solution that tackles both issues: keep the performance gain of Promise.all without exceeding the rate limit.

To achieve this, we will divide our 10 tasks into chunks, or batches, of 3, then run those batches one by one, using Promise.all within each batch.
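Splitting the flat list into batches can itself be done with a small helper. This is a sketch of one common approach (the helper name `chunk` is my own):

```javascript
// Split an array into subarrays of at most `size` elements
function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

console.log(chunk([1, 2, 3, 4, 5, 6, 7], 3)); // [[1, 2, 3], [4, 5, 6], [7]]
```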

async function executeSequentially(taskChunks) {
  
  const reducer = async (prevPromiseBatch, currTaskBatch) => {
    const prevResults = await prevPromiseBatch; // wait for the previous batch to finish
    return Promise.all(currTaskBatch.map(task => task())) // start the current batch in parallel
      .then(result => prevResults.concat(result));
  }
  
  const results = await taskChunks.reduce(reducer, Promise.resolve([]));

  return results;
}

const taskChunks = [
  [task1, task2, task3],
  [task4, task5, task6],
  [task7, task8, task9],
  [task10]
];

executeSequentially(taskChunks).then(results => {
  console.log(results); // [result1, result2, result3, result4, result5, result6, result7, result8, result9, result10]
})

You can use a for...of loop here too, but I personally prefer the Array.prototype.reduce method.
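For completeness, a for...of version of the chunked runner might look like this sketch (again assuming each chunk holds promise-returning functions, with a made-up delay helper standing in for real work):

```javascript
const delay = (ms, value) =>
  new Promise(resolve => setTimeout(() => resolve(value), ms));

// Each batch runs in parallel via Promise.all; batches run one after another
async function executeChunksSequentially(taskChunks) {
  const results = [];
  for (const batch of taskChunks) {
    const batchResults = await Promise.all(batch.map(task => task()));
    results.push(...batchResults);
  }
  return results;
}

const taskChunks = [
  [() => delay(10, 1), () => delay(5, 2)], // first batch, run concurrently
  [() => delay(5, 3)],                     // second batch starts after the first settles
];

const chunkedPromise = executeChunksSequentially(taskChunks);
chunkedPromise.then(results => {
  console.log(results); // [1, 2, 3]
});
```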