Handling a large number of promises in JavaScript

In my bot for Splinterlands, I ran into the problem of dealing with a large number of node-fetch requests. There are usually around 50 of them; when run in parallel, each returns an array of battles, which I combine afterwards. On a low-bandwidth network, firing 50 requests at once becomes difficult and the overall operation gets canceled. So I scoured the net for a way to run them sequentially. It took a while, but the work got done. This post shows how I used both approaches, parallel and sequential, to get the job done efficiently.
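To make the setting concrete, here is a rough sketch of what one such request looks like in the bot. The endpoint and query parameter are hypothetical and only stand in for the real Splinterlands API calls:

const fetch = require('node-fetch');

// Hypothetical endpoint; the real bot hits the Splinterlands API.
const fetchBattles = (player) =>
  fetch(`https://api.example.com/battles?player=${player}`)
    .then((res) => res.json());

// The bot has to run roughly 50 of these and combine the resulting battle arrays.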

An example

Let's start with a function that acts as a stand-in for a node-fetch call. It resolves after a random delay of up to three seconds, just like a real request takes a variable amount of time.

const fetchProxy = () => new Promise((resolve) => {
  const randomWait = 3000 * Math.random(); // up to 3 seconds
  setTimeout(() => resolve(randomWait), randomWait);
});

Every async fetch is a Promise. Let's say we have an array whose elements we need to run fetchProxy on.

const arrayToWork = [...Array(50).keys()];

In parallel

To resolve them in parallel, we can use the map array method: it starts every fetchProxy call immediately, and Promise.all waits until all of them have resolved.

const startTime = Date.now(); // starting time

// Displays the time taken to resolve the whole array of promises.
// Always less than 3000 ms, the maximum delay in fetchProxy,
// because all 50 promises run at the same time.
Promise.all(arrayToWork.map(fetchProxy)).then(() => {
  console.log(Date.now() - startTime);
});
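A note on failures: Promise.all rejects as soon as any single request rejects, which is one way an entire batch can get canceled on a flaky connection. If a few failures are acceptable, Promise.allSettled is a possible alternative; this is only a sketch, and the rest of the post sticks with Promise.all.

Promise.allSettled(arrayToWork.map(fetchProxy)).then((results) => {
  // Each result is { status: 'fulfilled', value } or { status: 'rejected', reason }.
  const fulfilled = results.filter((r) => r.status === 'fulfilled');
  console.log(`${fulfilled.length} of ${results.length} requests succeeded`);
});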

Sequentially

And to resolve them sequentially, we use the Array.prototype.reduce method. The accumulator (memo) is a promise chain: each iteration appends .then(fetchProxy) to it, so every fetch starts only after the previous one has resolved.

arrayToWork
  .reduce((memo, i) => memo.then(fetchProxy), Promise.resolve())
  .then(() => console.log(Date.now() - startTime));
// Displays the time taken to resolve sequentially:
// always less than 50 * 3000 ms, i.e. 150,000 ms.
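If you prefer async/await, the same one-after-the-other behaviour can be written as a plain loop. This is just an equivalent sketch of the reduce chain above:

const runSequentially = async () => {
  const start = Date.now();
  for (const item of arrayToWork) {
    await fetchProxy(item); // the next call starts only after this one resolves
  }
  console.log(Date.now() - start);
};

runSequentially();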

What if we mix them both?

First, we need to split the array into smaller chunks; the chunk helper below does that for us. We then process the chunks sequentially and the requests inside each chunk in parallel: the best of both worlds.

const chunk = (arr, n) => {
  if (n <= 0) throw new Error('Second argument to chunk must be a positive integer');
  const result = [];
  let idx = 0;
  while (idx < arr.length) result.push(arr.slice(idx, idx += n));
  return result;
};
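For example, chunk([0, 1, 2, 3, 4], 2) returns [[0, 1], [2, 3], [4]].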

const chunkSize = 25; // number that reflects your bandwidth's capacity

chunk(arrayToWork, chunkSize)
  .reduce(
    (memo, pieceOfChunk) => memo.then(() => Promise.all(pieceOfChunk.map(fetchProxy))),
    Promise.resolve()
  )
  .then(() => console.log(Date.now() - startTime));
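The same combined behaviour can also be expressed with async/await, looping over the chunks and awaiting a Promise.all per chunk. Again, this is only an equivalent sketch:

const runChunked = async () => {
  const start = Date.now();
  for (const pieceOfChunk of chunk(arrayToWork, chunkSize)) {
    // All requests in this chunk run in parallel; the next chunk waits for them.
    await Promise.all(pieceOfChunk.map(fetchProxy));
  }
  console.log(Date.now() - start);
};

runChunked();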

Conclusion

Now it doesn't matter whether the number of calls is 50 or 1500: the array is always chunked up, and the chunks are processed one after another. Take a look at the file battles-data.js in my repo.

Check out my other posts written while creating my Splinterlands bot.

Here is the complete example, comparing all three approaches:

const chunk = (arr, n) => {
  if (n <= 0) throw new Error('Second argument to chunk must be a positive integer');
  const result = [];
  let idx = 0;
  while (idx < arr.length) result.push(arr.slice(idx, idx += n));
  return result;
};

const fetchProxy = () => new Promise((resolve) => {
  const randomWait = 300 * Math.random(); // up to 300 ms
  setTimeout(() => resolve(randomWait), randomWait);
});

const reqCount = 50;
const arrayToWork = [...Array(reqCount).keys()];
const startTime = Date.now();

// Parallel: everything starts at once.
Promise.all(arrayToWork.map(fetchProxy))
  .then(() => console.log({ 'Parallel ~ 300ms': Date.now() - startTime }));

// Sequential: one call at a time.
arrayToWork
  .reduce((memo, i) => memo.then(fetchProxy), Promise.resolve())
  .then(() => console.log({ [`Sequential ~ ${reqCount}*300ms`]: Date.now() - startTime }));

const chunkSize = 25; // a number that reflects your bandwidth's capacity

// Chunked: chunks run sequentially, requests inside a chunk run in parallel.
chunk(arrayToWork, chunkSize)
  .reduce(
    (memo, pieceOfChunk) => memo.then(() => Promise.all(pieceOfChunk.map(fetchProxy))),
    Promise.resolve()
  )
  .then(() => console.log({ [`Parallel and Sequential ~ < ${reqCount}/${chunkSize} * 300ms`]: Date.now() - startTime }));

