In Node.js, is there a way to dynamically control the number of concurrent tasks based on available memory?

I am using a package called @supercharge/promise-pool in Node.js to run some tasks concurrently:

import { PromisePool } from '@supercharge/promise-pool';
import fs from 'fs';
import dicomParser from 'dicom-parser';

... other code ...


    await PromisePool.for(res[folder].filePaths)
      .withConcurrency(10)
      .process(async (item: any) => {
        const { fileFullPath, name } = item;
        try {
          // TODO: if 10 files larger than 200MB are processed in parallel at the same time, memory usage will exceed 2GB
          const filedata = await fs.promises.readFile(fileFullPath);
          const byteArray = new Uint8Array(filedata);
          const dicomDataSet = dicomParser.parseDicom(byteArray);

          // ... do some business logic

        } catch (error) {
          logger.warn(error);
        }
      });

Right now, the concurrency is fixed at 10, and this step:

const filedata = await fs.promises.readFile(fileFullPath);

will load the entire file from disk into memory.

If 10 files larger than 200MB are being processed concurrently, peak memory usage will exceed 2GB in the worst case (10 × 200MB = 2GB).

I am deploying the above code on a server node with only 2GB of memory, so this will crash the Node.js process.
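
I can read the relevant numbers at runtime with Node's built-in APIs (os.freemem(), os.totalmem(), process.memoryUsage()); what I don't know is how to feed them into the pool's concurrency:

import os from 'os';

// Built-in Node.js APIs that report memory at runtime.
console.log(os.totalmem());                  // total system memory, in bytes
console.log(os.freemem());                   // free system memory, in bytes
console.log(process.memoryUsage().rss);      // resident set size of this process, in bytes
console.log(process.memoryUsage().heapUsed); // V8 heap currently in use, in bytes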

In Node.js, is there a way to dynamically control the number of concurrent tasks based on available memory?

For example, incrementally adding more concurrent tasks while the available memory is larger than the size of the next file to be read?
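
Something like the sketch below is what I have in mind (not using promise-pool; MEMORY_BUDGET, MAX_WORKERS and handleFile are placeholder names I made up): a fixed set of workers pulls file paths from a queue, each worker checks the file size with fs.promises.stat, and readFile is only called once the file fits into a shared byte budget.

import fs from 'fs';

// Sketch only: MEMORY_BUDGET, MAX_WORKERS and handleFile are placeholders,
// not part of @supercharge/promise-pool or any other package.
const MEMORY_BUDGET = 1.5 * 1024 ** 3; // ~1.5GB, leaving headroom on a 2GB node
const MAX_WORKERS = 10;

async function processWithMemoryGate(
  filePaths: string[],
  handleFile: (filePath: string, data: Buffer) => Promise<void>,
) {
  const queue = [...filePaths];
  let inFlightBytes = 0;

  const worker = async () => {
    while (queue.length > 0) {
      const filePath = queue.shift();
      if (!filePath) break;

      // Ask the OS for the file size without reading the file.
      const { size } = await fs.promises.stat(filePath);

      // Wait until the file fits into the remaining byte budget.
      // When nothing is in flight, proceed anyway so a single oversized
      // file cannot stall the pool forever.
      while (inFlightBytes > 0 && inFlightBytes + size > MEMORY_BUDGET) {
        await new Promise<void>((resolve) => setTimeout(resolve, 100));
      }

      inFlightBytes += size;
      try {
        const data = await fs.promises.readFile(filePath);
        await handleFile(filePath, data);
      } finally {
        inFlightBytes -= size;
      }
    }
  };

  // Up to MAX_WORKERS tasks run at once, but the byte budget decides
  // how many of them actually hold file data in memory.
  await Promise.all(Array.from({ length: MAX_WORKERS }, () => worker()));
}

In my case, handleFile would wrap the dicomParser.parseDicom call and the business logic. But I would rather not hand-roll this if an existing pool library already supports it.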

Or are there any other packages that are capable of this?