How to optimize CPU usage and memory consumption when generating large CSV files using Node.js and Sequelize?

I’m developing a Node.js application that generates CSV reports from data fetched using Sequelize from a PostgreSQL database. While the solution works well for smaller datasets, CPU usage spikes to 100% and memory consumption increases significantly when generating CSV files containing more than 100k records.
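For context, the rows are loaded with a Sequelize query along these lines (a simplified sketch; the Report model and the filter are placeholders, but the result has the count/rows shape that the function below consumes):

// Hypothetical query shape: the model name and the filter are placeholders.
// findAndCountAll resolves to { count, rows }, which is what createReport receives.
const response = await Report.findAndCountAll({
    where: { status: 'completed' } // placeholder filter
});

const result = await exports.createReport('report.csv', '/tmp/report.csv', response);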

Here’s a simplified version of the function responsible for creating the CSV report:

const fs = require('fs');
const fastcsv = require('fast-csv');

exports.createReport = (fileName, filePath, response) => {
    return new Promise((resolve, reject) => {
        const ws = fs.createWriteStream(filePath);

        const totalCount = response.count;

        ws.on('error', (error) => {
            reject(error);
        });

        // Flatten nested JSON columns into strings so each fits in a single CSV cell
        response.rows.forEach(row => {
            row.dataValues.offer_details = JSON.stringify(row.dataValues.offer_details);
            row.dataValues.allowances = JSON.stringify(row.dataValues.allowances);
            row.dataValues.failure_reason = JSON.stringify(row.dataValues.failure_reason);
        });

        const csvStream = fastcsv
            .format({ headers: true })
            .on('end', () => {
                console.log("CSV writing complete");
                resolve({
                    'fileName': fileName,
                    'filePath': filePath,
                    'totalRows': totalCount
                });
            });

        // Every row is pushed into the formatter before it is piped anywhere,
        // so the whole dataset is buffered in memory at once
        response.rows.forEach(row => {
            csvStream.write(row.dataValues);
        });

        csvStream.pipe(ws);
        csvStream.end(); // no more rows; lets the 'end' handler above fire
    });
};

However, this approach loads the entire result set into memory and writes every row into the CSV stream up front, which becomes inefficient for larger datasets: CPU usage hits 100% and memory consumption grows significantly. What optimizations can I implement to improve the performance of CSV generation, especially for datasets exceeding 100k records?
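For what it's worth, this is the general direction I've been considering: fetching rows in batches and honoring backpressure on the CSV stream instead of buffering everything up front. It is an untested sketch; the batch size, the model, and the 'id' ordering column are placeholders for whatever the real report model uses.

const fs = require('fs');
const fastcsv = require('fast-csv');

// Untested sketch: fetch rows in batches and respect backpressure instead of
// buffering the whole result set. Model, batchSize, and the 'id' column are placeholders.
exports.createReportBatched = async (filePath, Model, batchSize = 5000) => {
    const ws = fs.createWriteStream(filePath);
    const csvStream = fastcsv.format({ headers: true });
    csvStream.pipe(ws);

    let offset = 0;
    let totalRows = 0;

    while (true) {
        // raw: true returns plain objects instead of full model instances
        const rows = await Model.findAll({
            raw: true,
            limit: batchSize,
            offset,
            order: [['id', 'ASC']]
        });
        if (rows.length === 0) break;

        for (const row of rows) {
            row.offer_details = JSON.stringify(row.offer_details);
            row.allowances = JSON.stringify(row.allowances);
            row.failure_reason = JSON.stringify(row.failure_reason);

            // If the formatter's buffer is full, wait for 'drain' before writing more
            if (!csvStream.write(row)) {
                await new Promise(resolve => csvStream.once('drain', resolve));
            }
        }

        totalRows += rows.length;
        offset += batchSize;
    }

    csvStream.end();
    await new Promise((resolve, reject) => {
        ws.on('finish', resolve);
        ws.on('error', reject);
    });

    return { filePath, totalRows };
};

Is batching like this a reasonable direction, or is there a better pattern for this volume of data?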

Any suggestions/solutions/approaches would be greatly appreciated. Thank you!

The above works fine locally, even when the record count reaches a million. However, I'm using a microservice architecture in which a PHP service fetches the generated file from the Node.js app, and that also works fine for fewer than 100k records. The Node.js app is deployed on Kubernetes pods, and from console logging I can see that with more than 100k records the Sequelize query succeeds, but after that nothing is logged inside the file-generation function, no error is thrown, and CPU usage suddenly jumps to 100%.

I have also checked the nginx timeout and the file size limits; they are both fine.