WooCommerce products batch update limitation: how to update more than ~500 products?

I am creating a plugin that updates product stock and regular price (for now). I use a CSV file with three columns (product SKU, stock, and regular price) to batch update my website through the WC REST API batch endpoint (`home_url('/wp-json/wc/v3/products/batch')`). Everything works correctly, except that the batch update fails when my file contains roughly 500 to 600 products: I then get a `503 Backend fetch failed` error.

My approach is to send 50 products at a time as a JSON-formatted array of data (the limit indicated in the WC REST API documentation is 100 products per batch). I also used the filter from this SO question to increase the limit, but without success.

Below are the main methods of my stockUpdater class. I first upload the CSV file into the wp-content folder, then use AJAX to instantiate the class, whose methods below start the batch update process.
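For reference, the limit-increase filter I tried looks roughly like this (I am going from memory of the linked SO answer; the hook name `woocommerce_rest_batch_items_limit` and the second `$rest_base` argument are my recollection of WooCommerce's batch-limit filter, and the value 200 is just the number I picked):

```php
<?php
// Raise WooCommerce's REST batch items limit (default 100) for the
// products endpoint only. Registered from my plugin's main file.
add_filter( 'woocommerce_rest_batch_items_limit', function ( $limit, $rest_base ) {
    if ( 'products' === $rest_base ) {
        $limit = 200; // arbitrary higher cap I tried
    }
    return $limit;
}, 10, 2 );
```

This did not change the 503 behaviour for me, which makes me suspect the limit is not the real problem.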

    private function sendBatchRequest($data) {

        $ch = curl_init($this->apiUrl);
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POST           => true,
            CURLOPT_POSTFIELDS     => json_encode($data),
            CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
            CURLOPT_USERPWD        => $this->apiKey . ':' . $this->apiSecret,
            CURLOPT_TIMEOUT        => 60
        ]);

        $response  = curl_exec($ch);
        $httpCode  = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        $curlError = curl_error($ch);
        curl_close($ch);

        return [
            'success'   => ($httpCode >= 200 && $httpCode < 300),
            'response'  => ($response !== false) ? json_decode($response, true) : null,
            'error'     => $curlError,
            'http_code' => $httpCode
        ];
    }

    public function processStockFile() {

        $products = $this->parseCSV(); 
        /* parseCSV returns 

            $products[] = [
                'sku' => trim($data[0]),  // Product SKU
                'id'  => $id,    // Product id
                'stock' => !empty($data[1]) ? (int)trim($data[1]) : 0, // Stock quantity
                'price' => !empty($data[2]) ? wc_format_decimal(str_replace(',', '.', trim($data[2]))) : 0, // Regular price
            ];
        */
        $chunks = array_chunk($products, 50); // split into 50 products per batch

        $results = [];
        foreach ($chunks as $chunk) {
            $data = ['update' => []];
            foreach ($chunk as $product) {
                $data['update'][] = [
                    'id'             => $product['id'],
                    'sku'            => $product['sku'],
                    'stock_quantity' => $product['stock'],
                    'regular_price'  => $product['price'],
                ];
            }

            $results[] = $this->sendBatchRequest($data);
            // tried sleep(1) here as well, it did not change anything
        }

        return $results;

    }

I tried pausing between cURL requests (using PHP sleep() and usleep()) but again without success. Is there any way to increase this limit (what I tried did not work in my case)? Or alternatively, should I split the data further and only send the next chunk after the previous response succeeds? If so, what is the best way to proceed?
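To make the "split and retry" idea concrete, this is the kind of logic I have in mind (a sketch only: `$send` would wrap my `sendBatchRequest()` above and return its `success` flag, and the chunk sizes are guesses, not tested values):

```php
<?php
// Sketch: when a batch fails (e.g. the 503), halve the chunk size and
// retry that chunk, instead of aborting the whole run. $send is any
// callable that takes an array of products and returns true on success.
function sendWithSplitting(array $products, callable $send, int $chunkSize = 50, int $minChunkSize = 5): bool
{
    foreach (array_chunk($products, $chunkSize) as $chunk) {
        if ($send($chunk)) {
            continue; // this chunk went through, move on to the next one
        }
        if ($chunkSize <= $minChunkSize) {
            return false; // even the smallest chunk fails: give up
        }
        // Retry the failed chunk with half the size (recursive descent).
        if (!sendWithSplitting($chunk, $send, intdiv($chunkSize, 2), $minChunkSize)) {
            return false;
        }
    }
    return true;
}
```

In `processStockFile()` I would then call something like `sendWithSplitting($products, fn($chunk) => $this->sendBatchRequest($this->buildBatchPayload($chunk))['success'])`, where `buildBatchPayload()` is a hypothetical helper doing the `['update' => [...]]` mapping shown above. Is this a reasonable direction, or is there a better pattern for this?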