Why does exporting 92k rows of data using PHPSpreadsheet not leak memory, even with unlimited execution time and memory? [closed]

I’m working on a data export feature in Laravel and need to export a large dataset of roughly 92,000 rows. I’m using PHPSpreadsheet for the export, and I’ve explicitly set the following PHP configuration values to avoid hitting any limits during the export:

  • max_execution_time = -1 (intended to disable the execution time limit; PHP documents 0 as the “no limit” value)

  • memory_limit = 0 (intended to disable the memory limit; PHP documents -1 as the “no limit” value)
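
The same effect can also be achieved at runtime instead of in php.ini. A minimal sketch using PHP’s built-in functions (note the documented “no limit” values: 0 for execution time, -1 for memory):

```php
<?php
// Sketch: lift PHP's runtime limits before a large export.
// 0 disables the execution-time limit; '-1' disables the memory limit
// (these are the values documented in the PHP manual).
set_time_limit(0);
ini_set('memory_limit', '-1');

// Sanity check: confirm the memory setting took effect.
echo ini_get('memory_limit'), PHP_EOL;
```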

Despite this, memory usage does not climb significantly during the export, and there is no sign of a memory leak, which surprised me given the size of the dataset.
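
One way to verify this observation is to sample memory before and after the export with PHP’s built-in counters. A sketch (the `runExport()` call is a hypothetical placeholder for the actual export code):

```php
<?php
// Sketch: instrument the export to see real memory behaviour.
// memory_get_usage(true) reports bytes currently allocated from the OS;
// memory_get_peak_usage(true) reports the high-water mark for the request.
$before = memory_get_usage(true);

// runExport();  // hypothetical placeholder for the PHPSpreadsheet export

$after = memory_get_usage(true);
$peak  = memory_get_peak_usage(true);

printf(
    "before: %.1f MB, after: %.1f MB, peak: %.1f MB\n",
    $before / 1048576,
    $after / 1048576,
    $peak / 1048576
);
```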

I expected that exporting such a large number of rows without using streaming would lead to high memory usage or even cause memory leaks, but that doesn’t seem to be the case.

Here’s what I’ve tried so far:

  • I’m using PHPSpreadsheet for the export process.
  • I have set max_execution_time = -1 and memory_limit = 0 to remove any limits.

I haven’t implemented chunking or streaming anywhere in the export process.
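
For contrast, a chunked version might look like the sketch below. The model name `Order`, its columns, and the output path are assumptions; the PHPSpreadsheet calls (`getActiveSheet`, `fromArray`, the `Xlsx` writer) are the library’s standard API. Note that even with database chunking, the worksheet itself still accumulates every cell in memory:

```php
<?php
// Sketch: chunked export with Laravel + PHPSpreadsheet.
// Rows are pulled from the database 1,000 at a time, so only one
// chunk of Eloquent models is alive at once.
use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

$spreadsheet = new Spreadsheet();
$sheet = $spreadsheet->getActiveSheet();
$row = 1;

// `Order` is a hypothetical Eloquent model standing in for the real one.
\App\Models\Order::query()->chunk(1000, function ($orders) use ($sheet, &$row) {
    foreach ($orders as $order) {
        // Write one row per record starting at column A.
        $sheet->fromArray([$order->id, $order->total], null, 'A' . $row);
        $row++;
    }
});

(new Xlsx($spreadsheet))->save(storage_path('app/export.xlsx'));
```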

Can anyone explain why this is happening? Is there something in the way PHPSpreadsheet or Laravel handles large exports that prevents memory leaks in this scenario, or am I missing something?