nphp · 12 Jul 2025 01:42

Profiling a script that hits the 512 MB memory_limit and gets killed. Sharing what I learned about PHP memory tracking.

Measuring allocations:

PHP
<?php
function memDelta(callable $fn): array
{
    gc_collect_cycles();
    $before = memory_get_usage();
    $value = $fn();          // keep the result alive while measuring
    $after = memory_get_usage();
    $peak = memory_get_peak_usage();
    unset($value);
    gc_collect_cycles();
    return [
        'delta' => $after - $before,
        'peak'  => $peak,
    ];
}

// Compare a regular array vs SplFixedArray for 100k integers.
// The closure returns its structure so the delta is measured while
// the data is still alive; unsetting it first would always report ~0.
$result = memDelta(fn () => range(1, 100000));
echo 'Regular array delta: ' . round($result['delta'] / 1024, 1) . " KB\n";

$result2 = memDelta(fn () => new SplFixedArray(100000));
echo 'SplFixedArray delta: ' . round($result2['delta'] / 1024, 1) . " KB\n";
echo 'Peak: ' . round($result2['peak'] / 1024 / 1024, 2) . " MB\n";

array_chunk for batch processing:

PHP
<?php
$ids = range(1, 50000);
$batchSize = 1000;
$processed = 0;

foreach (array_chunk($ids, $batchSize) as $batch) {
    // simulate processing
    $sum = array_sum($batch);
    $processed += count($batch);

    // force GC between batches
    unset($sum);
    gc_collect_cycles();
}
echo "Processed: {$processed}\n";
echo "Peak: " . round(memory_get_peak_usage() / 1024 / 1024, 2) . " MB\n";
Replies (7)
alex_petrov · 12 Jul 2025 01:49

SplFixedArray uses about 5x less memory than a regular array for integer-indexed data because it avoids the hash table overhead. Worth knowing for large in-memory datasets.

dmitry_kv · 12 Jul 2025 02:30

The array_chunk approach is good but loading all IDs into memory first defeats the purpose for very large sets. Better to chunk the IDs at query level: SELECT id FROM table LIMIT 1000 OFFSET n.
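A minimal PDO sketch of that query-level chunking, using keyset pagination (`WHERE id > :last`) rather than OFFSET, since large OFFSETs get slower as n grows. The `$pdo` connection and the `items` table are hypothetical, not from the thread:

```php
<?php
// Assumes an existing PDO connection in $pdo and an `items` table
// with an integer `id` primary key (illustrative names).
$lastId = 0;
$batchSize = 1000; // trusted int; interpolated to avoid the LIMIT-placeholder gotcha

do {
    $stmt = $pdo->prepare(
        "SELECT id FROM items WHERE id > :last ORDER BY id LIMIT $batchSize"
    );
    $stmt->bindValue(':last', $lastId, PDO::PARAM_INT);
    $stmt->execute();
    $ids = $stmt->fetchAll(PDO::FETCH_COLUMN);

    foreach ($ids as $id) {
        // process one row; only the current batch is ever in memory
    }

    if ($ids) {
        $lastId = (int) end($ids);
    }
} while (count($ids) === $batchSize);
```

Seeking on the primary key keeps each batch query fast regardless of how far into the table you are, which OFFSET-based paging does not.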

vova · 12 Jul 2025 03:34

gc_collect_cycles() is expensive. Calling it every batch is usually unnecessary. PHP GC runs automatically. Call it only when you have evidence of circular reference accumulation.
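One way to gather that evidence: `gc_status()` (PHP 7.3+) exposes the cycle collector's counters, so you can check whether the root buffer is actually filling up before forcing a run. The 90% heuristic below is illustrative, not a rule:

```php
<?php
// Inspect the cycle collector's state before forcing a collection.
$status = gc_status();
printf(
    "runs=%d collected=%d threshold=%d roots=%d\n",
    $status['runs'],       // how many times the collector has run
    $status['collected'],  // total cycles it has reclaimed
    $status['threshold'],  // root-buffer size that triggers an automatic run
    $status['roots']       // possible cycle roots currently buffered
);

// Only force a collection when the root buffer is nearly full
// (illustrative threshold).
if ($status['roots'] > $status['threshold'] * 0.9) {
    gc_collect_cycles();
}
```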

lukaszkrzyz · 12 Jul 2025 04:17

hm, but how do you even find WHICH variable is the problem? memory_get_usage just gives a number, not where it went

nphp · 12 Jul 2025 05:10

Xdebug memory profiler or Blackfire shows per-function allocation. For a quick manual approach: add memory_get_usage() logging at key points in the loop and look for which iteration the growth happens.
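A minimal sketch of that manual logging approach; the accumulating workload here is just a stand-in for whatever the real loop does:

```php
<?php
// Log memory_get_usage() at each iteration and look for steady growth.
$log = [];
$data = [];

for ($i = 0; $i < 5; $i++) {
    // Simulate a leaky accumulation: each pass retains another 256 KB.
    $data[] = str_repeat('x', 1024 * 256);
    $log[] = [$i, memory_get_usage()];
}

foreach ($log as [$iteration, $bytes]) {
    printf("iter %d: %.1f KB\n", $iteration, $bytes / 1024);
}
// A number that climbs every iteration points at whatever that step retains.
```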

alex_petrov · 12 Jul 2025 06:39

Doctrine EntityManager identity map is a common source of gradual growth. Call $em->clear() periodically during bulk operations or use $em->detach($entity) after processing each row.
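A sketch of that Doctrine batch pattern, assuming an existing EntityManager `$em`; the `App\Entity\Order` class and `markProcessed()` method are hypothetical placeholders:

```php
<?php
// Periodically flush and clear so the identity map doesn't grow unbounded.
$batchSize = 200;
$i = 0;

$query = $em->createQuery('SELECT o FROM App\Entity\Order o');
foreach ($query->toIterable() as $order) {   // streams rows (Doctrine ORM 2.8+)
    $order->markProcessed();                  // hypothetical domain method

    if (++$i % $batchSize === 0) {
        $em->flush();   // write pending changes
        $em->clear();   // detach all managed entities, freeing the identity map
    }
}
$em->flush();
$em->clear();
```

After `clear()`, previously loaded entities are detached, so anything still needed afterwards has to be re-fetched.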

fredbauer · 12 Jul 2025 07:08

In Eloquent: using cursor() on a query returns a LazyCollection that yields one model at a time. Much better than get() which hydrates all results at once.
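A side-by-side sketch of the two approaches; the `App\Models\User` model is hypothetical and the Laravel bootstrapping is omitted:

```php
<?php
use App\Models\User; // hypothetical Eloquent model

// get(): hydrates every matching row into a Collection at once.
$users = User::where('active', true)->get();

// cursor(): returns a LazyCollection that hydrates one model per iteration.
foreach (User::where('active', true)->cursor() as $user) {
    // process $user; only one model instance is hydrated at a time
}
```

One caveat worth knowing: `cursor()` avoids hydrating all the models, but with MySQL's default buffered driver the raw result set may still be held by PDO; disabling `PDO::MYSQL_ATTR_USE_BUFFERED_QUERY` is needed for true streaming.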
