bsteo February 2016

PDO Sqlite `fetchAll()` exhausting all memory

I have the following PHP script that uses PDO to connect to my SQLite3 database and count all the records:

<?php

ini_set('max_execution_time', 300);
error_reporting(E_ALL);
ini_set("display_errors", 1);

$db = new PDO('sqlite:db/MYDB.sl3');
$result = $db->query('SELECT * FROM MYTABLE');
$rows = $result->fetchAll();
$row_count = count($rows);

echo "Rows: " . $row_count . "\n";

?>

All is well if I run it from the shell (PHP CLI):

Rows: 175412

but when I run it from the web it throws the following error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 14 bytes) in /var/www/test/sqlitepdo.php on line 9

Raising PHP's memory limit is not a solution, because this should not even happen: the same query returning the same results against MySQL, for example, does not run out of memory.

Any idea?

Answers


Hanky Panky February 2016

So you are creating an array with 175K rows just for the sake of counting them. That is bound to cause memory issues, and even if you manage to work around them, it is pointless.

For perspective, 134217728 bytes is 128 MB, the default PHP memory_limit. So you are reading more than 128 MB worth of data on every call of that query, which is a problem in itself even if you manage to suppress the error.

You can simply change the query to ask SQLite for the row count:

SELECT count(*) as c FROM MYTABLE

Then you get back a single row containing the count; use that:

$result = $db->query('SELECT count(*) as c FROM MYTABLE');
$row = $result->fetch(PDO::FETCH_ASSOC);
echo "Rows: " . $row["c"] . "\n";

As for why it does not generate an error in CLI mode: the CLI typically loads a different php.ini, with a higher (often unlimited) memory_limit than the one used by your web server's PHP.
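
A quick way to compare the two limits (this snippet is just an illustration, not part of the original post; save it under any name, say memlimit.php):

<?php

// Prints which SAPI is running the script and its effective memory_limit.
// Run it once from the shell (php memlimit.php) and once through the
// web server; the two values will usually differ.
echo php_sapi_name() . " memory_limit: " . ini_get('memory_limit') . "\n";

?>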

Post Status

Asked in February 2016
Viewed 3,246 times
Voted 4
Answered 1 time
