Blank page with no error message

I am getting a blank page from a script that pulls a large number of records from the database. I’m 99% sure the problem is insufficient memory - if I increase the memory_limit, the page works.

However, I don’t understand why no error is printed and I just get a blank page when memory_limit is left at the default. The blank page appears immediately - the script doesn’t spend any time running before it dies.

In my php.ini I have:

log_errors = off
;error_log = /home/djeyewater/logs/php-errors.log
display_errors = on
display_startup_errors = on
...
error_reporting  =  E_STRICT|E_ALL

In my server config I have

fastcgi_intercept_errors on;

So fatal PHP errors normally end up in the server logs, and other errors are printed to the screen.

With a smaller LIMIT in the query I get a 501 error in the browser and the following in the server log:

2011/11/05 14:26:58 [error] 980#0: *182 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 127.0.0.1, server: www.photosite.com, request: "GET /generate-dummy-records.xhtml HTTP/1.0", upstream: "fastcgi://unix:/home/djeyewater/apps/php/var/run/php.sock:", host: "www.photosite.com"

For a larger LIMIT (or no LIMIT) I get a blank page and no error in the server log.

The page itself is below:

<?php
echo 'Hello';
include('globalFuncs.inc.php'); // sets up $conn (mysqli)

$imageData = array();
$result = $conn->query('SELECT * FROM imageData LIMIT 5000');
if ($result === false) {
    // query() returns false on failure; calling fetch_assoc() on false would itself be a fatal error
    die($conn->error);
}
while ($row = $result->fetch_assoc()) {
    $imageData[] = $row;
}
$result->close();

I have also tried adding

ini_set('display_errors',1);
error_reporting(E_ALL);

to the top of the script, but this didn’t make any difference.

Does anyone know why I’m not seeing an out of memory error but just getting a blank page instead?

My PHP version is 5.3.4.

Thanks

Dave

Perhaps you’re also getting a segmentation fault? You’ll need to check the server’s logs for that (the actual server, not the webserver).

They tend to generate white pages of doom.

Thanks for the suggestion, I checked /var/log/messages and /var/log/syslog, but no errors there.

Hmm…

Try setting error_reporting(-1); although I doubt it’ll make much difference. (grasping at straws here)

Could you install xdebug? That might give more info.
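
If I remember the xdebug 2 directives right, something like this in php.ini should dump a function trace with memory deltas for every request (the output directory is just an example):

xdebug.auto_trace = 1
xdebug.trace_output_dir = /tmp
xdebug.show_mem_delta = 1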

Well, my first response is: you’re pulling too much data at once. Why do you need that much data?

Setting error_reporting(-1) didn’t seem to do anything.

I installed xdebug and this is the function trace it gives for the page:

TRACE START [2011-11-07 14:38:37]
    0.0020     414688  +414688   -> {main}() /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:0
    0.0026     414744      +56     -> set_time_limit(600) /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:6
                                   >=> TRUE
    0.0028     414824      +80     -> error_reporting(-1) /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:7
                                   >=> 32767
    0.0111     450460   +35636     -> include(/home/djeyewater/SSI/photosite/globalFuncs.inc.php) /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:9
    0.0112     450592     +132       -> define('HOME_DIR', '/home/djeyewater') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:12
                                     >=> TRUE
    0.0113     450612      +20       -> define('DOMAIN', 'photosite.com') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:14
                                     >=> TRUE
    0.0115     450660      +48       -> define('STATIC1', 'http://static1.photosite.com') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:16
                                     >=> TRUE
    0.0116     450692      +32       -> define('STATIC2', 'http://static2.photosite.com') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:18
                                     >=> TRUE
    0.0117     450728      +36       -> define('WWW', 'http://www.photosite.com') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:20
                                     >=> TRUE
    0.0121     452208    +1480       -> mysqli->mysqli('localhost', 'blah', 'de', 'blah') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:30
                                     >=> NULL
    0.0249     457740    +5532       -> mysqli_connect_errno() /home/djeyewater/SSI/photosite/globalFuncs.inc.php:31
                                     >=> 0
    0.0251     457784      +44       -> mysqli->set_charset('utf8') /home/djeyewater/SSI/photosite/globalFuncs.inc.php:34
                                     >=> TRUE
    0.0253     457740      -44       -> session_start() /home/djeyewater/SSI/photosite/globalFuncs.inc.php:47
                                     >=> TRUE
    0.0255     448148    -9592     -> mysqli->query('SELECT * FROM imageData') /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:13
    0.1359     559384
TRACE END   [2011-11-07 14:38:37]

That was with the memory_limit at only 1MB. If I increase it to 16MB, the script pulls some of the records off the db before dying:

...
    0.0120     448148    -9592     -> mysqli->query('SELECT * FROM imageData') /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:13
                                   >=> class mysqli_result { public $current_field = NULL; public $field_count = NULL; public $lengths = NULL; public $num_rows = NULL; public $type = NULL }

... some fetch_assoc calls and results here

    1.4520   16641844     +160     -> mysqli_result->fetch_assoc() /home/djeyewater/webapps/htdocs/photosite/generate-dummy-records.php:14
    1.5108     254496
TRACE END   [2011-11-07 14:59:17]

It seems like the script gets so far, runs out of memory, and then exits without giving an out-of-memory error. With a 256MB memory_limit the script runs correctly.
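
If it helps narrow it down, I suppose I could also log memory usage from inside the loop to see exactly where it dies - something along these lines (untested):

while ($row = $result->fetch_assoc()) {
    $imageData[] = $row;
    // memory_get_usage(true) reports the size of the memory block PHP has allocated
    if (count($imageData) % 500 === 0) {
        error_log(count($imageData) . ' rows, ' . memory_get_usage(true) . ' bytes', 4);
    }
}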

The reason I am pulling so many records is that it has been recommended I fill the database with dummy records to do some testing. I figured the best way to do this was to take the existing records and then mix and match the different values to create new records.

The query mentioned in the thread you linked is limited to 30 records. Again, why do you need 5000?

In that thread it was recommended that my database should have at least 2 million records. Personally, I think it is unlikely that having 2 million records will have much effect on the results I was getting, but I won’t know if I don’t try. I am pulling all the current records from the main table so I can mix up their values to create the 2 million new records.

However, we need to be careful not to get too off-topic in this thread - the question is not how to pull a lot of records without running out of memory, but rather why PHP gives no error when it does run out of memory.

Well, the solution to the ‘problem’ is not to run out of memory.
I can crash my webpage by requesting massive amounts of data, sure… but if I’m never going to do that in a real application of the data model, what exactly have I proven?
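
If you really do need to walk every row, at least avoid buffering the whole result set into a PHP array. An untested sketch - process_row() here is a stand-in for whatever you actually do with each record:

// MYSQLI_USE_RESULT streams rows from the server instead of buffering the whole set
$result = $conn->query('SELECT * FROM imageData', MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    process_row($row); // hypothetical handler - the point is not to accumulate rows
}
$result->close();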

As far as your error not showing up: configure your server to log errors and see if it catches the error, if there is one. You may also need to check the web server’s logs for a cause (a timeout, perhaps?).
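
On the PHP side that would just mean flipping the directives from your first post:

log_errors = On
error_log = /home/djeyewater/logs/php-errors.log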

The server usually has no error message in its logs, though depending on the LIMIT set in the query or the memory_limit value, it might generate a 501 error, as per my first post.

I usually use the following, which puts uncaught PHP exceptions, along with a stack trace, into the error log:

function exception_handler($exception) {
    // Casting the exception to a string includes the message and stack trace
    $msg = 'PHP Uncaught exception: ' . $exception . "\n";
    error_log($msg, 4); // message type 4 = send to the SAPI logging handler
    header('HTTP/1.0 500 Internal Server Error', true, 500);
    echo '500 Internal Server Error';
}
set_exception_handler('exception_handler');

But it doesn’t log anything for this out of memory error.
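
From what I’ve read, running out of memory raises a fatal error rather than an exception, so the exception handler would never fire for it anyway. I guess the closest equivalent would be a shutdown function that checks error_get_last() - something like this (untested):

register_shutdown_function(function () {
    // Fatal errors skip set_exception_handler(), but a shutdown function still runs
    $e = error_get_last();
    if ($e !== null && ($e['type'] & (E_ERROR | E_PARSE | E_CORE_ERROR | E_COMPILE_ERROR))) {
        error_log('PHP Fatal: ' . $e['message'] . ' in ' . $e['file'] . ':' . $e['line'], 4);
    }
});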

You can see my php.ini error directives in my first post. I’m not sure if there are any error directives I’m missing, or have configured incorrectly, that would cause the out-of-memory error to be hidden?