
PHP (or MySQL) crash when retrieving a big database record (~5MB)

https://www.devze.com 2023-03-16 18:28 · Source: web

It doesn't display any errors, just a blank page. If I put die('test') before the call to my retrieve-row function, it runs; but when I place die('test') after the retrieve-row call, all I get is an empty page (Chrome reports: Error 324 (net::ERR_EMPTY_RESPONSE): The server closed the connection without sending any data.)

I have tried raising the memory limit (with 128M, -1, 64M, etc.):

ini_set('memory_limit', -1);

with no luck. I am using mysqli to retrieve the record, with a simple query like SELECT data FROM tblBackup (there is only one record in the table).
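For reference, here is a minimal, hypothetical reconstruction of the failing code path. The connection credentials and fetch style are assumptions (only the table and column names come from the question); enabling mysqli's exception mode can surface errors that would otherwise produce a blank page:

```php
<?php
// Hypothetical reconstruction -- credentials and fetch style are
// assumptions; tblBackup/data come from the question.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT); // throw instead of failing silently

$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$result = $db->query('SELECT data FROM tblBackup');
$row = $result->fetch_assoc(); // the crash reportedly happens around this point
echo strlen($row['data']), " bytes retrieved\n";
```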

Any help?

Thanks in advance

Update: I tailed the Apache error log, and this appears when I try to load the page:

[Thu Jun 30 13:47:37 2011] [notice] child pid 25405 exit signal Segmentation fault (11)


Check the php.ini settings for execution time. It sounds like PHP might be timing out:

max_execution_time = 3000
max_input_time = 6000

You may already have this set at the .ini level, but you can also add the following at the top of your file to surface the PHP error:

error_reporting(E_ALL);
ini_set('display_errors', '1');


What are the client and server settings for max_allowed_packet? If it's smaller than the ~5MB blob you're trying to send across, the connection will be killed.
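To check the limit this answer refers to, you can query it from PHP (the credentials below are placeholders); the value must comfortably exceed the ~5MB blob:

```php
<?php
// Sketch: inspect the server-side max_allowed_packet (placeholder credentials).
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$row = $db->query("SHOW VARIABLES LIKE 'max_allowed_packet'")->fetch_assoc();
printf("max_allowed_packet = %d bytes\n", $row['Value']);

// Raising it at runtime needs the SUPER privilege; otherwise set it in
// my.cnf ([mysqld] max_allowed_packet = 16M) and restart the server:
// $db->query('SET GLOBAL max_allowed_packet = 16 * 1024 * 1024');
```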


Well, been there. But ask yourself: is this a good way to handle the query in the first place? If you have many results, LIMIT them down, perhaps with pagination.

Yes, you can raise the limits, but in the long run, is that really the most effective approach?
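For a single oversized blob, LIMIT itself doesn't help, but the same idea can be applied by streaming the column in pieces with MySQL's SUBSTRING(). This is a sketch under assumptions: placeholder credentials, an arbitrary chunk size, and the single-row tblBackup from the question:

```php
<?php
// Sketch: fetch the blob piecewise instead of in one ~5MB packet.
// Credentials are placeholders; tblBackup/data come from the question.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
$chunk  = 1024 * 1024; // 1MB per round trip (arbitrary)
$offset = 1;           // MySQL SUBSTRING() is 1-indexed
$stmt = $db->prepare('SELECT SUBSTRING(data, ?, ?) FROM tblBackup');

while (true) {
    $stmt->bind_param('ii', $offset, $chunk);
    $stmt->execute();
    $stmt->bind_result($piece);
    $stmt->fetch();
    $stmt->free_result();
    if ($piece === null || $piece === '') {
        break; // past the end of the blob
    }
    echo $piece;       // or write to a file rather than buffering it all
    $offset += $chunk;
}
```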


While compiling PHP, my --with-pdo-mysql=[DIR] flag seemed to cause the issue. I removed the [DIR] and left it blank, and the problem went away.

