I've got a PHP script that runs MySQL database backups to .sql files, TARs/GZips them, and e-mails them to me. One of the databases is hosted by a different provider than the one providing the web server. Everything is hosted on Linux/Unix. When I run this command:
$results = exec("mysqldump -h $dbhost -u $dbuser -p$dbpass $dbname > $backupfile", $output, $retval);
(FYI, I've also tried this with system(), passthru() and shell_exec().)
My browser loads the page for 15-20 seconds and then stops without finishing. When I look at the server with an FTP client, I can see the resulting file appear a few seconds later, and the file size grows until the database is fully backed up. So the backup file is created, but the script stops before the file can be compressed and sent to me.
I've checked the max_execution_time setting in PHP and it's set to 30 seconds (longer than it takes for the page to stop working), and I've raised the limit with set_time_limit() to as much as 200 seconds.
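For what it's worth, this is roughly how I'm checking and raising the limits from inside the script (a minimal sketch; the 200-second value is just the highest I've tried):
<?php
// Show the currently effective limit (php.ini value or the host's override)
echo 'max_execution_time: ', ini_get('max_execution_time'), "\n";
// Try to raise it for this request only; some shared hosts silently ignore this call
set_time_limit(200);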
Anyone have any idea what's going on here?
Are you on shared hosting or are these your own servers? If the former, your hosting provider may have set the max execution time to 15-20 seconds and locked it so it cannot be overridden (I have this problem with 1&1 and these types of scripts).
Re-check the execution-time-related parameters with a phpinfo() call... maybe it's all about what Paolo writes.
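If phpinfo() is awkward to dig through, a quick sanity check from inside the script itself (assuming the host hasn't disabled ini_get()) would be something like:
<?php
// Compare the runtime ("local") value against the master value from php.ini;
// a locked-down host may enforce the master value regardless of overrides
echo 'local:  ', ini_get('max_execution_time'), "\n";
echo 'master: ', get_cfg_var('max_execution_time'), "\n";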
It could also be a (reverse) proxy that is giving up after a certain period of inactivity. Granted, it's a long shot, but anyway... try:
// test A: no output at all for 20 seconds
$start = time();
sleep(20);
$stop = time();
echo $start, ' ', $stop;
and
// test B: trickle some output every second so the connection never looks idle
for ($i = 0; $i < 20; $i++) {
    sleep(1);
    echo time(), "\n";
    @ob_flush(); flush(); // push the output past PHP's buffers so it actually reaches the proxy
}
If the first one times out and the second doesn't, I'd call that not proof, but evidence.
Maybe the provider has set another resource limit beyond the php.ini setting. Try
<?php passthru('ulimit -a');
If the command is available it should print a list of resources and their limits, e.g.
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 4095
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 4095
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
Maybe you'll find some more restrictive settings than that on your shared server; a low cpu time (-t) limit in particular could kill mysqldump part-way through.
- Do a manual dump and diff it against the broken one. This may tell you at which point mysqldump stops/crashes
- Consider logging mysqldump output, as in mysqldump ... 2>/tmp/dump.log
- Consider executing mysqldump detached so that control is returned to PHP before the dump is finished (see the sketch below)
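A minimal sketch of the detached approach, assuming $dbhost, $dbuser, $dbpass, $dbname and $backupfile are the same variables as in the question. Redirecting stdout/stderr and backgrounding the command with a trailing & is what lets exec() return immediately:
<?php
// Background the dump so exec() returns right away instead of waiting for it
$cmd = "mysqldump -h $dbhost -u $dbuser -p$dbpass $dbname > $backupfile 2>/tmp/dump.log &";
exec($cmd);
// ...then poll for completion later (e.g. watch the file size stop growing)
// before compressing and mailing the backup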
On a side note, it is almost always a good idea to run mysqldump with -Q (--quote-names), so identifiers are wrapped in backticks in the dump.