Multi-server file storage and sharing causes website to hang - other websites on same server remain accessible

I'm trying to make a system where files are stored on Main Servers and, if a user chooses, these files can be copied over to Backup Servers to prevent data loss in case of an HDD crash, but it's not going as planned. When files are copied from their Main Server to a Backup Server, the entire website hangs. No one can visit the website anymore until all transfers are complete.

What's weird is that other websites, running on the same lighttpd server, remain accessible and even phpmyadmin keeps on trucking.

STEP 1:
- First, a "job" gets created in the mysql table "redundfilesjobs" for a file that needs more backup servers than it currently has.
- Then, an available backup server gets assigned to said job.
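Roughly, that first step boils down to something like this (a minimal sketch only; the exact columns of "redundfilesjobs" and the availability check on "redundservers" are assumptions, not taken from the real schema):

<?php
// Create a job for a file that still needs more backup servers.
// Column names here are assumed for illustration.
$fileid = 123; // hypothetical file ID
mysql_query("INSERT INTO redundfilesjobs (fileID, serverID) VALUES ($fileid, 0)") or die(mysql_error());
$jobid = mysql_insert_id();

// Assign an available backup server to the job
// (assumes redundservers has some kind of 'available' flag).
$result = mysql_query("SELECT id FROM redundservers WHERE available=1 LIMIT 1");
if ($row = mysql_fetch_array($result)) {
    mysql_query("UPDATE redundfilesjobs SET serverID=".$row['id']." WHERE id=$jobid") or die(mysql_error());
}
?>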

STEP 2:
- After a user approves all the jobs that have a server assigned at once, each job gets copied to "redundfiles" with the field "processed" set to 0, after which the "sendfilestoservers" function is immediately called.
- The sendfilestoservers function is built so it can accept a pre-defined job for sending, as well as send all files with the field "processed" set to 0.
- In this case, it gets called with a pre-defined job ID that needs to be processed:

sendfilestoservers($lastid);

The function:

<?php
function sendfilestoservers($specificjob){
    // Either process one specific job, or everything that hasn't been processed yet.
    if($specificjob != ''){ $result = mysql_query("SELECT * FROM redundfiles WHERE processed=0 AND id=$specificjob"); }
    else{ $result = mysql_query("SELECT * FROM redundfiles WHERE processed=0"); }

    while($row = mysql_fetch_array($result))
    {
        // Look up the file's name, subfolder and the main server it lives on.
        $fileid = $row['fileID'];
        $result2 = mysql_query("SELECT files.name, files.subfolder, mainservers.path, mainservers.ftpaddress FROM files LEFT JOIN mainservers ON files.serverID = mainservers.id WHERE files.id=$fileid");
        while($row2 = mysql_fetch_array($result2))
        {
            // The backup server assigned to this job (redundfiles.serverID).
            $serverid = $row['serverID'];
            $result3 = mysql_query("SELECT * FROM redundservers WHERE id=$serverid");
            while($row3 = mysql_fetch_array($result3))
            {
                // Synchronous HTTP request to the backup server's get.php;
                // this call blocks until the backup server has finished fetching the file.
                file_get_contents($row3['sendfileaddress'].'?f='.$row2['name']."&fo=$row2[subfolder]&ftpaddress=$row2[ftpaddress]&mspath=$row2[path]");

                // Mark the job as processed.
                $sql = "UPDATE redundfiles SET processed=1 WHERE id='".$row['id']."'";
                mysql_query($sql) or die(mysql_error());
            }
        }
    }
}
?>

As you can see, it calls the assigned backup server's address (example: http://user:pass@serverX.XXX.XX/get.php) to have it receive the file, and sends along directives: the file name, the folder it's in on the main server, the FTP address of the main server, and the path on the main server where all files are stored (inside their respective subfolders). The backup server then processes all those GET values and ends up retrieving the file from the main server over FTP using wget.
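For reference, inside the innermost loop of sendfilestoservers() the request to get.php could also be assembled with http_build_query() so the file names and paths get URL-encoded (just a sketch of the same call; not what the code above currently does):

<?php
// Build the same get.php request as in sendfilestoservers(), but let
// http_build_query() URL-encode the names and paths.
$params = array(
    'f'          => $row2['name'],        // file name
    'fo'         => $row2['subfolder'],   // subfolder on the main server
    'ftpaddress' => $row2['ftpaddress'],  // main server's FTP address
    'mspath'     => $row2['path'],        // storage path on the main server
);
$url = $row3['sendfileaddress'].'?'.http_build_query($params);
file_get_contents($url); // still a blocking call, same as before
?>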

Config on backup server X

<?php
$filename = $_GET['f']; //Filename
$subfolder = $_GET['fo']; //Folder file is stored in
$mainserverID = $_GET['ms']; //Main server the file resides on
$mainserverpath = $_GET['mspath']; //Path on the main server where all the files are stored
$mainserverftpaddress = $_GET['ftpaddress']; //Main server's FTP address or IP, as it may differ from the web address.
$dlto = "/var/www/serverX.XXX.XXX/"; //This backup server's main path. This needs to be changed on every backup server.
$dltodir = "files"; //Folder files are stored in (follows $dlto)
$serverID = $_GET['s']; //Backup-servers ID as it's stored in the main servers database, some other scripts use this.

function execOutput($command) {
    $output = array($command);
    exec($command.' 2>&1', $output);
    return implode("\n", $output);
}
?>

"get.php" script on backup server X

<?php
include("config.php");

// Only fetch the file if this backup server doesn't already have it.
if (!file_exists($dlto.$dltodir.'/'.$subfolder.'/'.$filename))
{
    // Create the subfolder on this backup server and pull the file
    // from the main server over FTP (the GET values are passed to the shell as-is).
    exec('mkdir '.$dlto.$dltodir.'/'.$subfolder);
    exec('chmod 777 '.$dlto.$dltodir.'/'.$subfolder);
    exec('wget ftp://user:pass@'.$mainserverftpaddress.'/'.$mainserverpath.'/'.$subfolder.'/'.$filename.' -P '.$dlto.$dltodir.'/'.$subfolder);
}
?>
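Two small asides on that wget step (not part of the original setup, just a sketch): config.php already defines execOutput(), which could capture wget's output for debugging, and escapeshellarg() would keep odd file or folder names from breaking the shell command.

<?php
include("config.php");

// Hypothetical variant of the wget step: escape the GET-derived values before
// handing them to the shell, and keep wget's output so failures are visible.
$target = escapeshellarg($dlto.$dltodir.'/'.$subfolder);
$source = escapeshellarg('ftp://user:pass@'.$mainserverftpaddress.'/'.$mainserverpath.'/'.$subfolder.'/'.$filename);

$output = execOutput('wget '.$source.' -P '.$target);
error_log($output); // check the backup server's error log if a file never shows up
?>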

And then we go back to Step 2 for the next approved file.

The first steps (creating the job in "redundfilesjobs" and copying the job to "redundfiles") go fast and I don't notice them causing hang-ups. But for some reason the actual sending (or rather, having the backup server retrieve the file) just makes the website freeze up.

I hope I provided all the info required for everyone to understand how the system is supposed to work. Can anyone tell me why the entire website hangs and becomes inaccessible for everyone until all files are copied?


I think I may have traced the problem back to my php-fpm configuration. Although I'm still not 100% sure, it seems to be working now. Worth a try if someone experiences similar problems:

I had:

pm = dynamic
pm.max_children = 2
pm.start_servers = 1
pm.min_spare_servers = 1
pm.max_spare_servers = 1

I changed it to:

pm = dynamic
pm.max_children = 5
pm.start_servers = 2
pm.min_spare_servers = 1
pm.max_spare_servers = 3
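While testing a change like this, php-fpm's built-in status page can show whether the pool is actually maxed out (pm.status_path is a standard pool directive; how you expose that URL through lighttpd is up to your setup):

pm.status_path = /fpm-status
; request /fpm-status while reproducing the hang and compare
; "active processes" against pm.max_children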

The previous config had problems with file_get_contents() for some weird reason. I used the following script to test it:

<?php
    //phpinfo();
    if(isset($_GET["loop"])) $loop = $_GET["loop"];
    if(isset($_GET["show"])) $show = true;

    if(isset($_GET["s"]))
    {
        if($_GET["s"] == "ser") $address = "http://www.this-server.com/some_file.php";
        elseif($_GET["s"] == "com") $address = "http://www.external-server.com/some_file.php";
    }else die("s not set");

    if(!isset($loop)||!is_numeric($loop))
        echo file_get_contents($address);
    else
    {
        $count = 1;
        $success = 0;
        while(($count-1) < $loop)
        {
            $got = file_get_contents($address);
            if(!is_bool($got) && $got !== false)
            {
                if(isset($show)) echo $count." : ".$got."<br/>";
                $success++;
            }
            $count++;
        }
        echo "We had $success successful file_get_contents.<br/>";
    }
?>

If I went to script.php?s=ser and spammed F5 loads of times, the php-fpm process belonging to the current website would hang. script.php?s=com worked fine, however.

What also worked fine was going to the second server and doing it the other way around.

What also worked fine is running the loop (script.php?s=ser&loop=50&show) rather than spamming F5. That's what I find especially odd. It seems it's not the number of requests it has problems with, and also not file_get_contents() itself (because it works as long as the requested file is not on the same server, even when spamming F5).

Conclusion;

With the old config, spamming F5 on a script containing a file_get_contents() call to a file on the same server causes the php-fpm process to hang.

I'm kinda baffled by this...
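In case it helps anyone else: if the hang really is the pool running out of children while workers sit waiting on requests to the same server, a read timeout won't fix the root cause, but it does stop a single stuck request from tying up a worker forever. A minimal sketch (the 10-second value is an assumption; tune it to your transfer sizes):

<?php
// Same kind of request as in the test script, but with an explicit read timeout
// so the php-fpm worker doesn't block indefinitely if nothing answers.
$context = stream_context_create(array(
    'http' => array(
        'timeout' => 10, // seconds; assumed value
    ),
));

$got = file_get_contents($address, false, $context);
if ($got === false) {
    // Timed out or failed; log it instead of hanging the whole site.
    error_log("file_get_contents to $address failed or timed out");
}
?>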
