Currently I have a bit of a 'different' set-up. My main files are on server1, which simply serves the content through PHP and MySQL. But that's just the front end. In the back, on server2 (a home server), a lot of different scripts do various things that can't be done on server1, since it's a shared host and resources there are limited.
This setup works great. If server2 loses power or goes offline, the site won't be updated, but what's already there stays available, and server2 can simply catch up once it's back online. But here's the problem: all scripts on the home server are wide open for anyone to execute. Take my database-syncing script as an example:
Server1 detects that its database hasn't been synced with the one on server2, so it initiates the syncing script:
<?php
//This initiates the script on server2, which then dumps its database into a .sql file
file_get_contents('http://server2.x.com/backup/backup_mysql.php');
//This reads out and saves said database file locally for processing
$myresult2 = file_get_contents('http://server2.x.com/backup/backups/db-backup.sql');
file_put_contents_atomic("backups/db-backup.sql", $myresult2);
//This will delete the backup file from server2
$deleteurl = 'http://server2.x.com/backup/backup_mysql.php?delete=true';
$myresult3 = file_get_contents($deleteurl);
//This initiates bigdump for processing the sql file
include_once('bigdump.php');
?>
As you can see, this opens up some obvious security flaws. *backup_mysql.php* can be called by anyone who knows server2's address, and even once that's fixed, someone who monitors the /backup folder can retrieve the SQL backup before my script deletes it again.
How do I prevent all this from happening?
You can use Apache2 webserver directives to deny access to certain locations from all IPs except your own.
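A minimal sketch of what that could look like on Apache 2.4 (203.0.113.10 is a placeholder for server1's IP, and the directory path is assumed):

# In server2's vhost config (or in /backup/.htaccess if AllowOverride permits it)
<Directory "/var/www/backup">
    # Only server1 may reach the backup scripts; everyone else gets 403
    Require ip 203.0.113.10
</Directory>

On Apache 2.2 the equivalent is Order deny,allow / Deny from all / Allow from 203.0.113.10.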
You should look into using HTTP authentication to prevent access to everything published by server2. This way you'd be able to lock everyone else out of server2 with minimal hassle.
With HTTP auth in place, your file_get_contents calls would need to change to include the credentials, for example:
file_get_contents('http://user:pass@server2.example.com/data.php');
If you are worried that someone might sniff the credentials from the network, then you can also move to HTTPS. Since both the server and its only user are your own applications, you can create the certificates yourself and make your scripts accept them as valid.
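Here's a rough sketch of how server1's fetch could look with both in place (the certificate path and credentials are placeholders; this assumes a copy of your self-signed certificate lives on server1):

<?php
// Trust only our own self-signed certificate instead of the public CA bundle
$context = stream_context_create(array(
    'ssl' => array(
        'cafile'      => '/path/to/server2-selfsigned.pem',
        'verify_peer' => true,
    ),
));
// HTTP Basic auth credentials go straight into the URL
$myresult2 = file_get_contents('https://user:pass@server2.example.com/backup/backups/db-backup.sql', false, $context);
?>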
You can protect your files or folders through .htaccess:
Password protecting your pages with htaccess
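In short, it comes down to something like this (a sketch; the .htpasswd path and user name are placeholders):

# /backup/.htaccess on server2
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/user/.htpasswd
Require valid-user

# Create the password file once on server2 with:
#   htpasswd -c /home/user/.htpasswd server1user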
Here's another solution besides .htaccess:
server1/index.php
server1/config.php
server2/index.php
Let's say you don't want people accessing config.php directly: use the define function. Define a unique constant in index.php and check that it's defined in config.php before setting any variables or methods.
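A minimal sketch of that guard, using the layout above (the constant name IN_APP is an arbitrary choice):

<?php
// server1/index.php
define('IN_APP', true); // mark that the request came through the front controller
require 'config.php';
?>

<?php
// server1/config.php
// Bail out if this file is requested directly instead of being included
if (!defined('IN_APP')) {
    die('Direct access not permitted');
}
$db_host = 'localhost'; // ...rest of the configuration
?>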