We have just received news that Heroku's 24-hour+ outage due to a DDoS attack is finally over. I have a question about communicating with users: when a site is completely down, how can I still maintain contact with my users? I'm considering these two options:
- Users entering www.mysite.com are automatically redirected to a status page, much like status.heroku.com, which runs independently and can provide updated information and a way to talk.
- Failing #1, setting up a simple webpage elsewhere called status.mysite.com that I have to tell users about beforehand.
Is it possible to do an automatic redirect to a different website if my Heroku-based one goes down?
What services should I use to host a simple status page that are as independent from Heroku's infrastructure as possible?
Presuming you have registered your domain somewhere other than Heroku, you could simply change the main DNS entry for your domain to point to a different IP address.
You could, for example, create a very simple site on a free Amazon EC2 micro instance, and in a pinch, change the DNS for your domain to point to the simple EC2 site.
It takes a little while for that sort of DNS change to propagate (anywhere from about a minute to a couple of hours), so this is only a useful strategy for a long outage.
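While waiting for such a change to propagate, it helps to poll DNS from your own machine to see when the new record becomes visible. A minimal sketch in Python; the hostname and IP address below are placeholders, not anything from this answer:

```python
import socket

def resolves_to(hostname, expected_ip):
    """Return True if hostname currently resolves to expected_ip
    from this machine's resolver; False on mismatch or lookup failure."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:
        return False

if __name__ == "__main__":
    # After repointing DNS, poll until your local resolver sees the
    # new address (placeholder domain and EC2 IP).
    print(resolves_to("www.mysite.com", "203.0.113.10"))
```

Keep in mind this only tells you what *your* resolver sees; other ISPs' caches may hold the old record for much longer.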
Assuming www.mysite.com points directly to Heroku's servers, it is not possible to implement option 1 in a robust way. Changing your DNS entries is nowhere near quick enough to reach everyone on the internet; some ISPs might cache DNS entries for as long as 24 hours.
Option 2 is easy to implement. To stay as independent from Heroku's infrastructure as possible, I would advise against using Amazon's cloud offerings for hosting, simply because Heroku itself runs on that platform. I would suggest having a look at Google App Engine: also free for small sites, very robust, and completely independent of anything related to Heroku and/or Amazon.
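A status page on App Engine can be nothing more than a static file. A minimal sketch of an `app.yaml` for the Python runtime of that era; all file names are illustrative, not from this answer:

```yaml
# Hypothetical App Engine config serving a static status page.
runtime: python27
api_version: 1
threadsafe: true

handlers:
# Serve the status page at the root URL.
- url: /
  static_files: static/index.html
  upload: static/index.html
# Any other assets (CSS, images) live under /static.
- url: /static
  static_dir: static
```

Since everything is static, there is no application code to fail while your main site is down.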
You could hide the main server behind a firewall and use a Squid server as a reverse proxy, with a watchdog. The Squid server will be more robust against a DoS attack, and it can switch to a fallback mode if the main server goes down.
The domain is registered to point at the fast, efficient Squid server. The Squid server goes to the main (hidden) server to fetch each page, and it can also cache static pages. If the Squid server detects a DoS attack (i.e. it or the main server is congested), it can serve a static "site is down" page. The Squid server also makes the site less susceptible to a DoS attack in the first place.
It should not matter what the back-end (main server) is running, as long as it can indicate how long a page may be cached. I expect most can.
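A minimal reverse-proxy sketch of this setup in `squid.conf`; the origin IP and hostname are placeholders, and exact directives vary across Squid versions:

```conf
# Listen on port 80 as an accelerator (reverse proxy) for the site.
http_port 80 accel defaultsite=www.mysite.com

# The hidden main server behind the firewall; Squid fetches pages from it.
cache_peer 10.0.0.5 parent 80 0 no-query originserver name=main

# Only proxy requests for our own domain.
acl our_site dstdomain www.mysite.com
http_access allow our_site
http_access deny all

# Keep serving cached copies without revalidating against the origin,
# which lets cached pages survive an origin outage.
offline_mode on
```

For the "site is down" page, one approach is to customize the connect-failure error template in Squid's errors directory so visitors see a friendly status notice instead of a raw proxy error.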