Without divulging TOO much information, I need to set up a web server system that is intended to be used by end users all over the internet.
The use case is as follows:
- End users are (usually) in their homes, behind their local firewalls, when connecting to the system.
- The system consists of a remote server hosted by us, reachable strictly over HTTPS (using SSL/TLS)
- The authorization mechanism requires user-account self-creation on the remote server which, upon successful account creation, then requires a piece of software to be downloaded and installed on the end user's computer. This software contains, among other things, a local web server.
- This "local" web server must likewise only allow HTTPS connections from the user's browser.
Since the distributed software will be a unique web server on every individual user's machine, I'm unsure how, or even whether, it is possible to get a THIRD PARTY SIGNED SSL certificate that won't cause trustworthiness errors when the user connects to it via the web browser. It could of course use self-signed SSL certificates, but the idea is to avoid the browser warnings so that end users will implicitly "trust" data coming from their own application running its web server over SSL.
Is this possible?
localhost
You will never be issued a publicly trusted HTTPS cert for localhost. It is strictly forbidden: the CA/Browser Forum Baseline Requirements bar public CAs from issuing certificates for internal names such as localhost.

In short:
- Misconfigured devices actually exist, in the wild, that perform DNS lookups for localhost instead of resolving it from /etc/hosts
- If a router defines localhost.foo.local, it may cause localhost to resolve incorrectly (you've probably seen this class of error before)
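You can check which path your own machine takes when resolving localhost; on a correctly configured system the answer comes from /etc/hosts, not DNS:

```shell
# Check how this machine resolves localhost.
# On a sane system the result comes from /etc/hosts (127.0.0.1 and/or ::1),
# with no DNS lookup involved.
getent hosts localhost
```

If this prints anything other than a loopback address, you are looking at the class of misconfiguration described above.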
You can create your own root CA certificate and then create a certificate for localhost signed by that root CA. Unless the root is installed in the browser's trust store, you'll still get the ugly warning screen, but it'll work.
- See https://coolaj86.com/articles/how-to-create-a-csr-for-https-tls-ssl-rsa-pems/
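A minimal sketch of that process using the openssl CLI (file names, subject fields, and validity periods are illustrative):

```shell
# 1. Create a private root CA key and a self-signed root certificate.
openssl genrsa -out rootCA.key 2048
openssl req -x509 -new -key rootCA.key -sha256 -days 825 \
  -subj "/CN=Example Dev Root CA" -out rootCA.pem

# 2. Create a key and a certificate signing request for localhost.
openssl genrsa -out localhost.key 2048
openssl req -new -key localhost.key -subj "/CN=localhost" -out localhost.csr

# 3. Sign the CSR with the root CA, adding subjectAltName entries
#    (modern browsers ignore the CN and require a SAN).
printf "subjectAltName=DNS:localhost,IP:127.0.0.1\n" > san.ext
openssl x509 -req -in localhost.csr -CA rootCA.pem -CAkey rootCA.key \
  -CAcreateserial -days 398 -sha256 -extfile san.ext -out localhost.crt
```

Install rootCA.pem into the OS or browser trust store and the localhost.crt certificate will then validate cleanly.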
2023 Update: caddy
The solutions below still apply, but caddy can make them even easier.
It can create root certificates that plug into your OS keychain, and there's a DuckDNS plugin so you could have public certs on a public or private IP via DNS validation, etc.
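For example, a minimal Caddyfile (a sketch, assuming Caddy v2 is installed) that serves localhost with an automatically provisioned, locally trusted certificate:

```
localhost {
    respond "Hello over HTTPS"
}
```

Running `caddy run` with this file generates a local root CA, asks the OS to trust it, and issues the localhost certificate, with no manual openssl work.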
localhost.YOURSITE.com (points to 127.0.0.1)
In lieu of actual localhost certs, I do what Eugene suggests: create a 127.0.0.1 record on a public domain.
You can get free HTTPS certificates for localhost.YOURSITE.com from Let's Encrypt via https://greenlock.domains. Just choose the DNS option instead of the HTTP file-upload option.
Point your localhost.MY-SLD.MY-TLD to 127.0.0.1
- Purchase a *.localhost.example.com cert and issue each installation a secretxyz.localhost.example.com certificate (and include the domain in the Public Suffix List to prevent attacks on example.com)
- Use a greenlock-enabled app to generate such certificates on the fly (through https://letsencrypt.org), directly on the client (or pass them to the client)
If you do not get included in the PSL, note that:
- sessions, localStorage, IndexedDB, etc. are shared by domain
- changing the port does not change their sharedness
Be Your Own Root Certificate
Update: with things like greenlock that use ACME / Let's Encrypt, this is no longer particularly relevant.
This is probably a really bad idea, because we don't want users becoming accustomed to installing root CAs willy-nilly (and we know how that turned out for Lenovo), but for corporate/cloned machines it may be a reasonable low-budget option.
I had this same requirement. The reason you have to use SSL is that just about every browser now blocks mixed content: a page loaded over HTTPS refuses to fetch HTTP resources, even when the HTTP resource is on localhost, which is silly to me.
Because of the JavaScript same-origin policy (SOP), our localhost web server serves up a JS file, and then the JS inside the web app can make calls to this localhost web server.
So we made local.example.com point to 127.0.0.1 and actually bought an SSL certificate for this hostname. We then ship the private key inside this web server which gets installed on the user's computer. Yes, we're crazy.
All of this actually works quite well. We've been running like this with a few hundred users for about six months now.
The only problem we sometimes run into is that this doesn't work right when a user is behind a proxy server. The requests are sent to the proxy, which then tries to connect to 127.0.0.1 on the proxy server itself, which obviously doesn't work. The work-around is to add an exclusion to the proxy configuration so that requests to local.example.com bypass the proxy.
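One way to express that exclusion is a proxy auto-config (PAC) file; this is a hypothetical sketch, and both host names are placeholders:

```javascript
// Hypothetical PAC sketch: send requests for the local web server's hostname
// directly, and everything else through the corporate proxy.
function FindProxyForURL(url, host) {
  if (host === "local.example.com") {
    return "DIRECT"; // bypass the proxy; this hostname resolves to 127.0.0.1
  }
  return "PROXY proxy.corp.example:8080"; // placeholder proxy address
}
```

Browsers that use the PAC file will then connect to 127.0.0.1 directly for local.example.com while still proxying everything else.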
Another scenario where it will get a little tricky is when users try to use Citrix or Terminal Services. You have to make sure the web server for each user is running on a different port and then inform your remote web server of the port number so that pages generated on the server will have the right port number. Fortunately we haven't run into this yet. It also seems like more people are using virtual machines these days instead of Citrix.
Did you ever find a better way?
You can probably make use of this offering by GlobalSign (other CAs offer comparable services). In brief, the offering lets you have a CA certificate, signed by GlobalSign's certificate, with which you can enroll end-user certificates for localhost (or whatever else). The cost can be significant, though (I believe they determine it on a case-by-case basis).
Since you're on localhost, you can tell your browser to trust any certificate you want.
Make a self-signed certificate for localhost, and tell your browser to trust it.
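For instance, a single self-signed certificate can be generated in one step (a sketch; the -addext flag requires OpenSSL 1.1.1 or newer, and file names are illustrative):

```shell
# Generate a self-signed certificate for localhost, with a SAN entry
# (modern browsers require a subjectAltName and ignore the CN).
# -nodes leaves the key unencrypted so the local server can start unattended.
openssl req -x509 -newkey rsa:2048 -nodes -sha256 -days 365 \
  -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1" \
  -keyout localhost.key -out localhost.crt
```

Then add localhost.crt as a trusted certificate in the browser (or OS trust store) and point the local web server at the key/cert pair.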
The solution "Point your localhost.MY-SLD.MY-TLD to 127.0.0.1" provided by CoolAJ86 works fine, and you can see a more detailed explanation here:
How PLEX is doing https for all its users
PS: I just don't know how sustainable this is, because shipping the private key to every user effectively discloses it, and someone with a similar scenario had their certificate revoked by the CA on the grounds that the key had been compromised.