I am hosting a site that uses DotNetOpenAuth (OpenID) behind ISA 2006 acting as a reverse proxy. After authenticating with a provider (such as Google), the browser returns to a URL containing %253A, and the ISA HTTP filter rejects the request.
The workaround I found is: on the ISA web publishing rule, right-click > configure HTTP policy properties > uncheck "Verify Normalization". That makes it work.
- Is this a problem with ISA 2006 in general? Do other firewalls have similar problems?
- Or, is it an OpenID or DotNetOpenAuth issue?
- Is it safe to disable Normalization checking on ISA?
According to MSDN: "Web servers receive requests that are URL encoded. This means that certain characters may be replaced with a percent sign (%) followed by a particular number. For example, %20 corresponds to a space, so a request for http://myserver/My%20Dir/My%20File.htm is the same as a request for http://myserver/My Dir/My File.htm. Normalization is the process of decoding URL-encoded requests. Because the % can be URL encoded, an attacker can submit a carefully crafted request to a server that is basically double-encoded. If this occurs, Internet Information Services (IIS) may accept a request that it would otherwise reject as not valid. When you select Verify Normalization, the HTTP filter normalizes the URL two times. If the URL after the first normalization is different from the URL after the second normalization, the filter rejects the request. This prevents attacks that rely on double-encoded requests. Note that while we recommend that you use the Verify Normalization function, it may also block legitimate requests that contain a %."
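To make the quoted check concrete, here is a minimal C# sketch of the "decode twice and compare" rule the documentation describes; it is only an illustration of that behavior, not ISA's actual code:

```csharp
using System;

class NormalizationCheck
{
    // Decode the URL once, then decode the result again; if the second pass
    // changes anything, the original request was double-encoded and is rejected.
    static bool PassesNormalization(string rawUrl)
    {
        string once = Uri.UnescapeDataString(rawUrl);
        string twice = Uri.UnescapeDataString(once);
        return once == twice;
    }

    static void Main()
    {
        // "%253A" decodes to "%3A" and then to ":", so the two passes differ.
        Console.WriteLine(PassesNormalization("/return?x=%253A")); // False (rejected)
        Console.WriteLine(PassesNormalization("/return?x=%3A"));   // True  (accepted)
    }
}
```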
OpenID messages often contain double-encoded URLs in their requests. So, based on the documentation you quoted, I'd say you do need to disable "Verify Normalization" on the reverse proxy.
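For example, the relying party's return_to URL usually already carries URL-encoded values; when that whole URL is then embedded as a query parameter in the redirect to (and back from) the provider, its percent signs are encoded a second time, turning %3A into the %253A you are seeing. Here is a minimal C# illustration (the parameter names and endpoints are examples, not necessarily what DotNetOpenAuth emits):

```csharp
using System;

class OpenIdDoubleEncoding
{
    static void Main()
    {
        // A callback URL that already carries a URL-encoded value
        // (":" is encoded as "%3A", "/" as "%2F").
        string returnTo = "https://example.com/openid/callback?claimed_id=" +
                          Uri.EscapeDataString("https://www.google.com/accounts/o8/id");

        // Embedding that whole URL as a query parameter encodes it again,
        // so every "%" becomes "%25" and "%3A" becomes "%253A".
        string redirect = "https://provider.example/auth?openid.return_to=" +
                          Uri.EscapeDataString(returnTo);

        Console.WriteLine(returnTo); // ...claimed_id=https%3A%2F%2Fwww.google.com%2F...
        Console.WriteLine(redirect); // ...claimed_id%3Dhttps%253A%252F%252Fwww.google.com%252F...
    }
}
```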