HTTP request over TCP dropping data?

I am making a DownloadString function in order to retrieve HTML data (since the WebClient lacks quite a bit of speed =/)

Here's what I have so far...

    public static string DownloadString(string url)
    {
        TcpClient client = new TcpClient();
        client.Client.ReceiveTimeout = 5;
        string dns = UrlToDNS(url);
        byte[] buffer = new byte[51200];
        client.Client.Connect(dns, 80);
        string getVal = url.Substring(url.IndexOf(dns) + dns.Length);
        string HTTPHeader = "GET " + getVal + " HTTP/1.1\nHost: " + dns + "\nConnection: close\nUser-Agent: Pastebin API 0.1\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\nAccept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7\nCache-Control: no-cache\nAccept-Language: en;q=0.7,en-us;q=0.3\n\n";
        client.Client.Send(s2b(HTTPHeader));
        client.Client.Receive(buffer);
        return b2s(buffer);
    }

    private static string b2s(byte[] ba)
    {
        string ret = "";
        foreach (byte b in ba)
            ret += Convert.ToChar(b);
        return ret;
    }

(s2b isn't shown since the HTTP server returns OK, so the request itself is being sent correctly)

However, when I run the code (with http://www.google.com/ as a test), it seems that some of the data is dropped/not read:

HTTP/1.1 200 OK
Date: Sat, 20 Aug 2011 15:18:28 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
Set-Cookie: PREF=ID=3714446c9ffb56bf:FF=0:TM=1313853508:LM=1313853508:S=mu1XpTcwqFTwgwJM; expires=Mon, 19-Aug-2013 15:18:28 GMT; path=/; domain=.google.com
Set-Cookie: NID=50=B8YKlYj7eK84obqC5YO10AKF9jJNcQ5w4NkzidRL9of0Sc24EpbWeP-w7HVfm-eBCfE2NX2QMZAfEBpsqsgjhWqylFUIXU-bs6ObkLQbXJ59sa_daivfBLYJkQvq_WH; expires=Sun, 19-Feb-2012 15:18:28 GMT; path=/; domain=.google.com; HttpOnly
Server: gws
X-XSS-Protection: 1; mode=block
Connection: close

<!doctype html><html><head><meta http-equiv="content-type" content="text/html; charset=ISO-8859-1"><meta name="description" content="Search the world&#39;s information, including webpages, images, videos and more. Google has many special features to help you find exactly what you&#39;re looking for."><meta name="robots" content="noodp"><title>Google</title><script>window.google={kEI:"RNBPTvPcI5C_gQeywpHfBg",getEI:function(a){var b;while(a&&!(a.getAttribute&&(b=a.getAttribute("eid"))))a=a.parentNode;return b||google.kEI},kEXPI:"28936,29049,29774,30465,30542,31760",kCSI:{e

To add another complication, it drops a variable amount of data each time; I haven't gotten consistent results for how much is lost. Sometimes only a small amount is missing, and sometimes (as in the example) a larger amount.

Any ideas on what's causing it? (Or is there a better way to retrieve the source of a webpage without WebClient?)

(Also, ignore the fact that the input and output data haven't been sanitized.)


You should use WebClient.DownloadString. I very much doubt that this method is what's slow and causing your performance problems.
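
For comparison, here's roughly what the whole download looks like with WebClient (the class name is arbitrary; the URL is just the example from the question):

using System;
using System.Net;

class WebClientExample
{
    static void Main()
    {
        // WebClient handles the request headers, redirects and chunked
        // transfer encoding for you.
        using (var client = new WebClient())
        {
            string html = client.DownloadString("http://www.google.com/");
            Console.WriteLine(html);
        }
    }
}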

But if you want to reinvent the wheel, here's a cleaner approach:

using System;
using System.IO;
using System.Net.Sockets;

class Program
{
    static void Main()
    {
        using (var client = new TcpClient("www.google.com", 80))
        using (var stream = client.GetStream())
        using (var writer = new StreamWriter(stream))
        using (var reader = new StreamReader(stream))
        {
            writer.AutoFlush = true;
            // Send request headers
            writer.WriteLine("GET / HTTP/1.1");
            writer.WriteLine("Host: www.google.com:80");
            writer.WriteLine("User-Agent: Pastebin API 0.1");
            writer.WriteLine("Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
            writer.WriteLine("Accept-Charset: ISO-8859-1,UTF-8;q=0.7,*;q=0.7");
            writer.WriteLine("Cache-Control: no-cache");
            writer.WriteLine("Accept-Language: en;q=0.7,en-us;q=0.3");
            writer.WriteLine("Connection: close");
            writer.WriteLine();
            writer.WriteLine();

            // Read the response from server
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}

Obviously this code doesn't follow HTTP redirects from the server. It is very basic. Much more would be required to get all the functionality you get from WebClient.DownloadString.
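
If redirects are the concern but you still want more control than WebClient offers, HttpWebRequest is a reasonable middle ground. A rough sketch (again using the example URL from the question):

using System;
using System.IO;
using System.Net;

class HttpWebRequestExample
{
    static void Main()
    {
        // HttpWebRequest still lets you set individual headers, but it
        // follows redirects automatically (AllowAutoRedirect defaults to
        // true) and parses the status line and headers for you.
        var request = (HttpWebRequest)WebRequest.Create("http://www.google.com/");
        request.UserAgent = "Pastebin API 0.1";

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());
        }
    }
}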


Socket.Receive() only returns currently available data. If not all data from the page is available yet, it returns only part of it.

If you want to receive all the data, you need to call Receive() in a loop until it returns 0, which means the server has closed the connection and all the data has been read (you asked for "Connection: close", so the close marks the end of the response).
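
Applied to the code in the question, the receive step could look roughly like this (SocketHelper and ReceiveAll are made-up names; header/body splitting is ignored, just as in the original):

using System.Net.Sockets;
using System.Text;

static class SocketHelper
{
    public static string ReceiveAll(Socket socket)
    {
        var result = new StringBuilder();
        var buffer = new byte[8192];
        int read;

        // Keep reading until Receive returns 0, i.e. the remote end has
        // closed the connection and no more data will arrive.
        while ((read = socket.Receive(buffer)) > 0)
        {
            // ISO-8859-1 maps each byte to the same char value, which
            // matches what the original b2s loop did; the real charset
            // should come from the Content-Type header.
            result.Append(Encoding.GetEncoding("ISO-8859-1").GetString(buffer, 0, read));
        }
        return result.ToString();
    }
}

In DownloadString, the Receive/b2s pair would then be replaced by a single return SocketHelper.ReceiveAll(client.Client);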
