I want to get the filename, file size, and last modified time of each file on an FTP server, then populate a ListView with them.
It worked really well until I switched FTP hosts, and now it's really sluggish, despite the new host being just as fast in real FTP clients.
Any apparent reason as to why?
Also, is it really necessary to send the login credentials each time?
I'm using the first method below to get a string array, then iterating through it and calling the second method on each item to get the file size:
public static string[] GetFileList()
{
    StringBuilder result = new StringBuilder();
    FtpWebRequest request;
    try
    {
        request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/"));
        request.UseBinary = true;
        request.Credentials = new NetworkCredential(settings.Username, settings.Password);
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        WebResponse response = request.GetResponse();
        StreamReader reader = new StreamReader(response.GetResponseStream());
        string line = reader.ReadLine();
        while (line != null)
        {
            result.Append(line);
            result.Append("\n");
            line = reader.ReadLine();
        }
        // remove the trailing '\n'
        result.Remove(result.ToString().LastIndexOf('\n'), 1);
        reader.Close();
        response.Close();
        return result.ToString().Split('\n');
    }
    catch (Exception ex)
    {
        System.Windows.Forms.MessageBox.Show(ex.Message);
        return null;
    }
}
public static int GetFileSize(string file)
{
    //MessageBox.Show("getting filesize...");
    FtpWebRequest request;
    try
    {
        request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/" + file));
        request.UseBinary = true;
        request.Credentials = new NetworkCredential(settings.Username, settings.Password);
        request.Method = WebRequestMethods.Ftp.GetFileSize;
        int dataLength = (int)request.GetResponse().ContentLength;
        return dataLength;
    }
    catch (Exception ex)
    {
        //System.Windows.Forms.MessageBox.Show(ex.Message);
        return 1337; // sentinel value returned on failure
    }
}
The problem is that each GetFileSize call has to reconnect to the server and issue a request for the file size. If you can set things up to use a single, persistent connection then you'll save connection time, but will still spend a lot of time sending a request for each file and waiting for a response.
(Edit: this may already be the case. MSDN says: Multiple FtpWebRequests reuse existing connections, if possible.)
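If you want to make reuse more likely, here is a minimal sketch (the connection group name is made up; the host and credentials are the question's placeholders):
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/"));
request.Credentials = new NetworkCredential(settings.Username, settings.Password);
request.Method = WebRequestMethods.Ftp.ListDirectory;
// KeepAlive is true by default; leaving it on keeps the control connection
// open so later requests can skip the reconnect/login handshake.
request.KeepAlive = true;
// Requests sharing a ConnectionGroupName (and credentials) are eligible
// to reuse the same pooled connection.
request.ConnectionGroupName = "ftp-session"; // made-up name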
If you use ListDirectoryDetails rather than ListDirectory, then the server will probably send down more information (file size, permissions, etc) along with each file name. This wouldn't take any longer than just doing ListDirectory, and you could pull the name and size out of each line and store the sizes for later.
However, different servers may send down the information in different formats, and some may not send the size info at all, so this may not help if you need your program to reliably use any FTP server.
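A rough sketch of that approach, assuming a Unix-style "ls -l" listing (the column indices are placeholders and will differ on other servers):
public static Dictionary<string, long> GetFileSizes()
{
    // One ListDirectoryDetails request instead of one GetFileSize request
    // per file. Parsing below assumes a Unix-style listing.
    var sizes = new Dictionary<string, long>();
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/"));
    request.Credentials = new NetworkCredential(settings.Username, settings.Password);
    request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    using (WebResponse response = request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            // Typical line: "-rw-r--r-- 1 owner group 12345 Jan 01 12:00 file.txt"
            string[] parts = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
            long size;
            if (parts.Length >= 9 && long.TryParse(parts[4], out size))
                sizes[parts[8]] = size; // breaks on filenames containing spaces
        }
    }
    return sizes;
}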
Not a correct or even a good answer, but here is a PowerShell test example that shows how this works:
$request=[System.Net.FtpWebRequest]::Create('ftp://ftp.hp.com/control/SavvisLoad.whp-ftp.xml')
$request.UseBinary=$false
$request.Method=[System.Net.WebRequestMethods+Ftp]::GetDateTimestamp
$request.GetResponse()
This uses a public HP server and shows how to get the last-modified timestamp (the request method is GetDateTimestamp, not GetFileSize). ListDirectoryDetails gets the folder information.
The biggest error above is that the mode is set to binary. All directory listings and file-info requests must use text mode.
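For the asker's last-modified column, a minimal C# sketch of the same call, reusing the question's placeholder URL and settings object; in text mode, FtpWebResponse.LastModified carries the parsed timestamp:
public static DateTime GetLastModified(string file)
{
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/" + file));
    request.Credentials = new NetworkCredential(settings.Username, settings.Password);
    request.Method = WebRequestMethods.Ftp.GetDateTimestamp;
    request.UseBinary = false; // text mode, per the note above
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    {
        return response.LastModified; // parsed from the MDTM reply
    }
}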
Here is a proven and fast way to get a directory listing. It can be tested as-is with PowerShell.
[System.Net.FtpWebRequest]$request = [System.Net.WebRequest]::Create("ftp://ftp.hp.com/control")
#$request.Credentials=New-Object System.Net.NetworkCredential('Anonymous','johnjones3@msn.com')
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$request.UseBinary=$false
$response=$request.GetResponse()
$stream=$response.GetResponseStream()
$b = New-Object byte[] 1024
$read = $stream.Read($b, 0, $b.Length)   # single read; grabs up to 1 KB
$s = ''                                  # initialize so += concatenates strings
$b[0..($read - 1)] | ForEach-Object { $s += [char]$_ }
$s
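Note that this reads at most the first kilobyte of the listing in a single Read call; for larger directories, keep calling Read until it returns 0, or wrap the stream in a System.IO.StreamReader and use ReadToEnd.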