I'm building a C# application using .NET 3.5 that will require logging in to a website, selecting some options, and then proceeding to another page that contains a link to an Excel file. I want to know how I can proceed with these steps.
1) Load the web page (either hidden or visible in a window in the application, preferably hidden).
2) Type in the username/password combination and submit them to the website.
3) Choose the appropriate options depending on the account provided (I just need to know how I can go through check boxes and lists).
4) Proceed to the next page and load the file in the link (maybe download it to a temp location, read it, then delete it).
UPDATE:
I have an application that I use at work that does the following ...
When I run the application it logs into a server to authenticate my session. The application has a button that, when clicked, shows/hides the current activity on the page using the webpage itself.
The application loads the page, types the username and password, and submits; the server then authenticates the session.
But my application is somewhat different: it will load the webpage (home page), log in, then choose some options (I'll set those options depending on the page). Then, on the next page, there will be a link that I'll load in Excel or in my application (whichever I'm able to do), and I'll perform my operations on the result based on the user input.
A series of POSTs and GETs using HttpWebRequest and HttpWebResponse will get you through.
This is not an easy task at all, though. You will also need some tool to work out the webpage's "logic", in other words to sniff the traffic and see how you have to organize your GETs and POSTs. Use a tool like Fiddler to accomplish this.
Once you have worked out what needs to be a GET and what needs to be POSTed, you can get onto the programming part.
EDIT: The only way you can load a page is through an HTTP GET, and the only way you can log in and choose/specify options is through an HTTP POST (both are issued with HttpWebRequest; the reply comes back as an HttpWebResponse), that is, if the server doesn't expose some kind of web interface (web services or WCF, which is a completely different matter).
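As a rough illustration, here is a minimal sketch of that GET-then-POST flow in .NET 3.5, sharing a CookieContainer so the session cookie survives between requests. The URL and the form field names ("username", "password") are placeholders; the real ones have to come from sniffing the site with Fiddler.

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class LoginSketch
    {
        static void Main()
        {
            // One CookieContainer shared by all requests keeps the session cookie alive.
            CookieContainer cookies = new CookieContainer();

            // 1) GET the login page. Some sites only issue the session cookie (or hidden
            //    form tokens you must echo back) on this first request.
            HttpWebRequest getRequest = (HttpWebRequest)WebRequest.Create("https://example.com/login");
            getRequest.CookieContainer = cookies;
            string loginPageHtml;
            using (HttpWebResponse response = (HttpWebResponse)getRequest.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                loginPageHtml = reader.ReadToEnd();
            }

            // 2) POST the credentials as a form-encoded body.
            byte[] formData = Encoding.UTF8.GetBytes(
                "username=" + Uri.EscapeDataString("myUser") +
                "&password=" + Uri.EscapeDataString("myPassword"));

            HttpWebRequest postRequest = (HttpWebRequest)WebRequest.Create("https://example.com/login");
            postRequest.Method = "POST";
            postRequest.ContentType = "application/x-www-form-urlencoded";
            postRequest.ContentLength = formData.Length;
            postRequest.CookieContainer = cookies;   // reuse the same cookies
            postRequest.AllowAutoRedirect = true;    // follow the post-login redirect

            using (Stream body = postRequest.GetRequestStream())
            {
                body.Write(formData, 0, formData.Length);
            }

            using (HttpWebResponse response = (HttpWebResponse)postRequest.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string landingPageHtml = reader.ReadToEnd();
                Console.WriteLine(landingPageHtml.Length);
            }
        }
    }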
And after all that, when you actually receive a response, you need to parse it for the data you need, using a regex or whatever suits your needs.
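For example, if the page you end up on exposes the spreadsheet through a plain link, a simple (and admittedly brittle) regex can pull it out. The pattern below just grabs the first href ending in .xls and is only a guess at the markup; adjust it to whatever the real page looks like.

    using System.Text.RegularExpressions;

    static class LinkScraper
    {
        // Returns the first href that ends in ".xls", or null if none is found.
        // The pattern is a hypothetical example; tailor it to the real HTML.
        public static string FindExcelLink(string pageHtml)
        {
            Match match = Regex.Match(
                pageHtml,
                "href\\s*=\\s*\"(?<url>[^\"]+\\.xls)\"",
                RegexOptions.IgnoreCase);
            return match.Success ? match.Groups["url"].Value : null;
        }
    }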
As far as the Excel file is concerned: if you want to load it in your application and work with it, there's a whole API from Microsoft for this purpose: http://www.microsoft.com/download/en/details.aspx?DisplayLang=en&id=5124
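One way to read the downloaded workbook from C# is the classic Excel COM interop (which requires Excel to be installed and a reference to Microsoft.Office.Interop.Excel); whether that is the exact API behind the link above I can't say, so treat this as just one possible sketch. Note that on the C# 3.0 compiler that ships with .NET 3.5 you also have to pass Missing.Value for the optional COM parameters omitted here.

    using System;
    using Excel = Microsoft.Office.Interop.Excel;

    class ExcelReader
    {
        static void Main()
        {
            // Hypothetical path to the workbook downloaded earlier.
            string path = @"C:\Temp\report.xls";

            Excel.Application app = new Excel.Application();
            app.Visible = false;
            Excel.Workbook workbook = app.Workbooks.Open(path);
            try
            {
                Excel.Worksheet sheet = (Excel.Worksheet)workbook.Worksheets[1];
                Excel.Range cell = (Excel.Range)sheet.Cells[1, 1];
                Console.WriteLine(cell.Value2);   // read a single cell as a quick check
            }
            finally
            {
                workbook.Close(false);   // close without saving
                app.Quit();
            }
        }
    }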
This is not a trivial task. This is called web scraping.
Ideally a webpage should have an alternative API for machine processing: RPC, SOAP, REST, or the like.
If you absolutely must do this, then HttpWebRequest and HttpWebResponse are your friends. For HTML parsing you are on your own; choose whatever tool is most appropriate (a regex, a string parser, or something more robust).
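As one example of "something more robust", the third-party HtmlAgilityPack library parses the HTML into a DOM you can query with XPath, which tends to survive markup changes better than a regex. This is only a suggestion, not something the answer above prescribes, and the "Excel" link-text check is a made-up criterion.

    using HtmlAgilityPack;   // third-party library; add the assembly reference yourself

    static class PageParser
    {
        // Returns the href of the first <a> whose text mentions "Excel", or null.
        public static string FindExcelLink(string pageHtml)
        {
            HtmlDocument doc = new HtmlDocument();
            doc.LoadHtml(pageHtml);

            HtmlNodeCollection anchors = doc.DocumentNode.SelectNodes("//a[@href]");
            if (anchors == null)
                return null;

            foreach (HtmlNode anchor in anchors)
            {
                if (anchor.InnerText.Contains("Excel"))
                    return anchor.GetAttributeValue("href", null);
            }
            return null;
        }
    }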
However this will be a pain to maintain.
If I'm reading this correctly, you need a C# WinForms app to manipulate a website?
If so, your best way forward is simply to issue a number of HTTP requests and use your app to capture and parse the responses. You have everything you need to do this in the System.Net namespace, specifically HttpWebRequest and HttpWebResponse.
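For instance, once you have the authenticated session (a CookieContainer from the login request) and the URL of the linked spreadsheet, downloading it to a temp file is just one more request; the helper below is a sketch under those assumptions.

    using System.IO;
    using System.Net;

    static class FileDownloader
    {
        // Downloads fileUrl to a temp path, reusing the cookies that hold the
        // authenticated session, and returns the path of the saved file.
        public static string DownloadToTemp(string fileUrl, CookieContainer cookies)
        {
            string tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".xls");

            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(fileUrl);
            request.CookieContainer = cookies;

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (Stream input = response.GetResponseStream())
            using (FileStream output = File.Create(tempPath))
            {
                byte[] buffer = new byte[8192];
                int bytesRead;
                while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
                {
                    output.Write(buffer, 0, bytesRead);
                }
            }
            return tempPath;
        }
    }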
I can't really go into more detail unless you ask your question more specifically; however, it would be best if you had an alternate way to interface with the data you require (think web services, JSON/XML).