I'm trying to test some software that analyzes web requests from browsers and other sources. Is there a tool that will rapidly and repeatedly make requests to various URLs? These URLs could be random, taken from a list, or discovered on the pages that are requested; it makes little difference to me.
Is there such a tool? Perhaps a browser plugin? Or should I just write something myself?
Try cURL
curl is a tool to transfer data from or to a server, using one of the supported protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, TELNET and TFTP). The command is designed to work without user interaction.
Sample:
Get the main page from Netscape's web-server:
curl http://www.netscape.com/
More in the manual.
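Since curl is just a command-line program, you can drive it from a small shell loop to get the "rapid and repeated" behavior you describe. A minimal sketch, assuming a file named urls.txt with one URL per line (the filename and the repetition count are placeholders you'd adjust):

```shell
#!/bin/sh
# Sketch: hit every URL in urls.txt repeatedly with curl.
# Assumes urls.txt exists with one URL per line; 5 passes is arbitrary.
# -s silences the progress bar, -o /dev/null discards the response body,
# -w prints the status code, total time, and final URL for each request.
for pass in 1 2 3 4 5; do
  while read -r url; do
    curl -s -o /dev/null \
         -w "%{http_code} %{time_total}s %{url_effective}\n" "$url"
  done < urls.txt
done
```

Swap the outer `for` for `while true; do ... done` if you want it to hammer the list indefinitely, and add `sleep` or run several copies in the background to tune the request rate.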