How Do I Make a Web Crawling Application User-Friendly?

I'm creating a web crawling application that I want the 'average' user to be able to use. I'm concerned, though, that a web crawling application is probably just too complex for most users, because they need to:

  1. Understand URL structure (domain, path, etc.).
  2. Understand crawling 'depth'.
  3. Understand file extensions and be able to set up 'filters' to narrow their crawling and achieve better performance (or they'll be frustrated with the program).
  4. Understand where URLs are found in pages (image srcs, links, plain text URLs, etc.).

What can I do to help users get quickly acquainted with my program? Or even better, what can I do so the program is intuitive enough that users just 'get it'? I know this seems pretty broad, but if you can confine your answers to web crawlers, that should help. I've read up on general usability, UI design, etc., but I'm struggling with the domain I'm working in. Thanks.


Just because a web crawler is complex in implementation doesn't mean it has to be complex to use. Only offer what is really necessary and use sensible defaults for the rest. That will cover 80% of use cases; for the other 20%, rely on those users being more willing to gain a deeper understanding.

  1. Why should they have to understand this? It depends on the expected usage, but I would have assumed most uses are crawling a full website, so only the domain is needed.
  2. Gert G's suggestion of a slider with an expanding folder structure is a good one. It doesn't have to be dynamic with the site in question, just an illustration of what depth means.
  3. Forget exposing file extensions; instead offer common file types with icons, possibly even grouping them (e.g. all common image types, jpg, png, gif, go into one 'Images' type). Only expose raw file extension settings under an advanced configuration section; those who need it will understand it. A rough sketch of this idea follows after this list.
  4. I don't really see why they need to understand this. Surely that's a job for the crawler.
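
To make the 'sensible defaults' and friendly grouping ideas concrete, here is a minimal Python sketch. The group names, default values, and helper functions are my own illustrative assumptions, not anything from the answer: the user picks 'Images' or 'Documents' in the UI, the program expands that into raw extensions internally, stays on the starting domain by default, and finds URLs in links and image srcs on its own.

```python
# Illustrative sketch only: group names, defaults, and helpers are assumptions,
# not an existing API. BeautifulSoup (bs4) is assumed as the HTML parser.

from urllib.parse import urljoin, urlparse

from bs4 import BeautifulSoup

# Friendly groups shown in the UI; raw extensions stay hidden in "advanced" settings.
FILE_TYPE_GROUPS = {
    "Images": {".jpg", ".jpeg", ".png", ".gif"},
    "Documents": {".pdf", ".doc", ".docx", ".txt"},
    "Audio/Video": {".mp3", ".mp4", ".avi"},
}

# Sensible defaults: stay on the starting domain, modest depth, images selected.
DEFAULTS = {
    "max_depth": 3,
    "same_domain_only": True,
    "selected_groups": {"Images"},
}

def allowed_extensions(selected_groups):
    """Expand the user's friendly group choices into raw file extensions."""
    exts = set()
    for group in selected_groups:
        exts |= FILE_TYPE_GROUPS.get(group, set())
    return exts

def extract_urls(page_url, html):
    """Point 4: finding URLs (links, image srcs) is the crawler's job, not the user's."""
    soup = BeautifulSoup(html, "html.parser")
    urls = set()
    for tag in soup.find_all("a", href=True):
        urls.add(urljoin(page_url, tag["href"]))
    for tag in soup.find_all("img", src=True):
        urls.add(urljoin(page_url, tag["src"]))
    return urls

def should_follow(url, start_url, settings=DEFAULTS):
    """Apply the defaults so the user never has to think about URL structure."""
    if settings["same_domain_only"]:
        return urlparse(url).netloc == urlparse(start_url).netloc
    return True
```

The point of the sketch is that every item on the question's list ends up either defaulted or hidden behind a friendlier label, with the raw settings kept for the advanced section.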


Some ideas:

  • Make an interactive user interface (e.g. a slider for depth, which shows a small picture of folders and subfolders opening as the user moves the slider; a rough sketch follows below)
  • Avoid clutter. Divide the settings into logical tabs.
  • Make video tutorials for the things you need to teach them.
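
To illustrate the depth-slider idea from the first bullet, here is a rough, standard-library-only sketch using tkinter. The widget layout, labels, and the little folder illustration are all assumptions; the only point is that 'depth' can be shown visually without knowing anything about the target site.

```python
# Illustrative sketch of the depth slider: moving the slider redraws a tiny
# static folder illustration so the user can see what "depth" means.

import tkinter as tk

def draw_depth(depth):
    """Render a simple nested-folder illustration for the chosen depth."""
    lines = ["site.com/"]
    for level in range(1, int(depth) + 1):
        lines.append("    " * level + "└─ page/")
    illustration.config(text="\n".join(lines))

root = tk.Tk()
root.title("Crawl depth")

slider = tk.Scale(root, from_=1, to=6, orient="horizontal",
                  label="How deep should we follow links?",
                  command=draw_depth)
slider.pack(fill="x", padx=10, pady=5)

illustration = tk.Label(root, justify="left", font=("Courier", 10))
illustration.pack(anchor="w", padx=10, pady=5)

draw_depth(slider.get())
root.mainloop()
```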


Perhaps you could have a picture of "the web" showing two or three pages each from two or three websites. As the user selects where to find URLs (for example, image srcs, plain text, links, etc.), the parts of the page they selected would be highlighted in the images.
