Robots.txt: Disallow subdirectory but allow directory

https://www.devze.com 2023-02-18 02:03 Source: web

I want to allow crawling of files in:

/directory/

but not crawling of files in:

/directory/subdirectory/

Is the correct robots.txt instruction:

User-agent: *
Disallow: /subdirectory/

I'm afraid that if I disallowed /directory/subdirectory/, I would be disallowing crawling of all files in /directory/, which I do not want to do. So am I correct in using:

User-agent: *
Disallow: /subdirectory/


You're overthinking it:

User-agent: *
Disallow: /directory/subdirectory/

is correct.
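A quick way to sanity-check this is Python's standard-library `urllib.robotparser`, which applies the same longest-path prefix matching crawlers use. The domain and file names below are placeholders for illustration:

```python
from urllib import robotparser

# The rule from the answer above: block only the subdirectory.
robots_txt = [
    "User-agent: *",
    "Disallow: /directory/subdirectory/",
]

rp = robotparser.RobotFileParser()
rp.parse(robots_txt)

# Files directly under /directory/ remain crawlable...
print(rp.can_fetch("*", "https://example.com/directory/page.html"))
# ...while anything under /directory/subdirectory/ is blocked.
print(rp.can_fetch("*", "https://example.com/directory/subdirectory/page.html"))
```

Because `Disallow` rules are path prefixes, `/directory/subdirectory/` matches only URLs that start with that full path; it never touches `/directory/` itself or its other files.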


User-agent: *
Disallow: /directory/subdirectory/

Spiders aren't stupid, they can parse a path :)

