🤖 Robots.txt Generator
Create a customized robots.txt file for your website.
1. Global Settings 🌐
Default - All Robots Are
Controls global access for all search crawlers (see the example below).
Allowed
Disallowed
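As a rough sketch of how this choice is commonly expressed in a robots.txt file (the generator's exact output may differ), "Allowed" leaves the global rule empty, while "Disallowed" blocks the whole site:

Allowed:
User-agent: *
Disallow:

Disallowed:
User-agent: *
Disallow: /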
Crawl Delay ⏱️
Sets the delay between successive bot requests to the server (example below).
Default - No Delay
5 seconds
10 seconds
20 seconds
60 seconds
120 seconds
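A 10-second delay is typically written with the Crawl-delay directive, a non-standard rule that some crawlers honor and others (notably Googlebot) ignore:

User-agent: *
Crawl-delay: 10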
2. Sitemap Settings 🗺️
Sitemap URL
Enter the absolute URL of your XML sitemap (example below).
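In the generated file, the sitemap location appears as a Sitemap directive with a full URL (example.com is a placeholder):

Sitemap: https://www.example.com/sitemap.xml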
3. Search Robots 🔍
Set permissions for individual search engine crawlers. Each robot below can be set to Default, Allow, or Disallow (see the example after this list).
Select All
Google
Google Image
Google Mobile
MSN Search
Yahoo
Yahoo MM
Yahoo Blogs
Ask/Teoma
GigaBlast
DMOZ Checker
Nutch
Alexa/Wayback
Baidu
Naver
MSN PicSearch
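As an illustration of how a per-robot setting translates into directives, disallowing one crawler while leaving the global rule open produces a separate user-agent group. Googlebot is Google's well-known token; the tokens for the other robots in the list differ per crawler:

User-agent: *
Disallow:

User-agent: Googlebot
Disallow: /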
4. Restricted Directories 🚫
Add directory paths you want to disallow. All paths must be relative to the domain root and end with a trailing slash (/). An example is shown below.
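For example, restricting hypothetical /cgi-bin/ and /private/ directories would add Disallow lines to the generated file:

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/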
Generate Robots.txt
Reset
Download Robots.txt
📄 Your Generated Robots.txt
Copy
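Putting the steps together, a file generated with all robots allowed, a 10-second crawl delay, a sitemap, and one restricted directory might look like this (the URL and path are placeholders):

User-agent: *
Crawl-delay: 10
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml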