Top "Robots.txt" questions


Django: serving robots.txt efficiently

Here is my current method of serving robots.txt url(r'^robots\.txt/$', TemplateView.as_view(template_name='robots.…

python django robots.txt
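The excerpt above is truncated, but a common way to serve robots.txt from Django is a TemplateView with an explicit plain-text content type. A minimal sketch (assumes a `robots.txt` template exists in your template directories; `path()` is the modern equivalent of the `url(r'^robots\.txt/$', …)` pattern shown in the question):

```python
# urls.py — minimal sketch for serving robots.txt via Django
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    path(
        "robots.txt",
        TemplateView.as_view(
            template_name="robots.txt",  # assumed template name
            content_type="text/plain",   # crawlers expect text/plain
        ),
    ),
]
```

For higher traffic it is usually more efficient to let the web server (nginx/Apache) serve the file as a static asset and bypass Django entirely.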
How to allow crawlers access to index.php only, using robots.txt?

If I want to allow crawlers to access only index.php, will this work? User-agent: * Disallow: / Allow: /index.php

seo web-crawler robots.txt
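Laid out as an actual robots.txt file, the rules from the question would be:

```
User-agent: *
Disallow: /
Allow: /index.php
```

A caveat: `Allow` is an extension beyond the original robots.txt convention. Major crawlers such as Googlebot apply the most specific matching rule, so this works for them; a crawler that does not support `Allow` will see only `Disallow: /` and skip the whole site, index.php included.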
How to stop search engines from crawling the whole website?

I want to stop search engines from crawling my whole website. I have a web application for members of a …

security .htaccess robots.txt
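The standard way to ask all crawlers to stay out of the entire site is a blanket disallow:

```
User-agent: *
Disallow: /
```

Note that robots.txt is purely advisory. For a members-only application, access should also be enforced server-side (authentication, or `.htaccess` rules), since ill-behaved bots can simply ignore robots.txt.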
Is it possible to list multiple user-agents in one line?

Is it possible in robots.txt to give one instruction to multiple bots without repeatedly having to mention it? Example: …

user-agent robots.txt
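You cannot list several bots on one `User-agent:` line, but you can stack multiple `User-agent` lines above a single rule group, and the group then applies to all of them. A sketch (the bot names are illustrative):

```
User-agent: Googlebot
User-agent: Bingbot
Disallow: /private/
```

This avoids repeating the same `Disallow` block once per bot.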
robots.txt allow root only, disallow everything else?

I can't seem to get this to work but it seems really basic. I want the domain root to be …

robots.txt
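One commonly suggested sketch for "root page only" relies on the `$` end-of-URL anchor:

```
User-agent: *
Allow: /$
Disallow: /
```

Hedge: `$` and `Allow` are extensions supported by Google and Bing, not part of the original robots.txt convention, so crawlers without that support may interpret this as a full disallow.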
Where to put robots.txt file?

Where should I put robots.txt? domainname.com/robots.txt or domainname/public_html/robots.txt? I placed the file in …

seo web-hosting robots.txt
robots.txt and .htaccess syntax highlight

Is there a way to color-code/highlight robots.txt and .htaccess syntax? E.g. with a SublimeText2 plug-in. I found …

.htaccess sublimetext2 robots.txt
How to ban the crawler 360Spider with robots.txt or .htaccess?

I've got a problem because of 360Spider: this bot makes too many requests per second to my VPS and slows …

.htaccess search-engine web-crawler bots robots.txt
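A robots.txt entry (`User-agent: 360Spider` / `Disallow: /`) only helps if the bot obeys it; a bot that is hammering a VPS usually does not. An `.htaccess` sketch that rejects the bot at the server (assumes Apache with mod_rewrite enabled; the match is on the User-Agent header, which a bot can spoof):

```
# Block requests whose User-Agent contains "360Spider" (case-insensitive)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} 360Spider [NC]
RewriteRule .* - [F,L]
```

The `[F]` flag returns 403 Forbidden, so the bot gets a cheap error response instead of full pages.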
Is it possible to control crawl speed with robots.txt?

We can tell bots to crawl or not to crawl our website in robots.txt. On the other hand, we …

search-engine robots.txt google-crawlers
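The usual answer is the non-standard `Crawl-delay` directive, which asks a bot to wait a given number of seconds between requests:

```
User-agent: *
Crawl-delay: 10
```

Hedge: support varies by crawler. Bing and Yandex have honored `Crawl-delay`, while Googlebot ignores it; Google's crawl rate is configured through Search Console instead of robots.txt.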
How to restrict a site from being indexed

I know this question has been asked many times, but I want to be more specific. I have a development …

.htaccess search indexing robots.txt
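For a development site, a robots.txt disallow only asks politely, and pages can still appear in results if they are linked elsewhere. A stronger sketch via `.htaccess` (assumes Apache with mod_headers enabled) sends an `X-Robots-Tag` header on every response, which tells compliant search engines not to index anything:

```
# Mark every response on this host as non-indexable
Header set X-Robots-Tag "noindex, nofollow"
```

For a truly private development site, HTTP auth or an IP allow-list in `.htaccess` is the reliable option, since it blocks crawlers rather than instructing them.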