How to configure robots.txt to allow everything?

Raajpoot · Nov 25, 2010

My robots.txt in Google Webmaster Tools shows the following values:

User-agent: *
Allow: /

What does it mean? I don't have much knowledge about robots.txt, so I'm looking for your help. I want to allow all robots to crawl my website; is this the right configuration?

Answer

Jim · Nov 25, 2010

That file will allow all crawlers access:

User-agent: *
Allow: /

This basically grants all user agents (the *) access to all parts of the site (the /).
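
If you prefer to stick to the original robots.txt standard (which does not define Allow, although it is widely supported by major crawlers such as Googlebot and Bingbot), the equivalent way to permit everything is an empty Disallow rule:

User-agent: *
Disallow:

An empty Disallow value means "nothing is disallowed", so the whole site may be crawled. Having no robots.txt at all has the same effect: crawlers that find no restrictions will crawl everything.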