What is a robots.txt file?

Robots.txt is a text file that tells search engine crawlers not to crawl certain pages or sections of a website. Most major search engines (including Google, Bing, and Yahoo) recognize and honor robots.txt rules.

Technical robots.txt syntax

User-agent: The specific web crawler to which you're giving crawl instructions (usually a search engine).

Disallow: The command telling a user-agent not to crawl a particular URL or path.
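As a minimal illustration of the two directives above, the following robots.txt tells every crawler (`*`) to stay out of one directory while leaving the rest of the site crawlable (the `/admin/` path is just an example, not a required name):

```
# Applies to all crawlers
User-agent: *
# Do not crawl anything under /admin/
Disallow: /admin/
```

The file must be served as plain text at the root of the site (e.g. https://example.com/robots.txt) for crawlers to find it.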