What is a robots.txt file?
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. In short, it tells a bot whether it should visit a particular page. The robots.txt file is one of the key concepts of Search Engine Optimization (SEO), and more specifically of Technical SEO.
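As an illustration, a minimal robots.txt (served from the site root, e.g. https://example.com/robots.txt) might look like this; the `/private/` path is a hypothetical example:

```
User-agent: *
Disallow: /private/
Allow: /
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow` blocks paths beginning with `/private/`, and `Allow: /` permits everything else.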
Want to know more key concepts of Search Engine Optimization (SEO)? Click here.
What is robots.txt used for?
robots.txt is used primarily to manage crawler traffic to your site, for example to keep crawlers away from pages you don't want requested.
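To sketch how a well-behaved crawler applies these rules, Python's standard-library `urllib.robotparser` can evaluate a robots.txt against specific URLs. The rules and the example.com URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for an example site
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved crawler checks each URL before requesting it
print(parser.can_fetch("*", "https://example.com/about"))        # allowed
print(parser.can_fetch("*", "https://example.com/private/data")) # disallowed
```

Note that robots.txt is advisory: compliant crawlers check it before each request, but it is not an access-control mechanism.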