
Simple SEO Fix: Robots.txt File

The robots.txt file is a small text file placed on your website server that tells web crawlers such as Googlebot or Bingbot whether or not they may access a file. Improper use of robots.txt can hurt your SEO ranking because the file controls how search engine spiders see and interact with the pages on your website.
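For reference, a simple robots.txt might look something like the sketch below. The /admin/ path and the sitemap URL are only illustrative assumptions, not rules your site necessarily needs.

    # Applies to all crawlers
    User-agent: *
    # Keep crawlers out of a private section; everything else stays crawlable
    Disallow: /admin/
    # Optional: point crawlers to your XML sitemap
    Sitemap: https://www.connect4consulting.com/sitemap.xml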

Do I have a robots.txt file?

You can check this from any browser. The robots.txt file always lives in the same place, the root directory of the website, so it’s easy to determine whether a site has one. Just add “/robots.txt” to the end of the domain name as shown below.

www.connect4consulting.com/robots.txt

If you have a file there, it is your robots.txt file. You will either find a file with words in it, a blank file, or no file at all.

Is your robots.txt file blocking important files?

While your developer is building your website, he or she will often add a rule to the robots.txt file to make sure the unfinished site isn’t indexed. That rule is “User-agent: *” followed by “Disallow: /”. If your robots.txt file still says this, you won’t appear in Google’s organic search results.
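As a reference point, here is what that blocking file looks like in full, next to a permissive version that lets crawlers reach the whole site. This is a minimal sketch; your own file may contain additional rules.

    # Blocks every crawler from the entire site - you will not appear in search results
    User-agent: *
    Disallow: /

    # Allows every crawler to access the entire site (an empty Disallow blocks nothing)
    User-agent: *
    Disallow: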

What to do next:

  • If you see “Disallow: /”, talk to your developer immediately. There could be a good reason it’s set up that way, or it may be an oversight. If there is a path after the slash (for example, “Disallow: /admin/”), only that section is blocked and the robots.txt file could be set up correctly, but it still warrants a discussion with your developer.
  • If you have a complex robots.txt file, as many ecommerce websites do, review it line by line with your developer to make sure it’s correct. A sketch of what such a file can look like follows below.
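For context, a more complex ecommerce robots.txt might look roughly like this. Every path and the sitemap URL here are hypothetical, which is exactly why a line-by-line review with your developer matters. The * wildcard in paths is honored by Googlebot and Bingbot but is not part of the original robots.txt standard.

    # Hypothetical ecommerce example - review every rule with your developer
    User-agent: *
    # Keep checkout and account pages out of search results
    Disallow: /checkout/
    Disallow: /account/
    # Avoid crawling filtered duplicate pages (wildcard supported by Google and Bing)
    Disallow: /*?sort=
    # Product and category pages remain crawlable by default
    Sitemap: https://www.example.com/sitemap.xml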