For several years, there was a website in Fort Collins that was making a horrendous mistake. This company is an icon in Fort Collins; one of the most well-known names in town – and no, I’m not going to tell you who.

There’s a very important file at work when it comes to SEO: your robots.txt file. It tells Google (and other search engines) where it can and cannot go on your website. If you say, “Don’t visit this page,” it won’t. This is done with a single line of code. You have to be extra careful, though, because one simple typo can deindex your entire site. Oops…

Here are the contents of a basic robots.txt file:

User-agent: *
Disallow:

This is saying, “Attention all search engines and other web crawlers: You can view everything on my site.”

Here is how to properly tell Google not to go somewhere on your site:

User-agent: *
Disallow: /private/

This is saying, “Attention search engines and other web crawlers: Don’t go in my private directory!”
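If you want to double-check a rule like this before it goes live, Python’s built-in urllib.robotparser module can tell you exactly what a given set of rules allows. Here’s a minimal sketch using the rules above (the example.com domain and the /private/page.html path are just hypothetical examples):

from urllib import robotparser

# The rules from the example above
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# "*" means any crawler; the URLs are hypothetical examples
print(parser.can_fetch("*", "https://www.example.com/"))                   # True  - homepage is crawlable
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False - the private directory is blocked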

Here is where some people (like the website I mentioned above) make a serious mistake:

User-agent: *
Disallow: /

This is saying, “Attention everyone: stay out of my website altogether.”

This one simple slash told Google not to pay attention to anything on the site at all. This website did not appear in Google for years. Kind of a big deal, wouldn’t you say?
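One way to catch a mistake like this early is a quick automated check: fetch your live robots.txt and confirm that your homepage is still crawlable. Here’s a minimal sketch, again using Python’s built-in urllib.robotparser (the example.com URL is a placeholder – swap in your own domain):

from urllib import robotparser

# Placeholder domain - use your own site here
site = "https://www.example.com"

parser = robotparser.RobotFileParser()
parser.set_url(site + "/robots.txt")
parser.read()  # downloads and parses the live robots.txt

# If can_fetch() returns False for your homepage, your robots.txt
# is telling Google to stay out of your site entirely
if parser.can_fetch("Googlebot", site + "/"):
    print("Homepage is crawlable - looks good.")
else:
    print("WARNING: robots.txt is blocking your homepage!")

Running a check like this after every site update would have caught the stray slash years earlier.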

In short, if you don’t know what you’re doing with a robots.txt file, don’t mess with it. If you’d like help with setting one up, give us a call!
