Why You Need a Robots.txt File For Your Website & How to Create One

If you’re a website owner and you’ve never heard of a “robots.txt” file, you’re not alone.

A robots.txt file is a plain text file that gives instructions about your website to web robots (does the term “Googlebot” sound familiar?). These instructions follow a standard officially called the Robots Exclusion Protocol.

When a web robot first visits your site to crawl and index your pages, it looks for this file before anything else. If there are pages, files, or folders that you don’t want crawled and returned in search results, you list them in your robots.txt file.

And to answer your question, yes, there are always files, folders & pages that you don’t want crawled and indexed.

While Google says that you only need a robots.txt file if you have pages you don’t want crawled, having one in place is a simple, low-effort step that helps keep the Google Gods happy.

How to Create a robots.txt File

If you have a WordPress site, there are 2 ways to create a robots.txt file.

The first is to create it manually with a text editor, such as Notepad. Your file would look something like this:

User-agent: *

Disallow: /images

Disallow: /downloads

Disallow: /subscriber-thank-you

Disallow: /subscription-confirmation

Disallow: /privacy-policy

Disallow: /terms-conditions

Allow: /wp-content/uploads/

You would use the ‘Allow’ directive when there is a folder or file you want search engines to access that sits inside an already disallowed parent folder or directory.
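If you want to sanity-check your rules before uploading, a quick sketch using Python’s built-in robots.txt parser can show how a well-behaved crawler would read them. The rules and the example.com URLs below are placeholders, not taken from the post; note that Python’s parser applies the first matching rule, so the ‘Allow’ line is listed before the broader ‘Disallow’ here (Google instead uses the most specific match, so order matters less for Googlebot).

```python
# Sanity-check robots.txt rules locally with Python's standard library.
# The rules and URLs are illustrative placeholders.
import urllib.robotparser

rules = """\
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/
Disallow: /images
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A path under a disallowed folder: crawlers should skip it.
print(parser.can_fetch("*", "https://example.com/images/logo.png"))           # False
# An allowed subfolder inside an otherwise disallowed parent.
print(parser.can_fetch("*", "https://example.com/wp-content/uploads/a.jpg"))  # True
```

This only tells you how the rules parse; it doesn’t check that the file is actually reachable on your live site.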

Once you have created your robots.txt file, save it with that exact file name and upload it to the root directory of your website, using an FTP (File Transfer Protocol) program such as FileZilla. If you don’t have FileZilla or another FTP program, most hosting accounts allow this type of file uploading through their own user dashboards as an easy alternative.
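As a minimal sketch of the “exact file name” point, here is how you might write the rules out programmatically before uploading; the rule lines are just examples, and the file must be named exactly robots.txt, all lowercase.

```python
# Write example rules to a file with the exact required name, "robots.txt",
# ready to be uploaded to the root directory of your site via FTP or your
# host's dashboard. The rules themselves are illustrative.
rules = [
    "User-agent: *",
    "Disallow: /images",
    "Disallow: /downloads",
]

# The filename must be exactly "robots.txt" (all lowercase), and the file
# must end up at the site root, e.g. https://example.com/robots.txt
with open("robots.txt", "w") as f:
    f.write("\n".join(rules) + "\n")
```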

Manually creating your robots.txt file can also be done for any website, not just a WordPress site.

If you do have a WordPress site, you can also use the feature in many SEO plugins, such as All In One SEO Pack.

If you have this plugin installed, in your WordPress dashboard navigate to All in One SEO > Feature Manager and activate the Robots.txt & File Editor features.

[Screenshot: robots.txt file in All in One SEO]

The Robots.txt feature allows you to create your file and add instructions. Once the file has been created, it can be edited using the File Editor feature and then saved.

Don’t forget to go back to your file after making changes to your website structure so that you can review it and update it as necessary to maintain optimum SEO for your website.

Have questions or comments about adding this to your website? Please leave them in the comment section below!
