Create a Robots.txt File to Avoid Duplicate Content and Boost WordPress SEO

Robots.txt is a simple text file that you upload to your server root. It contains instructions for search engines crawling your site, e.g. follow this link, don't index that directory, and so on.
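As a sketch, a WordPress-oriented robots.txt might look something like the following. This is only an illustrative example, assuming a standard WordPress install at the site root and a sitemap at /sitemap.xml; adjust the paths to match your own setup.

```
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of admin and core directories
Disallow: /wp-admin/
Disallow: /wp-includes/

# Allow the admin-ajax endpoint, which front-end features may rely on
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your sitemap (assumed location)
Sitemap: https://example.com/sitemap.xml
```

Blocking directories like /wp-admin/ helps keep non-content pages out of search results, which is one way duplicate or low-value content gets filtered before it is ever indexed.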

You can get help creating a WordPress-specific robots.txt file here: Robots.txt For WordPress.

Discussion is open below in the comments if you want to add to this tip.

Want to Submit a Daily Tip to WPMUDEV?

If you’ve got a great tip for WordPress, WordPress Multisite, or BuddyPress users, send it our way on Twitter: @wpmudev and we’ll happily credit you. Create a tweetable title and let us know if you have more info or an article you’d like to link it to.
