Foliovision

Making the web work for you

Using robots.txt to avoid CMS Duplicate Content penalties from Google

2 December 2006 / Alec Kinnear / Leave a Comment

Many people are focused on duplicate content penalties. I haven't seen duplicate content be as big a problem as people make it out to be, but here are some very helpful tips on handling vBulletin to reduce any chance of duplicate indexing.

1) Disable the "search engine friendly" archive. All it does is create a duplicate copy of your entire site, which is exactly what we are trying to avoid.

2) Add the following entries to your robots.txt file. This will stop bots from crawling pages they don't need to crawl:

Code:
User-agent: *
Disallow: /archive/
Disallow: /attachments/
Disallow: /calendar.php
Disallow: /clientscript/
Disallow: /cpstyles/
Disallow: /customavatars/
Disallow: /customprofilepics/
Disallow: /images/
Disallow: /includes/
Disallow: /login.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /private.php
Disallow: /register.php
Disallow: /sendmessage.php
Disallow: /sendpm.php

3) Eliminate the " « Previous Thread | Next Thread » " breadcrumb at the bottom of threads. It can be found near the bottom of the "SHOW THREADS" template. The problem is that these two links create two additional copies of every thread (e.g. /forum/showthread.php?t=87654&goto=nextoldest). This is bad, very bad. And how many people even notice these links exist, let alone use them? Oh, and yes, DP needs to kill these two links.
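If editing the template is not an option, there is a possible robots.txt complement: Googlebot (though not every crawler) honors wildcard patterns, so the duplicate goto URLs could be blocked directly. The nextnewest value is my assumption as the counterpart of nextoldest, not something from the tip above:

Code:
User-agent: Googlebot
Disallow: /*goto=nextoldest
Disallow: /*goto=nextnewest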

This is good advice for any CMS. Just use the robots.txt file to handle the principal crawlers and keep one’s site out of trouble. Some of these shopping sites should do the same thing with their multiple access points to the same material.
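Before deploying rules like these, it is worth checking that they block what you think they block. A minimal sketch using Python's standard-library robots.txt parser; the sample paths and the thread URL are hypothetical:

```python
# Check robots.txt rules locally with the standard library, no web server needed.
from urllib import robotparser

rules = """User-agent: *
Disallow: /archive/
Disallow: /login.php
Disallow: /newreply.php
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under a Disallow prefix are blocked for the matching user agent...
print(rp.can_fetch("*", "/archive/t-12345.html"))    # False
# ...while ordinary thread pages remain crawlable.
print(rp.can_fetch("*", "/showthread.php?t=87654"))  # True
```

The same check works against a live site by replacing `parse()` with `set_url("https://example.com/robots.txt")` followed by `read()`.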

Alec Kinnear

Alec has been helping businesses succeed online since 2000. Alec is an SEM expert with a background in advertising, as a former Head of Television for Grey Moscow and Senior Television Producer for Bates, Saatchi and Saatchi Russia.


Categories: SEO

Related Posts

  1. How to move an old website to a new site address and retain Google rankings
  2. FV WP Link Robot Installation
  3. FV Simpler SEO: Google Authorship



All materials © 2021 Foliovision s.r.o. | Panská 12 - 81101 Bratislava - Slovakia | info@foliovision.com