Many people are focused on duplicate content penalties. I haven't seen duplicate content cause as big a problem as people make it out to be, but here are some helpful tips for configuring vBulletin to reduce any chance of duplicate indexing.
1) Disable the "search engine friendly" archive. All it does is create a duplicate copy of your entire site, which is exactly what we're trying to avoid.
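If the archive has already been indexed, it's usually better to 301 those URLs back to the real pages than to let them die as 404s. A minimal sketch, assuming Apache with mod_rewrite and the default vBulletin 3 archive URL format (adjust the patterns to match your install):
Code:
RewriteEngine On
# Send archived thread and forum pages back to their full versions
RewriteRule ^archive/index\.php/t-([0-9]+)\.html$ /showthread.php?t=$1 [R=301,L]
RewriteRule ^archive/index\.php/f-([0-9]+)\.html$ /forumdisplay.php?f=$1 [R=301,L]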
2) Add the following entries to your robots.txt file. This will stop bots from crawling pages they don't need to crawl:
Code:
User-agent: *
Disallow: /archive/
Disallow: /attachments/
Disallow: /calendar.php
Disallow: /clientscript/
Disallow: /cpstyles/
Disallow: /customavatars/
Disallow: /customprofilepics/
Disallow: /images/
Disallow: /includes/
Disallow: /login.php
Disallow: /newreply.php
Disallow: /newthread.php
Disallow: /private.php
Disallow: /register.php
Disallow: /sendmessage.php
Disallow: /sendpm.php
3) Eliminate the " « Previous Thread | Next Thread » " links at the bottom of threads. These can be found near the bottom of the SHOWTHREAD template. The problem is that these two links create two additional copies of every thread (e.g. /forum/showthread.php?t=87654&goto=nextoldest). That's bad, very bad, and how many people even notice these links exist, let alone use them? Oh, and yes, DP needs to kill these two links.
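If you'd rather not edit templates, another option is to block the goto variants at the crawl level. This is a sketch that relies on wildcard support in robots.txt, which Googlebot honors but not every crawler does:
Code:
User-agent: Googlebot
Disallow: /*goto=
Template surgery is still the cleaner fix, since it removes the duplicate URLs for every crawler rather than just Google's.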
This is good advice for any CMS: use robots.txt to steer the principal crawlers and keep your site out of trouble. Some shopping sites should do the same with their multiple access points to the same material.
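For example, a store that serves the same product list under sort-order and session parameters could block those variants. The parameter names here are hypothetical, so substitute whatever your platform actually appends:
Code:
User-agent: *
Disallow: /*sort=
Disallow: /*sessionid=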
Alec Kinnear
Alec has been helping businesses succeed online since 2000. Alec is an SEM expert with a background in advertising, having served as Head of Television for Grey Moscow and Senior Television Producer for Bates, Saatchi and Saatchi Russia.