We’ve all experienced the frustration of trying to print an important web page or form. Some web designers have felt our pain and created duplicate pages that are “print-friendly.” Unfortunately, these duplicates aren’t great for SEO: the search engines get confused trying to determine which version of your content to serve up to searchers in their results. There are other negative effects as well, depending on the size of your site and how you’ve structured it. In my CNET article I highlight this scenario:
For example, let’s say that you have a Web site that has 1,000 pages, a small- to moderate-size site, depending on your perspective. Now, because you’ve taken advantage of your CMS’s ability to automatically create a “print this” link on each page to a printer-friendly version, for all practical purposes, your site just doubled to 2,000 pages. But what if your PageRank isn’t high enough to warrant very rapid spidering? It could take a lot longer for all your pages to get indexed.
For more about this situation, and solutions for avoiding potential duplicate content issues, read my blog post on CNET: Searchlight.
Great topic you bring up with the duplicate content issues. It’s true that printer-friendly versions can double the number of pages from 1,000 to 2,000. But can’t you put the printer-friendly pages in their own directory and use robots.txt to exclude them?
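As a sketch of that approach: assuming your CMS writes the printer-friendly versions into a single directory (here a hypothetical /print/ directory), a robots.txt file at the site root could tell compliant crawlers to stay out of it:

```
User-agent: *
Disallow: /print/
```

The Disallow rule matches by path prefix, so a URL like /print/article-1.html would be covered as well. One caveat worth noting: robots.txt only prevents crawling, so a disallowed URL that other pages link to can still show up in search results as a bare, unindexed listing.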