In an article I co-authored for Multichannel Merchant entitled “Red Envelope’s Website Critique,” I went into detail about the site’s functionality and made an interesting discovery while reviewing the site.
The category and subcollection pages are not making it into the search engines at all — not because of their spider-unfriendly URLs, but because they are being specifically blocked through “disallow” directives in the site’s robots.txt file. Robots.txt is the place where you can give commands to Googlebot and the other spiders, such as “stay away from this directory” or “stay away from this file type.”
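To illustrate, a robots.txt file of this kind might look like the sketch below. The paths and file type shown are hypothetical examples, not Red Envelope’s actual directives:

```
# Hypothetical robots.txt illustrating "disallow" directives
User-agent: *
# "Stay away from this directory" -- blocks everything under /category/
Disallow: /category/
Disallow: /subcollection/
# "Stay away from this file type" -- blocks PDF files (wildcard syntax
# honored by Googlebot and most major crawlers)
Disallow: /*.pdf$
```

Blocking category pages this way keeps them out of the index entirely, which is why they never appear in search results.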
For more interesting details on this site critique co-written with Amy Africa, just follow the link above.