I’m here at Search Engine Strategies Chicago, and today at the “Meet the Crawlers” session I asked the distinguished panel of representatives from the four major search engines the question:
What is your current official position on simplifying the URLs selectively for bots like Googlebot, Yahoo Slurp, etc. by user-agent detection in order to drop session IDs and other superfluous parameters from the URL? Do you consider it cloaking? And if so, is it good cloaking or bad cloaking?
The panel, which included Ramez Naam from MSN Search, Tim Mayer from Yahoo!, Charles Martin from Google, and Kaushal Kurapati from Ask Jeeves, gave me and the audience their definitive answer. But before they did, Ramez from MSN Search asked for clarification:
Will the same page content display to the user if that user types into their browser the URL that was given to the bot?
I responded with a “Yes,” and then all four search engine representatives confirmed individually:
No problem.
Then Charles Martin from Google jumped in again with:
Please do that!
So there you have it. Whether or not you call this technique cloaking, the search engines don’t mind it, and in fact encourage it!
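To make the technique concrete, here is a minimal sketch of selective URL simplification in Python. Everything in it is an assumption for illustration: the bot signatures, the parameter names treated as superfluous, and the function names are all hypothetical, and a real site would apply this in its link-generation or rewrite layer. The key property the panel asked about holds: the page content is identical either way, and only the URL handed to the crawler changes.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical, non-exhaustive list of crawler User-Agent substrings.
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot", "teoma")

# Hypothetical set of parameters considered superfluous for crawlers.
SUPERFLUOUS_PARAMS = {"sessionid", "sid", "phpsessid"}

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string matches a known crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def simplify_url(url: str) -> str:
    """Drop session IDs and other superfluous parameters from a URL."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in SUPERFLUOUS_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

def url_for_request(url: str, user_agent: str) -> str:
    """Give bots the simplified URL; leave it unchanged for human visitors."""
    return simplify_url(url) if is_bot(user_agent) else url
```

For example, a Googlebot request for `http://example.com/page?id=7&sessionid=abc123` would be handed `http://example.com/page?id=7`, while a regular browser would see the original URL untouched.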
Actually, that’s old news, but since duplicate content became really painful it’s good to bump it. Overdone white-hat SEO ‘ethics’ have gotten so many sites tanked …
No surprise there. Did anyone actually think that was something one shouldn’t do?
Hi Jill. This is something we’ve been doing and advising clients to do for a long time. I’ve heard SEOs say that this is cloaking and that cloaking is dangerous, so I just wanted to set the record straight.