On Tuesday I gave two presentations at the Shop.org Annual Summit. First was “Natural Search Tactics for the Retailer” with fellow panelists Ken Jurina from Epiar, Jenny Schlueter from Dell, and Ian McAnerin from McAnerin International. It was a very tactical session – focused on tools, tips and techniques. Ken covered keyword research, Jenny covered content optimization, Ian covered technical optimization, and I covered link building.
Here are a few key points from the session…
- Mine multiple data sources for keyword data, such as Google Keyword Tool, Trellian Keyword Discovery, Google Webmaster Tools, Google Suggest, Google Trends, internal site search logs, referrer logs, and PPC broad match.
- Determine keyword difficulty with the SEOmoz Keyword Difficulty Tool.
- Useful link analysis/link building tools include LinkResearchTools.com, Majestic, and Moz Link Explorer.
- Types of links that are likely to get discounted include: reciprocal links, affiliated sites (on the same IP range or hostname), footer links (at the bottom of the page), site-wide links, links contained on a page called links.htm / links.asp, and links with the exact same anchor text. Remember that the more links on the linking page, the less PageRank you’ll get.
- Review your existing links using a link analysis tool like Majestic or LinkResearchTools.com and contact those webmasters who link to you with suboptimal anchor text. Focus on the highest value links where you have rapport or influence with the webmaster.
- If you have multiple servers, rotating IP addresses (load balancing) can make it look to Google like you have duplicate copies of your site, and edge computing can cause geolocation issues. The fix is to detect spiders and send them to a canonical site version.
- Use the country code TLD for a country, use a subdomain for language or major group, and use subdirectories for topics. e.g. language.company.ccTLD/topic/page.htm
- A gTLD (.com, .net) is almost always geolocated via IP address. A ccTLD (.ws, .la, .tv) overrides IP geolocation. The duplication problem does not affect clearly geolocated sites. Always declare the language and character encoding on your web pages.
- Scripts that build links on the fly, AJAX, and Flash are bad for SEO. An iframe is treated as a separate page; server-side includes are better. Use a spider simulator such as SEO-Browser to test your site.
- Search engines strip most HTML code out of a document before parsing, so most HTML validation errors do not affect rankings. The exception: some errors, such as a missing “<”, can kill the indexing of your page. Best practice is to always validate your code.
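The load-balancing point above (sending spiders to a single canonical version of the site) can be sketched in a few lines. This is a simplified illustration, not production code: the crawler user-agent fragments and the `www.example.com` canonical host are placeholders, and a real deployment would also verify crawlers via reverse DNS rather than trusting the User-Agent string.

```python
# Simplified sketch: route search-engine spiders to one canonical hostname,
# regardless of which load-balanced mirror/IP they happened to hit.
# Bot tokens and the canonical host below are illustrative assumptions.

SPIDER_TOKENS = ("googlebot", "bingbot", "slurp")  # common crawler UA fragments
CANONICAL_HOST = "www.example.com"

def is_spider(user_agent: str) -> bool:
    """Crude User-Agent sniff; real code should confirm with a reverse DNS lookup."""
    ua = user_agent.lower()
    return any(token in ua for token in SPIDER_TOKENS)

def host_for_request(user_agent: str, requested_host: str) -> str:
    """Serve crawlers the canonical host so mirrors aren't indexed as duplicates."""
    return CANONICAL_HOST if is_spider(user_agent) else requested_host

print(host_for_request("Mozilla/5.0 (compatible; Googlebot/2.1)", "mirror2.example.com"))
# -> www.example.com
```

In practice this logic usually lives at the load balancer or in a server rewrite rule rather than in application code, but the decision is the same: one canonical hostname for crawlers.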
The PowerPoint, which includes all four presentations, is available for download here.
The second session I presented was with Amy Africa; it was a site clinic session where Amy and I did impromptu critiques of audience members’ websites. Amy covered usability and conversion; I covered SEO. It was a lot of fun. There were no PowerPoints for that session.
Great stuff Stephan! The way you share information in easy to grasp bits is very nice. Quick question, when you say “Remember that the more links on the linking page, the less PageRank you’ll get.” I’m a bit confused. Is this a reference to Google not liking a page that has too many links, or is there an issue with your linking page “sharing away” your link love, which impacts your own site’s PR?
Affan, indeed you are correct – links from inside an individual blog post are better than site-wide links. This is because Google prefers links that appear to be earned by merit.
Craig, being 1 link out of 10 total links on somebody’s page is better than being 1 out of 100, because the PageRank on that person’s page is divvied up amongst all the links on the page.
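The dilution Craig is asking about can be shown with a toy calculation. This uses the classic simplified PageRank model, where a page passes its PageRank (after the damping factor) divided equally among its outbound links; the numbers are made up purely for illustration.

```python
# Toy illustration of PageRank dilution: under the simplified classic model,
# a page's PageRank is split equally among its outbound links.
# page_pagerank values here are arbitrary example numbers.

DAMPING = 0.85  # standard damping factor in the original PageRank formula

def pagerank_passed_per_link(page_pagerank: float, outbound_links: int) -> float:
    """PageRank each outbound link passes on, in the simplified model."""
    return DAMPING * page_pagerank / outbound_links

# Same linking page, different numbers of links on it:
print(pagerank_passed_per_link(1.0, 10))   # ~0.085 per link
print(pagerank_passed_per_link(1.0, 100))  # ~0.0085 per link
```

So, all else being equal, your link on the 10-link page passes roughly ten times the PageRank of the same link on the 100-link page.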