Google SEO insight

Earlier this week Matt Cutts of Google published a very interesting and informative entry on his blog, Matt Cutts: Gadgets, Google and SEO. The article, Q & A thread: March 27, 2006, is a series of questions and answers arising from comments left by readers of his blog.

By the end of this week the Bigdaddy upgrade should be complete across all Google data centres. Bigdaddy has introduced a new Googlebot, so if you’re checking the visitor stats for your web site, the Googlebot now reports a User-Agent of ‘Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)’. Now that things are settling down after Bigdaddy, Matt is going to request that a new set of PageRanks is made visible in the next couple of weeks, subject to there being no logistical obstacles.
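If you want to confirm for yourself that the new Googlebot is visiting, a quick way is to scan your server’s access log for the new User-Agent string. The sketch below is only an illustration, assuming a combined-format access log at a hypothetical path (access.log); adjust the path and parsing to match your own stats setup.

    # Rough sketch: count visits from the new Bigdaddy-era Googlebot by
    # scanning a combined-format access log. The log path is an assumption;
    # point it at your own server's access log.
    NEW_GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    OLD_GOOGLEBOT_PREFIX = "Googlebot/2.1"

    new_hits, old_hits = 0, 0
    with open("access.log") as log:
        for line in log:
            if NEW_GOOGLEBOT_UA in line:
                new_hits += 1    # new Mozilla/5.0-style User-Agent
            elif OLD_GOOGLEBOT_PREFIX in line:
                old_hits += 1    # older-style Googlebot User-Agent

    print("New Googlebot visits:", new_hits)
    print("Other Googlebot visits:", old_hits)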

An interesting point made was that one of the “classic crawling strategies that Google has used is the amount of PageRank on your pages”. This means that even if your site has been up and running for a couple of years, or you actively use a sitemap, Google won’t automatically crawl every page. PageRank has been used to help Google’s systems decide how often, and how deep, the robots should look at your web site. Matt mentions that getting good quality links would probably help Google know to crawl your site deeper, which is part of our SEO subscription on both SEO4Biz and SEO2U.

Another important comment made at this point was about parameters on URLs. Traditionally the advice has been to avoid parameters, and therefore dynamic pages, unless you really have to use them. Matt states that you should review the URLs of your unindexed pages and see if they have lots of parameters, as Google “typically prefer URLs with 1-2 parameters”. This means dynamic pages are OK, but the site needs to be set up so there aren’t excessive parameters. The links to the unindexed pages may also be causing the issue: they should be easy-to-follow static text links, with no Flash, JavaScript, AJAX, cookies or frames, as all of these are difficult for the robots to follow.
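To put the “1-2 parameters” guideline into practice it can be worth auditing your own URLs. The sketch below is purely an illustration, assuming you have a plain text file of URLs (perhaps exported from your sitemap) at a hypothetical path urls.txt; it simply flags addresses carrying more than two query-string parameters.

    # Rough sketch: flag URLs with more than two query-string parameters,
    # following Matt's comment that Google "typically prefer URLs with
    # 1-2 parameters". urls.txt is an assumption: one URL per line.
    from urllib.parse import urlparse, parse_qs

    MAX_PARAMS = 2

    with open("urls.txt") as f:
        for url in (line.strip() for line in f if line.strip()):
            params = parse_qs(urlparse(url).query)
            if len(params) > MAX_PARAMS:
                print(f"{len(params)} parameters: {url}")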

A question was asked about the best way to set up sites with content in different languages. Should one use several TLDs (Top Level Domains) such as mysite.com, mysite.de, mysite.fr, etc.; subdomains such as de.mysite.com and fr.mysite.com; or keep the one TLD and simply extend the folder structure, e.g. mysite.com/de and mysite.com/fr? This question also extends to setting up separate semi-related sites alongside your main domain. For example, if you are selling new widgets on your main site and decide to start selling second-hand widgets that you would like to advertise as a separate web site with links between the two, subdomains would be ideal. Although the content will be small, Google will recognise it as part of the same TLD and not treat it as a portal or doorway page. Matt recommends starting with subdomains when you only have a small amount of information to publish for that particular language or section of the site, and moving to separate domains once you have a substantial amount of content.
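As a quick illustration of the three structures being compared, the hypothetical sketch below builds the address of the same page for a given language code under each approach. The domain names are placeholders, not a recommendation of one layout over another, and for simplicity the example treats the language code and the country-code TLD as the same.

    # Rough sketch: the same "products" page for German visitors under the
    # three structures discussed above. mysite.com is a placeholder domain.
    def language_urls(lang, path):
        return {
            "separate TLD": f"http://mysite.{lang}/{path}",      # e.g. mysite.de
            "subdomain":    f"http://{lang}.mysite.com/{path}",  # e.g. de.mysite.com
            "subfolder":    f"http://mysite.com/{lang}/{path}",  # e.g. mysite.com/de
        }

    for layout, url in language_urls("de", "products").items():
        print(f"{layout}: {url}")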

It’s always useful to take note of the information Matt slips into his blog. There is more on his site, and we would strongly recommend that anyone responsible for the web site in your business takes some time out to read his comments and those of his readers.

At Futuresys we always take time to review what’s happening in the world of search engine optimisation. We use this information to help our customers succeed without resorting to black hat techniques that would ultimately get them knocked out of the search engine indexes. After all, your business depends on our ability to make your web site successful, and our business depends on your web site being successful.