Why Does Your Coding Platform Matter?

Because our team is a mixture of marketers, analysts, and programmers, we debate almost daily the merits and drawbacks of one web platform over another, and whether really good PPC or off-page SEO can make up for poor development and design. Then we ran across this case study:

… I have recently been working with a search marketer in the travel industry who has been trying to expand the focus of their site from one particular destination to a global travel portal. The regional site does well, generating between 60 and 100 clicks per day at an average CPC of only $0.12 and a margin of about 30%.

This particular marketer realized that if he could duplicate the same success on a global scale, he stood to generate huge revenues (and profits). His company created a brand-new design focused on a global market. Realizing that the site could quickly become an enterprise application, they chose to rearchitect it using .NET technologies and AJAX.

The new site is amazingly informative and far more usable than the old PHP site it replaced. Test users agreed it was a big improvement.

However, Google didn’t.

We’d be kidding ourselves if we said “so what” at this point… the plain fact of search engine marketing is that if Google ain’t happy, ain’t nobody happy. Time and again, we’re presented with designs or redesigns that developers claim are “designed for maximum conversions,” yet all of the text that pertains to their search terms is completely invisible to a spider unless a visitor actively clicks a page element, and spiders do not click anything. At least in this case, they took advantage of the redesign to test their product.
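If you want to sanity-check that claim on your own pages, a quick script along these lines will do it (a sketch only, assuming Python with the requests and BeautifulSoup libraries; the URL and phrases are placeholders, not taken from the case study). It fetches the page without executing any JavaScript, which is roughly what a spider gets, and reports whether your key phrases show up in the visible text.

    # Minimal sketch: fetch a page the way a simple crawler does (no JavaScript)
    # and check whether the phrases you are bidding on survive in the visible text.
    # The URL and phrases below are placeholders, not from the case study.
    import requests
    from bs4 import BeautifulSoup

    URL = "https://www.example-travel-site.com/rome-hotels"
    PHRASES = ["rome boutique hotels", "trastevere"]

    html = requests.get(URL, timeout=10).text      # raw HTML only; no script execution

    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()                            # strip code, keep only renderable text
    visible = soup.get_text(" ", strip=True).lower()

    for phrase in PHRASES:
        print(phrase, "->", "found" if phrase in visible else "MISSING without JavaScript")

If a phrase only appears after a click or after a script runs, it simply isn’t there as far as the spider is concerned.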

The company ran three test campaigns, including an exact duplicate of its previous PPC campaign. The average CPC rose to $0.30 and traffic declined to fewer than 10 clicks per day (across three cities, not just one). The new site is a complete bust. Even though it provides a great user experience, it doesn’t match Google’s idea of what makes a great site, and so it doesn’t get the traffic.

The gory details are even better… what many people seem to forget is that you get one shot with a bot. Either a spider can crawl your site or it can’t. It can’t enable JavaScript and try again. It can’t relax its security settings and try again. It won’t wait patiently for all of your code to load; if the page takes too long, the bot moves on and you’ve missed your opportunity. Here’s a look at the initial damage the new design racked up simply by migrating to a more technically advanced platform:

  • The average page size grew from 47 KB to 375 KB, roughly an eight-fold increase.
  • The new pages rely heavily on JavaScript to render correctly. Googlebot does not process JavaScript, so to test the effect of this on rendering, we turned JavaScript off and loaded both the old and new sites. The old site rendered correctly; some of the advanced search functionality stopped working, but that presents no problem for a bot. Not so for the new site: most of the text no longer rendered at all. The area and hotel descriptions were still in the page source but could not be seen in the browser, which to Googlebot probably looks like cloaking, a very bad SEO practice.
  • Because of the move to .NET, the new pages now carry nearly 85 KB of hidden state per page (most likely ASP.NET’s base64-encoded __VIEWSTATE field). This not only slows page rendering, it also dilutes the ratio of spider food Googlebot finds on the page; a rough way to measure that ratio is sketched below.
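To put numbers on that dilution, the following sketch (again assuming Python with requests and BeautifulSoup, and a placeholder URL) compares total page weight, the size of any __VIEWSTATE field, and how much visible text is actually left over for the spider.

    # Rough sketch of the "spider food" ratio: how much of the download is readable
    # text versus markup, scripts, and hidden state such as ASP.NET's __VIEWSTATE.
    # The URL is a placeholder, not the site from the case study.
    import requests
    from bs4 import BeautifulSoup

    URL = "https://www.example-travel-site.com/rome-hotels"

    html = requests.get(URL, timeout=10).text
    page_kb = len(html.encode("utf-8")) / 1024

    soup = BeautifulSoup(html, "html.parser")

    # Hidden ASP.NET state, if the page carries any
    viewstate = soup.find("input", attrs={"name": "__VIEWSTATE"})
    viewstate_kb = len(viewstate.get("value", "")) / 1024 if viewstate else 0

    # Visible text left after stripping scripts and styles
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    text_kb = len(soup.get_text(" ", strip=True).encode("utf-8")) / 1024

    print(f"total page:   {page_kb:.0f} KB")
    print(f"__VIEWSTATE:  {viewstate_kb:.0f} KB")
    print(f"visible text: {text_kb:.0f} KB ({100 * text_kb / page_kb:.0f}% of the download)")

The smaller that last percentage, the less of your download is actual content a search engine can index.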

The latest and greatest in web technology isn’t always the best choice, especially if you rely on search engines for your traffic. At the end of the day, the job of any search engine is to provide relevant, useful results to searchers. Anything you do to your website that makes it “harder” for the bots to use will adversely affect your SEO and PPC efforts.