Because our team is a mix of marketers, analysts, and programmers, we debate almost daily the merits and drawbacks of one web platform over another, and whether really good PPC or off-page SEO can make up for poor development and design. Then we ran across this case study:
… I have recently been working with a search marketer in the travel industry who has been trying to expand the focus of his site from one particular destination to a global travel portal. The regional site does well, generating between 60 and 100 clicks per day at an average CPC of only $.12 and a margin of about 30%.
This particular marketer realized that if he could duplicate the same success on a global scale, he stood to generate huge revenues (and profits). His company created a brand-new design focused on a global market. Anticipating that the site could quickly grow into an enterprise app, they chose to rearchitect it using .NET technologies and AJAX.
The new site is amazingly informative and far more usable than the old PHP site they were running on. Test users agreed that it was a big improvement over the old site.
However, Google didn’t.
We’d really be kidding ourselves if we said “so what” at this point… the fact in any search engine marketing is that if Google ain’t happy, ain’t nobody happy. Time and again, we’re presented with web designs or redesigns that developers claim are “designed for maximum conversions,” yet all of the text pertaining to their search terms is invisible to a spider unless someone actively clicks a page element – and spiders do not click anything. At least in this case, the company took advantage of the redesign to test its product.
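To make the “spiders do not click” point concrete, here is a minimal, illustrative Python sketch (the page contents and the `loadHotelInfo` handler are invented for the example). A crude tag-stripper stands in for a spider’s text extraction: copy that only appears after a click simply isn’t in the HTML the crawler fetches.

```python
import re

# What the server returns for an AJAX-driven page: the content div is
# empty until a click fires a script that fills it in (illustrative only).
ajax_page = """
<html><body>
  <a href="#" onclick="loadHotelInfo()">Hotel details</a>
  <div id="hotel-info"></div>
</body></html>
"""

# The equivalent static page: the same copy is in the initial HTML.
static_page = """
<html><body>
  <h2>Hotel details</h2>
  <div id="hotel-info">Beachfront rooms in Cancun from $89/night.</div>
</body></html>
"""

def visible_text(html: str) -> str:
    """Crude tag-stripper standing in for a spider's text extraction."""
    return re.sub(r"<[^>]+>", " ", html)

print("Cancun" in visible_text(ajax_page))    # False -- the spider never sees it
print("Cancun" in visible_text(static_page))  # True
```

A real crawler is more sophisticated than a regex, but the principle holds: text that requires a click (or script execution) to appear is not reliably part of what gets indexed.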
The company ran three test campaigns, including an exact duplicate of its previous PPC campaign. The average CPC rose to $.30, and traffic declined to fewer than 10 clicks per day (across three cities, not just one). The new site is a complete bust: even though it provides a great user experience, it doesn’t fit Google’s idea of what makes a great site, so it doesn’t get the traffic.
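Using only the figures quoted above (revenue per click isn’t given, so this looks at traffic and cost alone), the size of the drop is easy to quantify:

```python
# Figures from the case study; "less than 10 clicks" taken as 10 for a
# conservative (best-case) estimate of the decline.
old_cpc, new_cpc = 0.12, 0.30
old_clicks_low = 60   # low end of the 60-100 clicks/day range
new_clicks = 10

print(round(new_cpc / old_cpc, 2))                        # 2.5x higher cost per click
print(round((1 - new_clicks / old_clicks_low) * 100))     # traffic down at least 83%
```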
- The average page size increased from 47k to 375k, an eight-fold increase.
- Because of the use of .NET, each new page now carries nearly 85k of hidden state (in ASP.NET, this is typically the base64-encoded __VIEWSTATE form field). This not only slows page rendering, but also dilutes the ratio of spider food the Googlebots are finding.
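The “spider food” dilution can be pictured with a rough, made-up metric – the share of a page’s bytes that is indexable text rather than markup or hidden state. This is an illustration only, not anything Google publishes; the toy pages below just hold the visible copy constant while bolting on a 20k opaque ViewState blob:

```python
import re

def spider_food_ratio(html: str) -> float:
    """Rough share of a page's bytes that is indexable text rather than
    markup or hidden state (an illustrative metric, not Google's)."""
    text = re.sub(r"<[^>]+>", "", html)       # drop all tags (and their attributes)
    text = re.sub(r"\s+", " ", text).strip()  # collapse whitespace
    return len(text) / max(len(html), 1)

# Same visible copy on both pages; the second adds opaque hidden state.
lean_page = "<html><body><p>" + "Travel deals to Cancun. " * 100 + "</p></body></html>"
bloated_page = ("<html><body>"
                "<input type='hidden' name='__VIEWSTATE' value='" + "A" * 20000 + "'/>"
                "<p>" + "Travel deals to Cancun. " * 100 + "</p></body></html>")

print(round(spider_food_ratio(lean_page), 2))     # 0.99
print(round(spider_food_ratio(bloated_page), 2))  # 0.11
```

With the copy held constant, the hidden state drops the text share from roughly 99% to about 11% in this toy example – the same kind of dilution an eight-fold page-size increase causes on a real site.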
The latest and greatest in web technology isn’t always the best choice, especially if you rely on search engines for your traffic. At the end of the day, the job of any search engine is to provide relevant, useful results to searchers. Anything you do to your web site that makes it “harder” for the bots to use will adversely affect your SEO and PPC efforts.