Posts

The Top 5 Reasons to Use Firefox

Ok, so this is a blatant attempt to jump on the “top X reasons” bandwagon, but something occurred to me today as I was compiling some information for my colleagues – most people don’t use Firefox. They just don’t know why it’s useful. So let’s remedy that.

1. It’s Not a Microsoft Product
I’m not a Microsoft basher. I remember when I had to install the printer over and over and over again for every single program I ever put on my computer, and I hated it. The consolidated operating system is the best thing to happen to personal computers. But let’s face it – because Microsoft is the big dog, it’s the target. The only reason Macs are considered “more secure” is that so few (and the wrong type of) people use them that it’s not worth a rogue programmer’s time to write a Mac virus. The same is sort of true for Firefox, but in a different way. It’s not universally adopted by big corporations, so why pick on it? Plus, it’s open source – this means ANYONE can fix it if it breaks. There are literally thousands of people out there who have contributed to creating and upgrading Firefox – when a vulnerability shows up, dozens of fixes are submitted within 24 hours. You can’t get that sort of fervent attention from a huge, red-tape-constricted company like Microsoft.

2. It’s Open Source
There are lots of reasons to use open source products, but the main one is: it’s FREE, hello.

3. It’s Innovative
You know that nifty thing you can do with the latest IE where you have TABS in the window for all your open browsing sessions? Guess what – Firefox had that first.

4. It’s NOT Part Of The Operating System
You can install, remove, upgrade or otherwise fool around with Firefox and you don’t have to worry about accidentally changing anything else on your Windows box. The program is not integrated into any other functions on your computer unless you come up with a way to do that yourself.

5. The Library of FREE Extensions and Add-ons
This is why I started this article – about once every three months I send out a company-wide email containing a list of the most useful Firefox add-ons that I’ve run across. Some of these little gadgets are just gadgets, like IP address readers, or weather updates in the status bar. Some of them, however, are invaluable in my day-to-day work with search engine marketing.

For the curious – here are the links to my faves:

Oh, I forgot the part where you just use Firefox to go find these add-ons (at www.firefox.com) and generally just click the big green “Add to Firefox” button, wait three seconds for the “install” button to become live, click that, and restart the browser – no traditional downloading required.

There ya go, reason number 6. Enjoy!

Why Does Your Coding Platform Matter?

Because our team consists of a mixture of marketers, analysts and programmers, we debate almost daily the merits and drawbacks of one type of web platform over another, and whether or not really good PPC or off-page SEO can make up for poor development and design. Then we ran across this case study:

… I have been recently working with a search marketer in the travel industry who has been trying to expand the focus of their site from one particular destination to a global travel portal. The regional site does well, generating between 60 and 100 clicks per day at an average CPC of only $.12 and a margin of about 30%.

This particular marketer realized that if he could duplicate the same success on a global scale, he stood to generate huge revenues (and profits). His company created a brand-new design focused on a global market. Realizing that this could quickly become an enterprise app, they chose to rearchitect the site using .NET technologies and AJAX.

The new site is amazingly informative and far more usable than the old PHP site they were running on. Test users agreed that it was a big improvement over the old site.

However, Google didn’t.

We’d be really kidding ourselves if we said “so what” at this point… the fact with regard to any search engine marketing is that if Google ain’t happy, ain’t nobody happy. Time and again, we’re presented with web designs or redesigns that developers claim are “designed for maximum conversions,” yet all the text that pertains to their search terms is totally invisible to a spider unless you actively click on a page element – and spiders do not click anything. At least in this case, they took advantage of the redesign opportunity to test their product.
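To make that concrete, here’s a quick, hypothetical sketch of the pattern we keep running into (the element IDs and copy are made up for illustration, not taken from the case study): the keyword-rich description only enters the page when a visitor clicks, so it never exists in the HTML a spider downloads.

  // Hypothetical example: element IDs and copy are invented for illustration.
  // The hotel description only reaches the page after a click, so it never
  // appears in the HTML a spider downloads (spiders don't click anything).
  document.getElementById('details-tab').addEventListener('click', function () {
    document.getElementById('hotel-description').innerHTML =
      'Beachfront resort with spa, golf and family suites...';
  });

  // The spider-friendly version ships the copy in the initial markup, e.g.
  //   <div id="hotel-description">Beachfront resort with spa, golf...</div>
  // and uses the click handler only to reveal it:
  //   document.getElementById('hotel-description').style.display = 'block';

Either way the visitor sees the same tabbed interface; the difference is whether the text is sitting in the source for a bot to read.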

The company ran three test campaigns, including an exact duplicate of their previous PPC campaign. The average CPC rose to $.30 and traffic declined to fewer than 10 clicks per day (across three cities, not just one). The new site is a complete bust. Even though it provides a great user experience, it doesn’t comply with Google’s ideas of what makes a great site, and so it doesn’t get the traffic.

The gory details are even better… what many people seem to totally forget is that you have one shot with a bot. Either a spider can crawl your site or it can’t – it can’t enable JavaScript and try again. It can’t crank down the security and try again. It won’t wait patiently for all your code to load up – if your page takes too long, the bot moves on and you’ve missed your opportunity. Here’s a look at the initial damages that the new design racked up simply by trying to migrate to a more technically advanced platform (a quick way to spot-check your own pages follows the list):

  • The average page size increased from 47k to 375k, an eight-fold increase.
  • There is now a massive reliance on JavaScript to render the page correctly. Googlebots do not process JavaScript, so to test the effect of this on page rendering, we turned off JavaScript and rendered both the old and new sites. The old site rendered correctly, though some of the advanced search functionality no longer worked – which would present no problem for a Googlebot. Not so for the new site … it no longer rendered most of the text. The area and hotel descriptions were still in the page source, but could not be viewed in the browser. To a Googlebot, this probably looks like cloaking, a very bad SEO practice.
  • Because of the use of .NET, the new pages now contain nearly 85k of hidden binary code. This not only slows the page rendering down, but also dilutes the ratio of spider food the Googlebots are finding.
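For what it’s worth, the first two checks are easy to run against your own pages. Here’s a minimal sketch (assuming Node 18+ for the built-in fetch; the URL and key phrase are placeholders, not the case study’s actual site) that grabs the raw HTML the way a bot would – no JavaScript executed – and reports the page weight plus whether your key copy actually appears in the static markup:

  // Minimal spot check, assuming Node 18+ (built-in fetch).
  // Save as check.mjs and run with: node check.mjs
  // The URL and phrase are placeholders; substitute your own page and copy.
  const url = 'https://www.example.com/hotels/overview';
  const phrase = 'beachfront resort';

  const res = await fetch(url);
  const html = await res.text();

  // 1. How heavy is the page before any script runs? (Compare with the
  //    47k vs 375k figures above.)
  console.log(`Raw HTML size: ${(Buffer.byteLength(html) / 1024).toFixed(1)}k`);

  // 2. Strip <script> blocks so inline code doesn't count as visible text,
  //    then check whether the key phrase survives in what a spider can read.
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, '');
  const found = withoutScripts.toLowerCase().includes(phrase.toLowerCase());
  console.log(found
    ? `"${phrase}" is present in the static markup`
    : `"${phrase}" is NOT in the static markup; a bot won't see it`);

It’s crude, but it catches exactly the failure mode described above: copy that is technically “on” the page yet never present in the HTML a spider downloads.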

The latest and greatest in web technology isn’t always the best choice, especially if you rely on search engines for your traffic. At the end of the day, the job of any search engine is to provide relevant, useful results to searchers. Anything you do to your web site that makes it “harder” for the bots to use will adversely affect your SEO and PPC efforts.