Posts about search engine optimization

What Makes a Quality Site According to Matt Cutts

The head of Google’s Search Spam team, Matt Cutts, was recently interviewed by Eric Enge on the subject of “What Makes a Quality Site.”  It’s a good, quick read that should get you thinking about how you can improve your site and thus generate more traffic.  Below is what I took away from it.

We know that everybody with a website wants to rank higher in Google to generate additional traffic to their site.  And they want to accomplish the higher rankings as soon as possible and with the least amount of effort.  While in the past it has been quite easy to accomplish this by taking advantage of certain loopholes and ranking indicators, Google is working to make it more difficult.  And this upsets a lot of webmasters.

“Why should I have to work harder for something that was so easy in the past?”

The overarching reason is that Google’s search results serve up a lot of spammy, repetitive websites built to manipulate the results.  Pushing sites to strive for quality is a way ‘to improve the quality of search results’.

What can we take away from this interview to help improve our sites?

  1. Rather than regurgitating the same content that already exists, work to bring additional value.
  2. Focus on what differentiates you.  Describe what makes you special.  This can be difficult for e-commerce sites that sell the same products as many other sites – get creative.  See number 1 above.
  3. Focus on distinguishing yourself in a niche area before going head-to-head with the big players.  If you follow the above steps, your unique content has a good chance at cracking the first page because Google values diversity in the search results.
  4. SEO isn’t really about ‘link building’.  Yes, acquiring links to your site helps achieve rankings but focus your efforts at a higher level.  Approach this at the PR/marketing level and you’ll avoid wasting time seeking out useless links.
  5. Infographics and other types of link bait might soon be devalued by Google.  This is yet another effort that has been used to garner a large quantity of links and thus used to manipulate search results.  While some/many infographics are useful and of high quality, there are, according to Google, too many that are unresearched and of low quality.
  6. Sites that have many entry pages, such as restaurants with locations in many cities, have typically created pages for each city with virtually the same redundant copy.  Cutts advises cutting the redundancy and replacing it with more unique content – even if it’s only 2 or 3 sentences.  Again, see number 1 above.

Creating content that will rank well will likely be less programmatic than it has been in the past.  While unique content won’t come easily for some sites, especially those with thousands of products, the sites that spend the time to be unique will likely benefit the most.

Google Penguin Update – Google+ On-Air Hangout

The Leverage Marketing SEO Team recently sat down for a round-table discussion on the topic of Google’s latest algorithm update, called Google Penguin.  Hear the team’s take on subjects such as what Google Penguin is, what constitutes over-optimization of anchor text, and what counts as spammy links, and learn about actions you can take to help your site regain rankings if you were negatively affected on or around April 24th.

The video above is the first of several on the subject.  All videos are listed within a playlist for easy viewing, and more will be added to cover further topics and questions.

View a list of all videos on the subject of Google Penguin.

Hangout with the Leverage Marketing SEO Team!

Join us on June 12th at 1:00pm CDT for an On-Air Hangout on the subject of Google Penguin.  Our SEO team will be on hand to discuss the recent algorithm updates known as Penguin, what you should know about the updates, what you should do if the updates negatively affected your rankings, and what you can do to ensure that your site won’t be affected in the future.

Feel free to submit your questions beforehand, and you may be invited to be a participant in the live hangout to ask your questions!

For anybody who can’t attend the hangout while it’s live, watch this space or the Leverage Marketing Google+ page – we’ll be posting an archived version of the hangout shortly after completion.

Our SEO clients fared well following Google Penguin

In the wake of the Google Penguin outcry, we have found, from our monitoring tools, that our clients have largely been either unaffected or have actually seen improvements as a result of Penguin. Although improvements are difficult to specifically attribute to the recent update, for some client sites we have seen notable improvements in impressions since April 24th and some improvements in rankings. The client site below saw a recent 27% increase in impressions.

GWT reports impressions and clicks for search queries

I attribute the stability and improvement in our clients’ impressions and traffic to our adherence to White Hat SEO practices and our dedication to staying ahead of the curve. After Penguin, it appears that sites that pursued rapid link building techniques, link schemes, and the other tactics that Thy Ta described in her post here, as well as other spammy tactics, have suffered declines. Yet, in some cases, sites that have not pursued these tactics – or have not pursued them in many years – have been affected.

In the face of constant changes in Google’s algorithm, today’s White Hat techniques could become tomorrow’s spammy techniques and could earn a penalty in the future. To reduce your vulnerability to these algorithmic changes, come back in two weeks for the next edition of the Leverage Lowdown, with specific tactics that will help decrease your site’s susceptibility to future updates. These tactics are not only advocated by me but also vetted by the Senior Product Manager of Bing’s Webmaster Program and the go-to guy for Bing search, Duane Forrester.

In addition, the SEO team at Leverage Marketing Agency invites you to join us on a Live On-Air Hangout on Google+ on June 12th at 1:00pm CDT.  We’ll be available to field questions and discuss Google Penguin and any other SEO questions you may have.  We’ll provide a link to view the hangout from our blog, but if you’d like to be an active participant, please add us to your Google+ circles and drop us a line to let us know you’d like to chat with us live.

Part I – Google Penguin: The Effects, The Uproar and How to Adjust

Google’s recent search algorithm update, codenamed Penguin, was stated to affect only 3.1% of search queries. However, the queries that were affected by the Penguin update were very important for many small businesses across the country. Many companies rely upon Google’s organic search traffic to gain customers and to sustain their businesses. When the Penguin update rolled out on April 24th, these companies were negatively impacted and experienced dropped rankings and a large drop in traffic. In response to this rapid change, over 1000 small business owners and site webmasters have rallied together to form a petition for Google to reverse the Penguin update. I am writing a 3-part series starting with this post on how to tell if your site was hit by Penguin and will continue in future posts on how our clients were affected at Leverage Marketing and how to recover from and protect against algorithm changes in the future.

2 ways to check if your site was affected by Google Penguin

Google Penguin was rolled out very quickly across the majority of Google’s worldwide iterations on or around April 24. A question that many site owners and webmasters are asking is “Was my site affected?”

There are a few ways to tell, to a high degree of certainty, whether your site was affected. The best way is to go straight to your impression data, which reveals how many times your site appeared in Google’s search results. Organic search impression data is available in Google Webmaster Tools. If you see that on or around April 24th your impressions declined significantly (similar to the change pictured below), your site was most likely affected by the Penguin update. This rapid decline indicates that Google is serving your website in less favorable positions for numerous search term results, giving your site a smaller chance to attract traffic from searches.

Checking Google Webmaster Tools for effects of Google Penguin

http://www.seoteky.com/google-penguin-investigation-know-if-you-were-hit-by-this-mega-update/

If you do not have access to Google Webmaster Tools, the second way to check whether you were affected by the Penguin update is in your Google Analytics data (or whichever traffic tracking tool you use): see if your Google non-paid search traffic follows a similar sharp downward trend. In some cases, the decline in impressions and visits is alarming. Some site owners have claimed traffic drops of up to 90% and widespread drops in rankings.
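As a rough illustration of that before/after check, here is a minimal Python sketch (the dates and impression counts below are made up for the example) that compares average daily impressions in the two weeks on either side of a cutoff date, such as April 24th:

```python
from datetime import date, timedelta

def impression_drop(daily_impressions, cutoff, window=14):
    """Compare average daily impressions in the `window` days before
    and after `cutoff`. Returns the fractional change (negative = drop)."""
    before = [v for d, v in daily_impressions.items()
              if cutoff - timedelta(days=window) <= d < cutoff]
    after = [v for d, v in daily_impressions.items()
             if cutoff <= d < cutoff + timedelta(days=window)]
    if not before or not after:
        raise ValueError("not enough data on one side of the cutoff")
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_after - avg_before) / avg_before

# Toy data: a steady ~1000 impressions/day that halves after April 24, 2012.
data = {date(2012, 4, 10) + timedelta(days=i): (1000 if i < 14 else 500)
        for i in range(28)}
change = impression_drop(data, date(2012, 4, 24))
print(f"{change:.0%}")  # -50%
```

A sustained change anywhere near this magnitude, aligned with a known update date, is a strong signal; a day or two of noise is not.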

In upcoming posts, I will be discussing how our clients (at Leverage Marketing) were affected and how to recoup if you were affected by the Penguin update.

Remember, Matt Cutts and the team at Google designed the Penguin update to give sites that practice honest, white hat SEO techniques a fair shot for competitive rankings. More importantly, the Penguin update aims at reducing the amount of spam to improve the user experience.

On June 12, the SEO experts at Leverage Marketing Agency (including yours truly) will be hosting a Google+ hang-out on air to discuss the Penguin update and give business owners and webmasters (and whoever else is interested) a chance to ask questions that we will immediately address. Be sure to follow us on Facebook, Twitter, and Google+ for the link to our hang-out.

Bing Becoming More of a Search Contender with New Keyword Tool

At SMX West last week, Bing announced a whole set of new features now available in Bing Webmaster Tools, including one that I’m admittedly excited about: the Bing Keyword Research Tool.

While its functionality will differ slightly from Google’s Keyword Tool, Bing’s will focus heavily on organic search volume, with six months’ worth of historical data rather than one month’s. This is the first web-interface tool of its kind from Microsoft, and it should provide a reliable alternative to Google, which we’ve probably come to rely upon too heavily.

This keyword research resource is still in beta, but so far, here are some of the features that are outshining Google’s comparable tool:

  • As previously mentioned, six months of data, or a custom date range filter!
  • Search query volumes based on organic queries.
  • Both a language and a country filter unlike Google’s ‘Local’ versus ‘Global’ data segments.
  • Raw search volume numbers without rounding or averaging.

One less intuitive aspect of the tool is the ability to perform research in ‘Strict’ mode. This option shows search volumes for what we have come to know as “exact match”; when the Strict mode box is left unchecked, you receive data for phrase-match terms.

Another great feature is the ability to roll over the results to see advertising data. It reveals how much the average bid and resulting CPC must be in order to advertise in the MainLine (above organic results) and the SideBar (to the right of organic results). This is definitely more advertising insight than Google is currently offering.

Undoubtedly, Bing still has some glitches to work out – for instance, you can only search one key phrase at a time, and you can’t search multiple countries simultaneously, both of which substantially slow a researcher’s pace. Bing also offers a rather arbitrary graph showing query trends for the selected period. Still, it’s refreshing to have another data tool that’s independent of Google and offers a unique perspective on the data.

Why Do My Competitors’ Sites Outrank Mine?

You feel like you have read and done everything SEO professionals have told you to do. Your site has great, high-quality, relevant content; you have interesting white papers, infographics, and other content that people want to share; you launched a strong link building campaign; and you continue to build relationships with other websites and bloggers.

So then, “Why/how is it that my competitors are outranking my site for my keywords?”

Guess what, you’re not the only one with that question. In fact, many are starting to ask an even more specific question:  “How is it possible for competitors who practice black-hat SEO techniques to outrank my website?”

I’ve worked on various campaigns and can attest to the fact that I deal with this on a daily basis. And I understand it’s frustrating. Through my research, I discovered that the competitors who outrank my clients all have “paid links,” and not just one or two – we’re talking thousands of paid links.

As my SEO director wrote in a previous blog post, Danny Sullivan, Matt Cutts, and Duane Forrester have all agreed that “buying links” is highly discouraged and if practiced, puts a site at major risk for consequences. This isn’t news if you’re familiar with the search world. However, after much analysis of many competitors’ links, I find that almost every single competitor has tons of “paid links” and their sites still have the top rankings.

How do I know it’s a “paid link” and not a legitimate editorial link?

Through much time, effort, and the help of multiple tools, I spot-checked more than 100 links individually for each competitor I analyzed. Typically, SEO firms that practice this tactic either own or work with a multitude of blog and site owners to post links to their clients’ domains with carefully selected and oft-repeated anchor text. A lot of these SEO firms even include a link back to their own site as well. If that doesn’t raise a red flag, I’m not sure what will.

As an example of the types of links I see, let’s say your client is in the fashion industry, and while looking up a competitor’s backlinks you find a link to their site on a blog. Well, as it turns out, that blog is all about paleontology, the study of fossils and ancient life. Yet you find that the content, blogroll (or side navigation), or footer includes a link back to your competitor’s fashion website. It doesn’t seem too natural for a blog all about the study of fossils to include a link back to a fashion site, does it?

So, let’s ask the question again: how are those sites outranking my site, especially when paid links are considered a black hat technique and are frowned upon by the two major search engines – and thus by just about every SEO guru?

While I firmly believe it is possible to compete with those types of sites, and I am against “paid links,” I can’t help but notice that those sites have maintained top rankings.  Paid links are still helping sites achieve top rankings.

That said, while “paid links” are leading these sites to top rankings now, I think these top rankings will be short-lived.

Recently, my team members wrote a great deal about upcoming changes for all of us in the search world.

If you missed it, the takeaway from what Matt Cutts said during a panel at SXSW 2012 is:

“We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect.”

And on Google’s official blog:

“We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.”

Is it possible that those sites with paid links will have trouble in the future? I certainly think so, and that gives all of us the chance to compete fairly for those rankings. So if you’re thinking about going rogue and buying links, I would urge you to reconsider.

If you have any questions about the tools I used to analyze backlinks, or any other questions, drop me a line in our comment section or visit our Facebook and Twitter pages.

Is Your Site Ready? Google Is About to Drop the Hammer on Over Optimized Sites

In our last Leverage Lowdown, our Search Director Matthew Hooks, who attended this year’s South by Southwest Interactive (SXSWi) here in Austin, recapped the most interesting SEO panel of the event.  The panel was all about ranking better in 2012 and was a chat with three of the biggest names in search: Danny Sullivan, Google’s Matt Cutts, and Bing’s Duane Forrester.  Matt Cutts said something in that panel that captured the attention of many in the search world.  His exact quote was:

“What about the people optimizing really hard and doing a lot of SEO. We don’t normally pre-announce changes but there is something we are working in the last few months and hope to release it in the next months or few weeks. We are trying to level the playing field a bit. All those people doing, for lack of a better word, over optimization or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect. We have several engineers on my team working on this right now.”  [emphasis supplied]

Audio at SXSWi (it’s about 1/3rd of the way in).

Matt Cutts’ remarks about Google’s ranking algorithm become all the more intriguing when you consider another recent comment from Cutts & co.  On their official blog at the very end of February, Google wrote:

“We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.”  [emphasis supplied]

Hm, what’s that rumbling sound…

Is That A Bear?!!

Around this time last year, the so-called Panda update really shook up the search world.  Many thin-content and affiliate sites saw their traffic drop tremendously, as Google demonstrated its seriousness about unique, quality content.  What we are seeing now are indications that Google will continue down that path and will be isolating other methods that people use to game the system.  Sites that can’t stand on their own two feet without their gaming techniques will most likely fall.  Is it too early to pick out the baby’s name and call this the Grizzly update?

What are the targets of this potentially major update?  Both of Google’s clues point to at least one technique: the over-optimization of link anchor text.  In other words, if Google looks at your site and the links pointing to it, and 90% of them say KEYWORD1, Google will rightly ask how you got those links and why they are all so similar.  What is the probability that 90% of the people who link to you would all use the same single phrase?  Even “KEYWORD1,” “KEYWORD2,” and “KEYWORD3” as your anchor text looks more natural than that.  A link profile with identical anchor text over and over is like listening to someone repeat the same few words again and again – it’s annoying, and it may incur Google’s wrath in 2012.  Google estimates that 16% of the searches it sees every day are brand new, so a link profile with nearly uniform anchor text is not going to look natural.
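To make the 90% scenario concrete, here is a small, illustrative Python sketch (the anchor strings are invented for the example) that measures how concentrated a link profile’s anchor text is – the kind of quick check you can run on an exported backlink list:

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return the most common anchor text (case-insensitive) and the
    share of all inbound links that use it."""
    counts = Counter(a.strip().lower() for a in anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / len(anchors)

# Hypothetical backlink export: 90 identical keyword anchors, 10 natural ones.
links = (["cheap blue widgets"] * 90
         + ["Acme Widgets", "great sale", "click here", "my friend's business",
            "awesome site", "acmewidgets.com", "widget store", "this article",
            "source", "blog post"])
anchor, share = anchor_concentration(links)
print(anchor, f"{share:.0%}")  # cheap blue widgets 90%
```

There is no official threshold, but a profile where one exact phrase dominates like this is exactly the unnatural pattern described above.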

How to Be A Happy Camper in the SERPs

So, you may be wondering, how do I go about linkbuilding and still have it look natural?  How can I be a happy camper in the SERPs (Search Engine Results Pages)?

Respect Your Environment

First of all, do remember that there is nothing unnatural about promoting your site – if you have a website with great content, of course you want people to look at it!  Google does not have a problem with linkbuilding per se.  What Google wants is for the links to your site to reflect the editorial intent of the site linking to you.  If someone thinks your site is awesome, they’ll be more likely to link to you with “awesome site” than “KEYWORD 1.”  Or with anchor text like, “My friend’s business,” “eco friendly company,” “great sale,” or even, “click here.”  Don’t try to mow down natural linking patterns with manipulative anchor text requests.  Pursue anchor text, but don’t raze the forest.

Clean Up Your Trash

If your link profile DOES contain too much of the same anchor text, now is the time to vary your linking patterns.  Link to different pages than you usually do, and link with branded terms or variations of your keyword, or even, “click here.”  And if you have a really egregious batch of bad links, clean them up!  Perhaps the #1 cause of bear attacks is leaving trash and food out – clean that stuff up and don’t feed the Google Grizzly!

Don’t Start Forest Fires

Is there a little patch of bad SEO going unattended on your site?  Maybe some duplicate content, or a page that’s just keyword-stuffed?  Put it out!  Too many pages like that could lead to Google devaluing your site as a whole.  This advice works for cleaning up after Panda, and it’s also just good preparation for any Google update – stamp out bad SEO on your site wherever it may be.

Don’t Forget The Marshmallows!

The final thing to remember for being a happy camper in the SERPs is not just to avoid producing bad or mediocre content, but to actively produce good content!  Content people like to talk about; content that makes people want to gather around and share your site naturally.  Being a happy camper in the SERPs is not about roughing it in the extreme wilderness, although it may feel like it sometimes – the most important thing is to make your site a natural, fun, social site that people want to visit.  Follow our tips to avoid the dangers and have a better site than before.  Happy camping!

2012 SEM Trends & How They Affect Your Business

At the beginning of the year, several bloggers and SEOs made predictions for the top SEO, PPC, and Social Media Marketing trends we could expect for the New Year, and while several of them have been eerily prescient in their forecasts, some newcomers are on the rise! Our SEO experts weigh in on some of the most effective and popular new SEO trends as witnessed in the first quarter of 2012:

1.  Community Is The New Content

For years, SEOs and marketing specialists have been hounding us about the importance of original, relevant content, and while this is still crucial to a site’s search health, it’s no longer sufficient to simply place content on your site without optimizing it socially. We are seeing that it is becoming ever more important to engage in a two-way dialogue with your customer base – whether that’s providing them the option to comment on your blog and responding, or sharing your content in the social sphere, or developing communication on Facebook, Twitter, etc. Engaging with your community will build stronger brand loyalty, allowing you to quickly resolve any potential issues and keep your company in tune with what your customers want and need.

2.  Location, Location, Location!

We’ve been seeing geolocation begin to creep into the social sphere – Google’s integration of maps on the search page, Twitter’s optional location stamp on tweets, etc. More localization will not only boost visibility for smaller, local companies in the online sphere but it really tailors results to the searcher’s needs.

We’re only going to continue to see more location-based search integration with the rise of mobile search so making that information readily available on your site is crucial, as Google uses this information to identify your location and return your business for relevant searches.

3.  Rich Media

We’re thrilled to see the development of rich media ads and snippets in the search network as it opens up myriad new options for our clients and great opportunities for search. Earlier, we blogged about rich media in Yahoo! and Bing (MSN) search, which include the incorporation of video, images, maps, forms, and much more.

Many of the search engines have also expanded their rich snippet inclusions for search engine results pages; Google, for example, has begun to include reviews, music, links to other pages on a site, and much more.  This detailed blurb provides more information for the searcher which may help determine whether or not they opt to click through to your site! Between rich media ads and snippets, there are many new venues through which we can provide information and increase visibility for our clients.

4.  Page Load Speed

It’s become evident in the past few months that Google’s recent algorithm updates have come to include page load speed as a ranking factor. Recent data shows that delays as short as a half-second can impact business metrics, and although this shouldn’t scare business owners into sacrificing content or revamping their entire site, it is important to consider what elements of a page may lead to a slower load time and if any of these elements could be edited or removed.

Some examples of elements that lead to slower load times include Flash, huge images, numerous plug-ins, etc. It’s also important to note that your site may load differently in different browsers (Firefox versus Chrome versus Internet Explorer), so testing is an important component. Why is Google now factoring in page load time? Because slower pages result in less user engagement – those searching online, especially on mobile, want information immediately, if not sooner. Data shows that 47% of consumers expect a web page to load in 2 seconds or less, 40% of people abandon a website that takes more than 3 seconds to load, and even a 1-second delay in page response can result in a 7% reduction in conversions.
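As a back-of-the-envelope way to triage your own pages against those figures, here is a tiny Python sketch. The labels and the 2-second / 3-second cutoffs are simply our reading of the survey numbers quoted above, not a Google metric:

```python
def load_time_verdict(seconds):
    """Rough triage based on the consumer-expectation figures quoted
    above: most users expect <= 2 s, and abandonment climbs past 3 s."""
    if seconds <= 2.0:
        return "meets expectations"
    if seconds <= 3.0:
        return "at risk"
    return "likely losing visitors"

for t in (1.2, 2.5, 4.0):
    print(t, "->", load_time_verdict(t))
```

Measure real load times with a tool like Google’s Page Speed reports or your browser’s network panel, then use a cutoff like this to decide which pages to fix first.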

A slow page may just be the thing that makes visitors hit the back button and go to another site.

Clearly we’re seeing a continued move towards more social integration and the merging of multiple marketing initiatives. As mentioned earlier, the combination of tactics is ultimately the most effective way to see successes in online visibility – a well-rounded marketing campaign involving optimized ads, rich media ads, social media presence and engagement, clean site structure and HTML, geolocation optimization, a mobile presence, etc. will be the ultimate way to improve online presence and search traffic.


Sources:

http://blog.kissmetrics.com/loading-time/
http://www.seo-creare.co.uk/seo-blog/seo/geolocation-social-media.html
http://www.seomoz.org/blog/8-predictions-for-seo-in-2012
http://www.seoconsult.com/seoblog/social-media-optimisation/4-predictions-for-social-media-in-2011.html
http://blog.cunet.com/2012/01/seo-in-2012-the-top-11-things-you-need-to-know/

SXSW SEO Chat Recap

While there are plenty of panels at least broadly related to the world of SEM and Social Media at the Interactive portion of the 2012 SXSW Festival, there was only one panel that I marked as a ‘must see’ in the SEO space.  That panel was Saturday’s “Dear Google & Bing: Help Me Rank Better!” with Danny Sullivan – editor of Search Engine Land, Matt Cutts – head of Google’s Webspam team, and Duane Forrester – Senior Product Manager at Bing.  If you weren’t able to attend, below is a quick recap of the bits I found especially interesting or useful.

How do small businesses compete in the search results with large competitors spending thousands of dollars and spending huge numbers of man-hours on SEO?

Cutts said that Google wants to level the playing field somewhat and has addressed, and will continue to address, this with upcoming updates to its algorithms.  Duane responded that social signals are quite useful and important indicators of a site’s value – it’s hard to argue with a site’s or company’s value when other people are vouching for it.  My take:  In addition to agreeing that social cues are very important, I would add that the user experience on the pages of the site plays an important role as well.  Especially when working with new clients that are not yet well established in the search engines, we often see that after our initial optimizations, a client’s rankings jump from nowhere-to-be-found to the first or second page of results rather quickly – even for highly competitive terms.  The staying power of those rankings, however, appears tied to how well searchers interact with the newly ranking site.  When the newly ranking page is highly relevant and user friendly, it tends to remain well ranked, but if the page is not user friendly or does not provide something that other ranking sites offer, its ranking falls significantly.  The search engines appear to test a site’s validity for ranking well before committing to any sort of long-term high rankings.

Can buying links really help a site get ahead in the search results?

All three panelists agreed that buying links is a huge no-no that could have drastic consequences.  My take:  This, of course, isn’t news.  It has been the ‘word on the street’ for years and has caused some SEOs to change their tactics to stay in compliance.  Not so long ago, the SEO company hired by JC Penney was very publicly outed for having bought links on a large scale, and there were consequences for JCP.  However, I can personally attest that paid links have worked in the past and that the sites that benefitted from them then are still benefitting now.  We see that many of the top-ranking sites for terms our clients are pursuing acquire paid links on a continual basis; they keep purchasing links, and they continue to rank very well.  While we can compete without buying links, sites that purchase links benefit more than Google and Bing like to admit.  Sure, in the long run it’s probably not a good idea to throw money at links (which is a big part of why we avoid the practice), but it’s hard to prove that buying links isn’t at least somewhat beneficial in the short term.

Follow your SEO’s advice and redirect error pages to valid pages of your site

All three panelists again agreed, and were emphatic, that error pages should not be left unattended and that 301 redirects should be put in place.  My take:   I couldn’t agree more, and I’d love to have all three panelists’ responses on video to send to our clients who aren’t willing to push (generally simple) redirects through.  I think we on the SEO team at LMA are good about pushing for permanent redirects when necessary, but if a client isn’t willing or able to implement redirects, we need to push harder for the second-best solution, which Matt Cutts pointed out: using the canonical tag in a page’s head.  Placing a tag indicating which version of a page the search engines should pay attention to is a recommended practice, especially for sites that cannot, for whatever reason, implement 301 redirects.
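As a minimal sketch of the redirect logic (the URL paths below are hypothetical; in practice you would configure this in your web server or CMS rather than in application code), in Python:

```python
# Hypothetical old-to-new URL map; the paths are illustrative only.
REDIRECTS = {
    "/old-products.html": "/products/",
    "/blog/2011/penguin": "/blog/google-penguin/",
}

def respond(path):
    """Return (status, location) for a request path: a 301 (permanent
    redirect) to the new URL when the old path is in the map, otherwise
    a 404 – rather than leaving a dead error page unattended."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

print(respond("/old-products.html"))  # (301, '/products/')

# Fallback when you can't implement 301s: a canonical link element in
# the duplicate page's <head>, pointing at the preferred version, e.g.:
CANONICAL = '<link rel="canonical" href="https://www.example.com/products/">'
```

The 301 tells search engines the move is permanent so ranking signals follow the new URL; the canonical tag is the weaker hint to use when server-side redirects aren’t an option.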

This was a great panel with three good-humored panelists who provided practical advice for common questions in the SEO space.  While there wasn’t anything particularly novel about the content covered, there were some great takeaways.  We’d love to hear from you if you attended or if you have any particular questions about SEO!
