It’s your worst nightmare…
You wake up one morning and check your analytics. But something’s wrong…where’s all your traffic?
Whether you like it or not, websites in most niches rely on Google for a large percentage of their traffic.
If you get hit by a penalty, 10%, 20%, or even more of your business can be wiped out overnight. That’s a pretty scary thought.
There are two types of penalties that can hit you: manual penalties and algorithmic penalties.
Algorithms get most of the attention because those types of penalties affect tens of thousands of sites all at once.
However, Google applies over 400,000 manual penalties every month, according to Matt Cutts—that’s a lot.
To be fair, many of the sites that get penalized are legitimately awful sites that consist of nothing but content spam. However, hundreds of site owners who are trying to make the best site they can are penalized every day. It could even be you one day.
If you’ve been fortunate enough to avoid a penalty in the past, you might think reports of penalties are exaggerated. In most cases, they’re not.
While not all penalties will have the same effect on your traffic, some can wipe out 90% or more of it in an instant.
And penalties don’t discriminate either—they affect both small and large sites.
After the Panda 4.0 update (more on that later), eBay’s traffic was hit hard:
But that’s far from the only example of a big site being penalized.
Recently, another large company named Thumbtack was penalized.
Thumbtack, in case you didn’t know, is a company that Google invested $100 million into, and they still got penalized.
That being said, there is a difference between penalties for small and large sites. If you run a very large site, where a penalty will garner a lot of press, you may be able to get prioritized support in fixing it.
Thumbtack was able to get their penalty lifted in less than a week. If you have a lesser-known site, it’ll typically take a few weeks or months (at least) to correct the penalty.
I didn’t tell you all this to make you terrified of getting hit by a penalty. I did it so you recognize that avoiding penalties is ideal for your business.
If you understand the common penalties that Google hands out on a regular basis, you can take simple steps that dramatically reduce your chances of being hit by one.
In this article, I’m going to go over all the main types of penalties you can be hit by:
- Panda
- Penguin
- Mobile-Friendly
- Top Heavy
- Payday
- Pirate
- Unnatural Links
- Spam
- Thin Content
For each of the penalties, I’ll let you know if you have the type of website that is at risk of being hit and what steps you can take to minimize your chances of being penalized in the future.
If you’ve already been hit by one of these penalties, check out my step-by-step guide to fixing any Google penalty.
Panda – This penalty chews up weak content
The Panda algorithm might be the most well-known of Google’s algorithms.
It was one of the first updates that specifically penalized websites. The first Panda algorithm was run in 2011 and decimated the traffic of a lot of low-quality websites.
In the three years following its release, Panda was run about once per month. Now that the algorithm is more established, it only seems to be run a few times per year.
While this might seem like a good thing at first, it’s a double-edged sword. On the one hand, with fewer updates, there are fewer opportunities to get penalized.
However, Panda is an algorithmic penalty. This means that if you get hit, once you fix the underlying issue(s) that caused the penalty, you have to wait for the algorithm to be run again to get your rankings back.
That means you could be waiting several months to get the penalty lifted.
And if you’re unsuccessful fixing the issues, you’ll have to try again and wait for another iteration of the algorithm.
The basics – What is Panda? The amazing thing about Panda is that even though it’s been run many times over the past four years or so, we still don’t have an exact definition of what types of sites it affects (although we have a good idea).
Google’s search team keeps its algorithms as secret as possible. It doesn’t give much help to sites hit by algorithmic penalties, whereas it provides a lot of support for manual penalties.
As of now, we know that:
The purpose of the Panda algorithm update was and is to keep low-quality (“shallow”) content from showing up in search results.
Therefore, if you don’t have low-quality content on your site, you should be safe from the traffic-eating pandas.
Here’s the problem, however: “low-quality” can mean many different things.
Google provided a list of over 20 questions to help alleviate the worries of webmasters, but most of these are open to interpretation:
Two different people could be asked these questions about the same site and come to different conclusions. I don’t think the questions are very helpful.
Over time, the SEO community has come together to analyze websites that were hit by Panda and arrived at the following conclusions about pages that get penalized:
- The content is poorly written (perhaps “spun” using software)
- The content is very short (“shallow” content that is too brief to be valuable)
- The content is mostly duplicate content (copied from another page)
- The content adds no real value
It’s no surprise that content farms, like most web 2.0 sites, were hit the hardest. SEOs heavily used them to create backlinks to content, but those links were placed, for the most part, in terribly written, short articles.
How do Panda penalties work? Google often patents its algorithms, and it did so for Panda. It was granted its Panda patent in 2014. While you’re free to read it, it’s pretty boring, so let me sum it up for you:
Google computes a site-wide modification factor based on the quality of all the content on the site. If that factor falls below a certain threshold, it is applied to the site, lowering the rankings of every page on it.
In plain English, this means that if a site has a certain amount of low quality content on it, the entire site will be penalized.
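To make that concrete, here’s a toy sketch of the idea in Python. This is not Google’s actual formula—the per-page quality scores, the cutoff, the threshold, and the demotion factor are all made up purely to illustrate how a site-wide modifier could work.

```python
# Toy illustration of a site-wide "modification factor" (not Google's real formula).
# Assumes you already have a hypothetical per-page quality score between 0.0 and 1.0.

def panda_style_demotion(page_quality, low_quality_cutoff=0.4,
                         site_threshold=0.7, demotion_factor=0.5):
    """Return a multiplier applied to every page's ranking score.

    page_quality: dict mapping URL -> hypothetical quality score (0.0 - 1.0).
    If too large a share of pages falls below the cutoff, the whole site
    gets demoted; otherwise rankings are left alone.
    """
    if not page_quality:
        return 1.0
    good_pages = sum(1 for score in page_quality.values() if score >= low_quality_cutoff)
    site_quality_ratio = good_pages / len(page_quality)
    # Below the threshold, every page on the site is demoted.
    return demotion_factor if site_quality_ratio < site_threshold else 1.0


pages = {"/guide": 0.9, "/about": 0.8, "/tag/page-1": 0.2, "/tag/page-2": 0.1}
print(panda_style_demotion(pages))  # 0.5 -> the whole site takes the hit
```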
That’s why, when it comes to reports of Panda penalties, you usually see graphs like this one:
Panda penalties are rarely small—they decimate organic search traffic.
How do you know if you were hit by Panda? You don’t get any messages about algorithmic penalties. The only way to spot them is by observation.
If you get hit by a penalty that wipes out most of your traffic, chances are you’re not alone. Monitor SEO news sites such as Search Engine Land to get more information. If it’s a Panda update, it’ll likely get spotted quickly.
If you suspect you were hit by a penalty at some point in the past, there are online tools that can help you.
One useful free tool is the Panguin Tool. Once you connect it to your Google Analytics account, it will overlay your traffic graph on a timeline of past algorithm updates:
If you see that your traffic rapidly declined a few days before or after a major Panda update, you were likely penalized by it.
Remember that these algorithms are often run over long periods of time (weeks), so your traffic decline may not start on the exact day that the algorithm was reported.
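If you prefer to check this yourself, here’s a minimal sketch of the same idea, assuming you’ve exported daily organic sessions to a CSV with `date` and `sessions` columns. The file name, the column names, and the update dates in the list are illustrative—maintain your own list from an SEO news source.

```python
import csv
from datetime import date

# Illustrative list of update dates pulled from SEO news coverage.
KNOWN_UPDATES = {
    date(2014, 5, 20): "Panda 4.0",
    date(2014, 9, 25): "Panda 4.1",
}

def flag_drops(csv_path, drop_ratio=0.6, window_days=7):
    """Print days where traffic fell below drop_ratio of the prior day's,
    along with any known update within window_days of the drop."""
    with open(csv_path, newline="") as f:
        rows = [(date.fromisoformat(r["date"]), int(r["sessions"]))
                for r in csv.DictReader(f)]
    rows.sort()
    for (prev_day, prev_sessions), (day, sessions) in zip(rows, rows[1:]):
        if prev_sessions and sessions / prev_sessions < drop_ratio:
            nearby = [name for d, name in KNOWN_UPDATES.items()
                      if abs((day - d).days) <= window_days]
            print(f"{day}: {prev_sessions} -> {sessions}",
                  f"(near: {', '.join(nearby)})" if nearby else "(no known update)")

flag_drops("organic_sessions.csv")  # assumed export: date,sessions
```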
Penguin – The bird that can’t fly but can detect your bad backlinks
Only in SEO would a panda and a penguin be so closely related.
Both have had a huge impact on the way SEOs approach their work.
While Panda focused mainly on on-page factors, Penguin was a huge step forward for identifying unnatural link profiles.
The first Penguin was released in 2012 and affected over 3% of all queries. Like Panda, it decimated the traffic of any site it penalized:
What Penguin looks for: Penguin was groundbreaking when it was first run and has become more sophisticated over time.
It looks for a variety of obvious unnatural backlink patterns.
Google will never release the full details of the algorithm (at least not any time soon), but we do know that there are three main backlink factors that can be used to identify unnatural link patterns:
- Link quality - A site that has obtained all of its links naturally will have links of both low and high quality. Sites made by blackhat SEOs often have nothing but low-quality links or only high-authority links (like those from a private blog network).
- Link velocity - Look at the backlink growth of any large site, and you will see that it gains links at an increasing rate over time. Unnatural sites often get a lot of links in a short period, followed by a sudden decrease.
- Link diversity - Legitimate sites get links from all sources (contextual, blog comments, forums, etc.). However, bad SEOs often create a large portion of a site’s links from one source (like blog comments). In addition, links should have varied anchor text. Too many links with the same anchor text could trigger a Penguin penalty (a rough way to check this is sketched below).
Complicated, right?
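To make the link diversity factor a bit more tangible, here’s a rough sketch that measures how concentrated your anchor text is, assuming you have a backlink export (from any backlink tool) saved as a CSV with an `anchor` column. The 30% cutoff is a made-up rule of thumb—Google has never published a number.

```python
import csv
from collections import Counter

def anchor_concentration(csv_path, top_n=5):
    """Return the share of all backlinks held by the most common anchor texts."""
    with open(csv_path, newline="") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    total = len(anchors) or 1
    return [(anchor, count / total)
            for anchor, count in Counter(anchors).most_common(top_n)]

# Assumed export format with an "anchor" column; 30% is a made-up rule of thumb.
for anchor, share in anchor_concentration("backlinks.csv"):
    flag = "  <- suspiciously concentrated?" if share > 0.30 else ""
    print(f"{anchor or '(empty anchor)'}: {share:.1%}{flag}")
```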
Penguin is one of the main reasons why most SEOs are “whitehat,” or at least “greyhat,” SEOs these days. If you want to manipulate Google, you’ll have to plan your link-building strategy very carefully to make sure that most of your links appear natural.
How Penguin penalizes sites: Penguin is not a site-wide penalty—it affects specific pages.
However, since the pages it affects are typically the ones with the most backlinks pointing to them, you can still lose 80%+ of your traffic if those pages drive most of it.
If your site is flagged by Penguin, you’ll typically be penalized. In some rare cases, Penguin will discount the value of the unnatural links instead of penalizing you.
A tool such as Panguin (shown in the previous section) can confirm that your traffic drop was caused by a Penguin algorithm update.
If your traffic drop was relatively small, you were probably one of the lucky few who didn’t get penalized. The drop was most likely caused by those now-discounted links.
When you’re checking to see if you were hit by Penguin, you should know that its rollout is even bigger than Panda’s. It can take more than a few weeks to fully run.
Recovering from a Penguin penalty is possible but difficult. Not only will you have to try to fix the issue (which could be a number of different things), but you’ll also need to wait for the next algorithm refresh to see if it worked or not.
Mobilegeddon – Can Google force website owners into the future?
Google’s primary goal is to help users find the best content that satisfies their queries.
For the first decade of Internet search, most of the work done by Google was dedicated to finding and classifying content better.
But Google is pretty good at that now.
At this point, the biggest factor holding back the search experience is the websites themselves. In other words, website owners aren’t improving their sites and content fast enough to keep up.
In early 2015, Google announced that it would start trying to help mobile users find useful results on mobile-friendly websites.
This announcement caused quite a stir in the SEO community. A mobile-friendly update was coming soon, and it sounded like something big.
Site owners scrambled to make their websites mobile-friendly—something that Google would be happy to see (better experience for mobile searchers).
The update finally came a few months later, on April 21st.
Although it was called “Mobilegeddon” and “Mobilepocalypse,” it turned out to be much less significant than originally predicted.
There was definitely some movement in the search rankings, but only the worst mobile-offenders suffered traffic losses.
What does Google consider mobile-friendly? Mobile-friendly can mean many different things. This is probably why Google started by just demoting the worst offenders.
Right now, there’s no sliding scale. Your web pages are either mobile-friendly or they’re not.
You can see what Google thinks of your content by using the Mobile-Friendly Test tool. Enter a URL, click Analyze, and it will give you a green passing message or a red fail message.
It’s a good idea to check a few different pages such as your home page, a blog post, and any other pages with custom layouts or designs.
Another place to check if you have any major mobile issues is in Google Webmaster Tools (Search Console).
Navigate to “Search traffic > Mobile usability”, and you’ll see any errors that you should fix as soon as possible:
Finally, Google has also released a useful mobile SEO guide. It explains the most common mobile errors, such as blocking JavaScript or misconfiguring your mobile redirects.
On top of those mistakes, here are a few more general mobile-friendly principles to keep in mind (a rough self-check script follows this list):
- Don’t use software that most mobile devices can’t render, e.g., Flash.
- Size content to fit the screen (i.e., use responsive design)
- Use text that is easily readable on a small screen (typically 16px or more)
- Don’t place links right beside each other (it’s hard to tap the right one)
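If you want a quick sanity check before running pages through Google’s tool, here’s a rough sketch that flags two of the easiest problems to spot in raw HTML: a missing viewport meta tag and embedded Flash. It’s only a heuristic—it doesn’t render the page or check tap targets—and the `requests` library and example URLs are assumptions.

```python
import re
import requests  # third-party: pip install requests

def quick_mobile_check(url):
    """Flag a couple of obvious mobile-unfriendly signals in the raw HTML.
    A rough heuristic, not a substitute for Google's Mobile-Friendly Test."""
    html = requests.get(url, timeout=10).text.lower()
    issues = []
    if not re.search(r'<meta[^>]+name=["\']viewport["\']', html):
        issues.append("no viewport meta tag (content may not size to the screen)")
    if ".swf" in html or "shockwave-flash" in html:
        issues.append("Flash content detected (most mobile devices can't render it)")
    return issues

# Hypothetical URLs: check the home page plus a typical blog post.
for page in ["https://example.com/", "https://example.com/blog/some-post/"]:
    problems = quick_mobile_check(page)
    print(page, "->", problems or "no obvious issues")
```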
Mobilegeddon in the future: Just because the first mobile-friendly update wasn’t huge doesn’t mean you shouldn’t concern yourself with making your website as mobile-friendly as possible.
Google will likely make changes to the algorithm in the future as it further develops its requirements for what is and isn’t mobile-friendly.
Keep in mind that even if you get hit by a mobile “penalty,” your traffic likely won’t be decimated. This update primarily boosts the rankings of the most mobile-friendly sites, which simply pushes your unfriendly pages down in the results.
Top Heavy – Balance is the key to any impression
When a searcher clicks on a result in Google, they are looking for an answer to their query.
If they can’t find it, they get frustrated.
So, it makes sense that Google would want to minimize these frustrations by not sending searchers to sites that make it difficult to find what they’re looking for.
The “Top Heavy” algorithm was first run in January 2012.
As the name implies, it specifically targets top heavy sites.
The best explanation comes from Google itself:
“We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.
So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.
Such sites may not rank as highly going forward.”
How the Top Heavy penalty works: This is a site-based penalty. That means that either all of your content is penalized or none of it is.
Google clarified this after an article on Search Engine Land pointed out that Google’s results themselves could be seen as “top heavy.”
Google responded by saying that only sites where most pages are “top heavy” will be penalized.
If it’s only a few pages, don’t worry about this algorithm.
The final thing you need to know about this algorithmic penalty is that it is run very infrequently.
It was first run in January of 2012, then October of 2012, and most recently in February of 2014. If you get hit with this penalty, you’ll have to be patient to get it removed.
Avoiding a Top Heavy penalty: Although it may seem unfair that the algorithm is only run about once a year, the good news is that it’s fairly difficult to get hit by this penalty.
Here’s an example of a top heavy layout:
Unless you have multiple ads, all above the fold, you’re probably safe.
And really, these types of sites should be penalized. They’re extremely frustrating to the average searcher.
If your content is pushed below the fold, chances are your site visitors won’t bother trying to find it.
To avoid this penalty, just create a good user experience.
Payday – If you prey on hopeful readers, your Payday may be over
Anyone who has been in the Internet marketing industry for some time knows that shady industries can be very lucrative.
Most of the best blackhat SEOs compete against each other to rank for keywords in the gambling, loan, and supplement niches.
This algorithm—“Payday”—was appropriately named after some of the most lucrative, and therefore most competitive, search results: those for payday loans.
Combatting spammy results with the Payday algorithm: We’ve seen in the past few years how good Google is at catching blackhat SEOs.
It has repeatedly crushed large swaths of their sites—mainly those belonging to beginner and intermediate SEOs.
However, the best blackhat SEOs won’t go down easy.
There is a small group of SEOs who have the ability and the will to manipulate Google. They are good enough to rank well in these high-paying niches and make enough money before getting penalized to justify the risk.
The Payday algorithm was first run on June 11, 2013, and rolled out over a few months.
It specifically targeted queries containing keywords such as:
- Payday loans
- Casinos
- Viagra
- Garcinia cambogia
- and more.
The second version of the algorithm was released on May 17th and 18th of 2014, and the 3.0 version was released soon after in June.
If you operate a site in any “spammy” niche, you need to be extra clean if you want to avoid being penalized. Otherwise, if you’re getting results with blackhat SEO, expect to be penalized eventually. If that happens, you’ll just have to move on to a new site.
If you have a legitimate site that was hit by this penalty (line up traffic drops with any of the algorithm dates), you can try to fix it. However, you’ll have to wait for the algorithm to be updated again for any positive changes to take effect.
Pirate – Outlaws be warned! The Google police are coming for you
Google almost always tries to show searchers the results they want.
However, Google has taken a strong stance on piracy.
Piracy, which is essentially stealing copyrighted content, is considered unethical by many and is illegal in some countries (although hard to enforce).
The “Pirate” algorithm was Google’s answer to the growing number of torrent sites (mainly used for pirating media and software) showing up in search results.
Based on the following graph of the traffic for some of the top torrent sites, I’d say it worked pretty well.
It didn’t knock them out of the search results altogether, but it reduced a large chunk of their traffic:
They still attract organic traffic because not all of their content is illegal material. In addition, this algorithm had no effect on branded searches.
Other sites that were purely made for pirating did lose most of their traffic. For example, free-tv-video-online.me lost 96% of its search visibility:
How the Pirate algorithm works: The main purpose of this algorithm wasn’t to eradicate torrent sites from the search results altogether—just to keep them out of the results for certain queries.
For example, if someone searched “Game of Thrones season 5 episode 6,” the searcher should not get torrent results. Before this update, torrent links to the episode would show up. But now, only reviews and legitimate ways to watch the show (HBO) are in the results:
The algorithm works based on copyright reports.
If a site has a lot of copyright violations, this algorithm will penalize it by lowering its rankings.
While new torrent sites can be made, they will be removed each time the algorithm is run if they have accumulated enough violations.
To get an idea of the scale on which copyright violations occur, consider this: Google receives requests to remove over 10 million URLs from search each week:
Not all of those are legitimate claims (Google always verifies first), but it’s still quite a bit.
If you want to avoid the Pirate penalty, it’s simple: don’t steal content (or I suppose don’t steal too much of it).
Unnatural links (manual) – Diversity is healthy
Manual penalties are a whole different beast when it comes to Google penalties.
They can be just as damaging to your traffic levels as algorithmic penalties are, but at least you’ll be able to see if you were hit by one.
As the name implies, manual penalties are given by Google employees and contractors who review your site against Google’s quality guidelines and deem that you’re violating one or more of them (the most common violations are covered below):
One of the most influential ranking factors has been and still is backlinks. The more backlinks a page has, the better it ranks (in general).
Of course, SEOs started manipulating this as soon as they found out.
Manually reviewing backlink profiles for unnatural links is one of the ways Google combats this.
If the reviewer sees that a large portion of your links are paid links or part of a link scheme, you will be hit with this penalty.
Different forms of unnatural link penalties: Many different penalties include the phrase “unnatural links.” Some have more of an effect on your site than others.
If you log in to Webmaster Tools (Search Console), you can see whether you have any manual actions applied to your site:
The three most common actions are:
- “Unnatural links to your site—impacts links.” If you have unnatural links, but it doesn’t look like you had any part in creating them, you’ll get this manual action, which isn’t actually a penalty. The links will no longer factor into your rankings (so traffic might drop a bit), but there’s nothing you need to do to “recover.”
- “Unnatural links to your site.” If you just see this message, then you’ve been penalized. It means that the reviewer has concluded that you’re responsible for the shady links. Depending on the specific message, either specific pages will be penalized or your entire site could be.
- “Unnatural links from your site.” If you’re always linking to specific sites with exact anchor text (for a high volume keyword) or you have way too many links pointing out from your site, you could get hit with this. This penalty can affect either a portion or all of your site.
Fixing a manual penalty: While no penalty is good, manual penalties are better than algorithmic. Once you fix the issue, you can apply for reconsideration. If you truly fixed the problem, the manual action will be lifted.
Once again, you may need to refer to my step-by-step guide to fixing any Google penalty.
Spam (manual) – If you’re going to play around, at least do it carefully
While most SEOs believe that spam refers solely to blasting thousands of links to a site, it’s much more than that.
The term spam, at least when it comes to manual penalties, also includes things such as:
- excessive or malicious cloaking
- scraping content
- automatically generated content
- and more.
Just like in the case of unnatural links manual actions, there are many different spam-related messages that can show up as a result of a manual action. These are the most common:
- “Pure spam.” The majority of the site is clearly spam, or its backlinks are almost entirely spam. It’s next to impossible to recover from this manual action.
- “User-generated spam.” If you have a site that allows users to submit content, you could be penalized for it if they abuse it to create spam content or links. Most commonly, this penalty refers to spam in comments or forum posts/profiles. It can be fixed.
- “Spammy freehosts.” If you’re unlucky enough to have your site hosted by the same web host that provides service to a ton of spammers, your site might be lumped together with them. This is a good reason to stay away from very cheap or free hosting services.
Since these are manual penalties, they can be fixed. Recovery usually involves either cleaning up on-site spam or disavowing spammy links.
Thin content with no added value (manual) – No one likes hearing the same story over and over again
If Google doesn’t get you with Panda, it may get you with a manual review for having thin content.
Thin or duplicate content typically consists of information that can be found elsewhere, either on or off your site.
If a manual reviewer spots that most of your content is derived from other content, you can get hit with this penalty, and your traffic will take a tumble.
Here are the most common scenarios that represent “little or no added value” (a rough self-audit sketch follows this list):
- Automatically generated content
- Thin affiliate pages
- Content from other sources, e.g., scraped content or low-quality guest blog posts
- Doorway pages
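If you want a rough sense of whether your own pages look thin or heavily duplicated before a reviewer does, here’s a minimal sketch. It assumes you’ve already extracted each page’s main text into a dict (how you crawl and extract is up to you), and the 300-word and 0.8-similarity cutoffs are arbitrary illustrations, not Google’s thresholds.

```python
from itertools import combinations

def audit_thin_and_duplicate(page_texts, min_words=300, similarity_cutoff=0.8):
    """page_texts: dict mapping URL -> extracted body text.
    Flags pages that are very short and pairs of pages sharing most of their words."""
    word_sets = {url: set(text.lower().split()) for url, text in page_texts.items()}

    thin = [url for url, text in page_texts.items() if len(text.split()) < min_words]

    duplicates = []
    for (url_a, words_a), (url_b, words_b) in combinations(word_sets.items(), 2):
        union = words_a | words_b
        if union:
            jaccard = len(words_a & words_b) / len(union)
            if jaccard >= similarity_cutoff:
                duplicates.append((url_a, url_b, round(jaccard, 2)))
    return thin, duplicates

# Hypothetical usage with pre-extracted page text:
pages = {"/post-a": "long unique article text ...", "/post-b": "long unique article text ..."}
thin_pages, duplicate_pairs = audit_thin_and_duplicate(pages)
print("Thin pages:", thin_pages)
print("Near-duplicates:", duplicate_pairs)
```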
When you go to the Manual Actions section in Webmaster Tools (Search Console), you can see whether you’ve been hit by this penalty:
Pay close attention to whether it says that it’s a site-wide match or a partial match.
If it’s a site-wide match, that means the penalty applies to all your content until you fix it. If you just have a few pages of thin content, it’s possible that the penalty will only affect those. While you should still fix it, it won’t have a huge effect on your traffic.
Conclusion
Penalties are part of every SEO’s education.
Most are deserved, but some happen accidentally. Understanding the root causes of penalties is the first step to preventing them from occurring and fixing them if you do get hit.
Once you have a good grasp on all the penalties, monitor Moz’s Google algorithm change log for any new ones so you can stay on top of them.
If you’ve discovered that you’ve been doing something that might get your website (or your client’s) penalized, stop it and correct it. Hopefully, you’ll catch it in time to avoid a penalty.
If you have any questions about these penalties, just let me know in a comment below.