Negative SEO is becoming a highly debated topic in the SEO community. Despite its recent trending status, the concept of using spammy tactics to sabotage a competitor’s rankings is nothing new. Ever since the dawn of SEO, marketers have been coming up with new ways to use Google’s algorithm to hurt their competition. In some ways, it seems like the more Google does to improve its algorithm, the easier it is for SEOs to wreak havoc on competing sites. More rules means more opportunities for exploitation.
When Google started cracking down on low quality links, spammers started buying up hundreds of thousands of links and pointing them towards their competition. When Google decided to give more weight to user reviews, spammers saw this as another opportunity to cheat by creating tons of fake negative reviews for their competitors’ products and services.
For years, Google insisted that negative SEO was not possible, stating “There is nothing a competitor can do to harm your ranking or have your site removed from [their] index.” However, in 2003, Google changed their tune, saying “There is almost nothing a competitor can do to harm your ranking…” In March of 2012, Google updated their claim once more, with the following:
Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.
Over the years Google’s stance shifted from “it’s not possible” to “it’s possible and we’re doing the best we can to prevent it.” So what does this mean for SEOs and business owners? Should we be worried? Is there anything we can do to safeguard our sites? The purpose of this article is to delve into the different types of negative SEO (that we know about) and discuss how to protect your site from a potential attack.
What is negative SEO?
SERP real estate is limited, making SEO a zero sum game. This means that each time you win, somebody else loses. The idea behind negative SEO is that if you do enough harm to your competitors’ sites, you could effectively push your site to the top of search results. To make matters worse, negative SEO is relatively cheap and easy to implement. Especially when compared to doing things the right way — things like developing quality content and providing a winning user experience.
As search engines continue to evolve, so does negative SEO. Over the years, we’ve seen spammers use a variety of shady tactics. Here are a few of the most common types of negative SEO:
Link spam – This is by far the most talked about method of negative SEO. In April of 2012, Google released the Penguin update, which targeted sites using manipulative linking schemes to artificially boost search engine rankings. This update had widespread impacts across a large percentage of search queries. Many businesses disappeared from the SERPs overnight. It wasn’t long before spammers started pummeling competitor sites with low quality links to push them down in search results. Some spammers even sent emails to sites that linked to a competitor, requesting that they remove those links. Needless to say, this created a lot of fear and paranoia in the SEO community.
Duplicate content spam – Before Penguin, there was Panda. In February of 2011, Google released a major update that targeted content farms and sites with thin and/or duplicate content. Spammers started scraping content from competitor sites and republishing it elsewhere on the web, so that Google would penalize the original publisher and in some cases, demote their site in search results.
Reputation spam – One of the easiest ways to damage a brand’s reputation is by submitting fake negative reviews or creating fake, branded social profiles. Google and other major search engines integrate user reviews into their ranking algorithms. In addition to hurting a site’s SEO, negative reviews and spammy social profiles can also be damaging to a brand’s online reputation. Especially since the more authoritative sites like Facebook and Yelp appear higher in search results, putting them front and center for branded searches.
Technical spam – There is a multitude of different ways you can spam a website from a technical standpoint. Security breaches, malware, evil bots and DDoS attacks have been around for a while now. However, now that Google is starting to incorporate user experience signals into their ranking algorithm, it’s more important than ever to make sure your website is technically sound.
Has anyone ever proven that negative SEO is possible?
This is probably one of the most heavily debated topics in SEO. If you look at a lot of the industry polls, the consensus is pretty much split right down the middle. One camp argues that the threat of negative SEO is real, while others firmly disagree. The problem is that neither side has stepped up with any real, conclusive evidence to support their claims. If negative SEO is possible, then we should be able to prove it. Conversely, if negative SEO is a myth, then we should be able to prove that as well. Bottom line, the SEO community needs hard data to put this argument to rest.
The problem with confirming or debunking negative SEO is that the term itself is a moving target. There might have been a time when it was possible to hurt a site’s rankings by scraping their content and publishing it on several other domains. But I highly doubt that this would work in 2015. The same goes for link spam. On the heels of the initial Penguin update, it might have been possible to demote a competitor’s site by sending them a bunch of low quality links. But now that Google is hip to the jive, it’s going to take a lot more than link spam to make an impact on a site’s rankings.
Google is constantly updating their algorithm and blackhat SEOs are always updating their playbook to reap the benefits of Google’s latest vulnerabilities. The question isn’t so much whether negative SEO is possible, but more so which aspects of our website are most susceptible to spam.
I recently read an interesting case study from Bartosz Goralewicz, where he talked about a client that he suspected had been hit with some negative SEO. What was unique about this case is that it wasn’t the result of link spam. Instead, it turned out that the drop in rankings was due to a combination of non-DDoS server attacks and click-through-rate (CTR) spam. I won’t go into all the details in this post, but I would highly recommend reading it because it was by far the most interesting negative SEO case study I’ve ever read.
As Bartosz pointed out in his post, user experience (UX) has become a strong ranking signal for Google. This was made abundantly clear in Searchmetrics’ latest ranking correlation factors study. According to the report, CTR had the highest correlation with higher search engine rankings, making it a prime target for spammers. What’s more, many other UX signals are relatively easy to manipulate, and virtually undetectable, unless you know specifically what to look for.
How can brands protect themselves from negative SEO?
As SEO becomes more competitive, we’ll start to see more people using less than honest tactics to claw their way to the top of search results. The scary part is that negative SEO is perfectly legal. If you search for “negative SEO” on Fiverr, you’ll see a variety of gigs offering up to 500,000 links for $5. I wouldn’t be surprised if we start seeing hitman-style SEO consultants for hire. To quote Anthony Burgess, “we all need money, but there are degrees of desperation.” Desperation can make people do some pretty evil things, so it can’t hurt to be prepared. Regardless of how negative SEO changes in the coming years, there are some steps business owners and webmasters can take to safeguard their sites from blackhat tactics.
Backlink profile – Keep an eye on your backlink profile. I would recommend using a tool like Open Site Explorer or Ahrefs. Some tools, like MonitorBacklinks.com, send you email alerts when new backlinks are discovered. If you come across any suspicious links that appear to be spam, you should try reaching out to the webmaster to see if they can remove them for you. If that doesn’t work, you can always use Google’s disavow tool to tell Google to ignore the links in question. Also, keep in mind that not all suspicious links are considered spam (i.e. links from scraper sites, site-wide/footer links, links from m.biz directories, etc.). Marie Haynes wrote an awesome post outlining some of these non-negative SEO links and provided detailed examples for each scenario.
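If you do end up needing the disavow tool, the file you submit is just a plain-text list: one entry per line, either a full URL or a `domain:` prefix to disavow every link from a domain, with `#` lines as comments. A minimal example (the domains here are made up for illustration):

```text
# Spammy links discovered in backlink audit
# Disavow every link from this domain
domain:cheap-link-directory.example

# Disavow a single page
http://spam-blog.example/paid-links-page.html
```

You then upload the file through Google Webmaster Tools. Be conservative here: disavowing legitimate links can hurt your rankings, so only list links you’re confident are spam.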
Duplicate content – I would imagine Google is capable of determining the original source of content on the web. I write articles for a variety of different websites and many of them have been syndicated by other sites. I’ve personally never run into any issues here. However, it’s still a good idea to keep a close eye on your content to make sure that others aren’t stealing it. To do this, I would recommend using a service like Copyscape, which allows you to scan the web for plagiarized content. If someone has published your content without permission, then there are steps you can take to have the content removed from their site. The first step would be contacting the site owner and asking them to either remove the content or add a link to the original source. If this doesn’t work, you can take legal action and issue the publisher a cease and desist letter. Hopefully it doesn’t come to that, but it’s good to know your rights as a publisher.
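If you’d rather do a quick spot check yourself before paying for a service, the basic idea behind duplicate detection is simple: break each text into overlapping word sequences (“shingles”) and measure how many of your shingles show up in the suspect page. The sketch below is a minimal, illustrative version of that idea; the 5-word shingle size and 50% threshold are assumptions you’d tune, and in practice you’d first fetch the suspect page and strip its HTML.

```python
# Minimal duplicate-content check: compare overlapping 5-word
# "shingles" between your original article and a suspect page's text.
# Shingle size and threshold are illustrative assumptions.

def shingles(text, size=5):
    """Return the set of overlapping `size`-word sequences in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def overlap_ratio(original, suspect, size=5):
    """Fraction of the original's shingles that also appear in the suspect."""
    orig = shingles(original, size)
    if not orig:
        return 0.0
    return len(orig & shingles(suspect, size)) / len(orig)

def looks_scraped(original, suspect, threshold=0.5):
    """True if the suspect text reuses most of the original verbatim."""
    return overlap_ratio(original, suspect) >= threshold
```

This won’t catch heavily rewritten copies (verbatim overlap only), which is exactly why services like Copyscape, which also handle the discovery side, are worth the money at scale.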
Brand mentions and reviews – I like to use brand monitoring software to track brand mentions and sentiment on the web. It’s not uncommon for jealous competitors to create fake negative reviews. Not only can fake reviews damage a brand’s online reputation, they can also cause your site to appear lower in search results — especially for local pack listings. There are a variety of brand monitoring tools available. Social Mention is a popular free option. If you’re willing to spend some money, I’d recommend Brandwatch or Brand’s Eye. If you find any suspicious reviews, you can usually flag them as spam. If they’re not hidden or removed, you can reach out to the site and ask them to take the reviews down. Keep in mind that some sites, like Yelp, have a strict policy on removing reviews, but in most cases you can contact the reviewer directly or respond publicly. Reputation management services aren’t cheap, so it’s best to stay on top of brand mentions and reviews as much as possible.
Technical factors – These are the factors that no one seems to be talking about. Maintaining the technical aspects of your site is one of the most important things you can do to improve your on-page SEO. This includes things like CTR, page load time, time on site, pages per session and bounce rate. Google Analytics is my go-to for analyzing on-site user behavior. For off-site metrics, like organic impressions and CTR, I like to use Google Webmaster Tools. Although Google Analytics offers decent reporting for page speed analysis, I personally prefer Pingdom. Here you can analyze your page to evaluate site performance and, more importantly, pinpoint which aspects of your site could be improved to increase page speed. It’s also a good idea to make sure that your content management system is as secure as possible. If someone has access to your site, they basically have free rein to do whatever they please.
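On the security side, one cheap early-warning habit is scanning your server access logs for IPs hammering your site, since bot and DDoS-style traffic tends to show up as a handful of addresses making an outsized share of requests. Here’s a minimal sketch of that idea; it assumes a common/combined log format where each line starts with the client IP, and the threshold of 100 requests is an arbitrary placeholder you’d tune to your traffic.

```python
# Minimal sketch: flag IPs with an unusually high request count in a
# batch of access-log lines (rough signal of bot/DDoS-style traffic).
# Assumes each line begins with the client IP, as in Apache's
# common/combined log format; the threshold is an illustrative value.
from collections import Counter

def suspicious_ips(log_lines, threshold=100):
    """Return {ip: request_count} for IPs at or above the threshold."""
    counts = Counter(
        line.split(" ", 1)[0]  # first field is the client IP
        for line in log_lines
        if line.strip()
    )
    return {ip: n for ip, n in counts.items() if n >= threshold}
```

A real setup would run something like this on a rolling time window (or just use fail2ban or your CDN’s rate limiting), but even a crude batch scan can surface an attack before it shows up as a rankings problem.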
Negative SEO has become a lot more complicated and difficult to detect. As Bartosz mentioned in his post, spammers may start using new negative SEO methods in conjunction with “old school” methods like link spam to create decoys. Since the more well-known methods have a distinct pattern, site owners (and some SEOs) may assume that their loss in rankings is due to link spam, when in reality, the problem goes much deeper. This is one of those scenarios where a good defense is the best offense. This is also a big reason why I’m against all the secrecy regarding negative SEO. I think it’s better for everything to be out in the open so that site owners can take the necessary precautions to protect themselves from a potential attack. Instead of pretending that negative SEO doesn’t exist, we need to consider the possibility that it does and figure out how blackhats are using it, so we can disarm these vulnerabilities as quickly as possible.