
What is Negative SEO, and How to Defend Yourself

Last Edited May 31, 2018 by Garenne Bigby in Search Engine Optimization

In today’s business environment, the Internet is one of the most important advertising channels, if not the most important. Getting a business’s website, and its products and services, in front of searching eyes is crucial for profitability. To achieve this, web developers perform Search Engine Optimization—SEO for short—to raise a website’s rank on the results pages of search engines.

For the most part, business owners seek to have their own websites increase in rank; this is considered “positive” SEO. There is, however, a dark side to SEO: using the inverse of those tactics and practices to lower the rankings of other websites. This is what is considered “negative” SEO, and it is one of the most insidious problems facing the Internet and those who do business on it.

Hopefully, anyone who reads this article—or anyone on the Internet—never falls victim to a negative SEO campaign. If, however, one is on the receiving end of such an attack, or even suspects as much, there are options available for defense. Over the course of this article, we will explain the mindset and tactics of negative SEO, illustrate the forms it takes, and provide defensive options.

What is Negative SEO?

As mentioned earlier, negative SEO is a collection of actions meant to lower a website’s rankings in the search engine results pages, hereafter abbreviated as SERPs. Generally, negative SEO is considered to fall in the domain of “black hat” hacking, as its intent is most often malicious. This is not always the case, however; web developers can sometimes adversely affect a site without meaning to do so. For the purposes of this article, we will be considering only those activities performed intentionally.

The motivations of those who carry out negative SEO attacks can vary. If an attack is made against a business, the likely suspects are its competitors, who would benefit from the business taking a tumble in the SERPs. Monetary gain is not the only motivator, however; unethical web developers can attack to make a political statement or to further a personal cause. They may hold a grudge against the target business, or, perhaps most alarming of all, act simply out of sadistic pleasure.

The notion of negative SEO arose out of a change in search ranking policy launched by Google. Prior to April 2012, there was no penalty for SEO specialists and businesses who purchased numerous links to boost a site’s ranking. Any link that Google considered “spammy” was ignored, and the linked site’s ranking was unaffected. When April came, Google’s Penguin update changed the game completely. After this update, any site with spammy links was severely penalized, thus upending the entire ranking paradigm.

While this has led to fairer and higher-quality search results, it created an opportunity for those with malicious intent. Negative SEO began when black-hat web developers started building spammy links to their rivals’ sites, and from there, the toolbox of negative SEO began to expand.


Schools of Negative SEO Tactics

Today, there are two schools of thought in how negative SEO is performed. While the sheer number of specific tactics used by black-hat SEO experts is too high to list here, they all ultimately stem from one of these two. Negative SEO attacks and campaigns generally break down as either off-page or on-page, though a negative SEO campaign can make use of both.

Off-Page

Negative off-page SEO is activity that targets a site without directly interfering with it. This type of negative SEO is most often performed by manipulating backlinks or copying a site’s content. Other options include claims of copyright infringement or plagiarism, excessive crawling, leaving false reviews or comments, and fraudulent site activity. In a nutshell, off-page tactics focus on a site’s reputation, not its material.

On-Page

Negative on-page SEO is the exact opposite of off-page: it involves hacking into a site and tampering with it directly. Once a black-hat user hacks into a site, they can alter the page content, reprogram its robots.txt file, manipulate its redirects, and even get the site de-indexed. On-page attacks are more difficult to execute but can be devastating to a site.


Types of Negative Off-Page SEO

Link Farms

Originally, this was a technique for bolstering a site’s search rank, before the Penguin update. At its core, a link farm is a series of interconnected websites; each site links to every other site in the farm. This had the effect of improving link popularity, which led to many site owners purchasing links from farms. When the Penguin algorithm came online, Google was able to spot farm-owned backlinks and dealt out penalties for using them. It was at this point that black-hat SEO specialists saw the opportunity to repurpose link farms. In all likelihood, the first negative SEO attacks centered around link farm use.

Copyright Complaints

Just because a black-hat operator is not the most technically adept of his cadre does not mean that he isn’t dangerous. One technique that does not require a great deal of technical skill is to make a false claim of copyright infringement—namely, that the target does not own the content they posted. This kind of attack can be supplemented by copying a site’s content and posting it to another site; more savvy attackers often do so as soon as content is made available. Even though it is simple in principle, it can be insidious if the copied content gets indexed before the original content.

Worse still, such a claim can cause Google to suspend a site for 10 days or more while the claim is investigated—often all the attacker needs. Simply being unavailable in searches will cause a site’s ranking to drop, and if multiple attacks occur in sequence, a business’ ranking can implode, causing catastrophic loss of revenue. This kind of attack is not limited to the SEO domain; the same methodology is the cornerstone of “patent trolling,” where lesser-known businesses cry foul against successful ones by claiming that the successful product is a stolen version of theirs.

Recently, this has been illustrated by one video-game company’s baseless claim that a more successful console should be banned simply because it works in a similar manner to their product. Before posting content, users should always make sure it is original, to deter this kind of attack.

Scrapers

This type of attack is related to false copyright claims, in that it involves duplicating existing content. For scraping, however, the aim is to disseminate the copied content to multiple sites – as many as the scraper can manage. This tactic exploits Google’s Panda update, which was designed to combat duplicated content. When Google detects the same content on multiple websites, it only ranks one version. If the copied—scraped—content is detected by Google before the original, the scraped content is the one that gets ranked, and the original is left out in the cold.

Fake Reviews

Another low-tech tactic in the black-hat SEO operator’s playbook, this type of attack is self-explanatory: it simply involves posting false negative reviews about a business’ products or services. If a website has a section for comments, this type of attacker can be found there too, relentlessly posting baseless negativity. A fake-review attack is dangerous because Google may not immediately catch it; unless a site owner reports potential abuse, Google’s hands are tied. Since reviews are so easy to generate, they are just as easy to manipulate. As such, this type of attack is often the first step in a competitor’s campaign. One bad review will not have an effect, but a wave of negative comments can affect both search rankings and a business’ overall reputation.

Heavy Crawling

For more tech-savvy—and desperate—competitors, one option is to try to crash a site with excessive crawling. By forcefully and repeatedly crawling a site, a black-hat web developer can increase the load on the server, which affects the site’s performance. Taken to excess, this attack can slow a site down or crash it altogether. In the case of a crash, search engines lose access to the site until the crash is resolved. One crash may not cause significant harm, but if a site crashes multiple times, it could be deranked. While this kind of attack is certainly devastating, it is more labor-intensive than other off-page tactics, and it depends more on the skill of the attacker. Although it may not be a black-hat’s first trick, users should still be aware of it; one simple way to spot it is to watch the server logs for abnormal request volumes, as in the sketch below.
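
As a minimal sketch of that kind of log check, assuming a standard combined-format access log at a placeholder path and an arbitrary threshold of 300 requests per minute, one could count requests per IP like this:

    import re
    from collections import Counter

    LOG_PATH = "access.log"  # placeholder path to the server's access log
    THRESHOLD = 300          # assumed requests-per-minute abuse threshold

    # Matches e.g.: 203.0.113.7 - - [31/May/2018:10:15:32 +0000] "GET / ..."
    LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[([^:]+:\d+:\d+):\d+ ')

    hits = Counter()
    with open(LOG_PATH) as log:
        for line in log:
            match = LINE_RE.match(line)
            if match:
                ip, minute = match.groups()  # bucket requests per IP per minute
                hits[(ip, minute)] += 1

    for (ip, minute), count in sorted(hits.items()):
        if count > THRESHOLD:
            print(f"{ip}: {count} requests during {minute} (possible abusive crawl)")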

Click Fraud

This type of attack is an issue of contention in the SEO world. Some are unconvinced that clicks affect rankings, while others point to experiments suggesting that they do. In any case, a click fraud attack can have dire effects on a business, especially if it uses paid search advertising. This type of attack simply involves searching specific keywords—the target’s known or suspected keywords—and clicking through to the website. Upon arrival, the attacker immediately returns to the search results, as though they were a legitimate user who disliked the site, and then repeats the process.

Taken to an extreme, this activity can cause a site to drop in the SERPs. If the site uses paid search, the effect is even more dire: paid search bills the advertiser for every click, on the assumption that clicks come from genuinely interested visitors. In this way, click fraud attacks can inflate a business’ advertising costs and eventually destroy profit margins.


Types of Negative On-Page SEO

Altering Content

For black-hats that have more programming skill, one of the first tricks in the hacking playbook is messing with a website’s content. This type of attack can cover a wide range of activity. The first things that might spring to mind would be rewriting page paragraphs, including foul language or offensive images, or posting humiliating videos. These kinds of alterations are not on a savvy hacker’s mind, however; large, grandiose changes are easily detected.

To achieve maximum damage, the changes need to be both effective and subtle. There are ways of hiding spammy links and content on a website so that they are only visible in the code. Threats like this are best detected with website auditing tools, to spot any unplanned changes to a site.
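
As a rough illustration of such an audit, the short script below uses only Python’s standard library to flag anchor tags hidden with inline styles, one common way injected spam links are kept out of sight. The filename is hypothetical, and a real audit tool would also inspect stylesheets and off-screen positioning.

    from html.parser import HTMLParser

    class HiddenLinkFinder(HTMLParser):
        # Flag links hidden via inline CSS or the "hidden" attribute
        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            style = (attrs.get("style") or "").replace(" ", "").lower()
            if "display:none" in style or "visibility:hidden" in style or "hidden" in attrs:
                print("Suspicious hidden link:", attrs.get("href"))

    with open("index.html") as f:  # hypothetical page export to audit
        HiddenLinkFinder().feed(f.read())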

Getting Sites De-indexed

Another target for hackers is a site’s robots.txt. This is the file that, among other things, tells search engines like Google how to index websites. Manipulating this file is an incredibly subtle way for hackers to affect a website. Since robots.txt sits quietly in the background, site owners often forget about it entirely. In cases like these, all that a black-hat operator need do is insert a disallow rule into the file. That minor change is all it takes for robots.txt to instruct a search engine to ignore entire pages, or even the site in its entirety. If the owner does not notice the change, this kind of attack can be incredibly destructive to a business, in terms of both search rankings and customer awareness.

When facing a potential attack of this type, vigilance is a user’s best friend. Making regular checks—manual or automatic—of a site’s rank can show if a site has been dropped from a search engine. Seeing results like this for one or two keywords is worrisome, but does not necessarily mean an attack is underway. If, however, it occurs for a large number of keywords, one should check the site’s robots.txt for any alterations, as in the sketch below.
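
As a minimal sketch of that check, assuming one keeps a trusted copy of robots.txt on disk, the following compares the live file against the baseline and flags new disallow rules. The URL and filename are placeholders.

    from urllib.request import urlopen

    SITE = "https://www.example.com/robots.txt"  # placeholder site
    BASELINE = "robots_baseline.txt"             # trusted copy saved earlier

    live = urlopen(SITE).read().decode("utf-8").splitlines()
    with open(BASELINE) as f:
        baseline = set(line.strip() for line in f)

    # Any disallow rule that was not in the trusted copy deserves a look
    for line in live:
        line = line.strip()
        if line.lower().startswith("disallow:") and line not in baseline:
            print("New disallow rule not in baseline:", line)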

Modifying Redirects

The previous types of attacks mentioned are risks for any site, but this type of attack is more geared towards established sites. If a site boasts a good reputation, with popular links and high authority, black-hats can attempt to hijack a site’s redirects to lead to their site. With this tactic, unscrupulous users can kill two birds with one stone: harming their rival’s site while boosting their own. By hacking a site and modifying its redirects, malicious users can increase their own traffic.

Furthermore, if Google spots the change before a site owner, they can impose a penalty for a malicious redirect. Since this kind of attack only benefits an attacker if the redirects are popular ones, one can see why only established sites are at risk here. In theory, any site could be targeted by this kind of attack, but if the redirects are of poor quality, the attacker ends up harming himself. In any case, regular site auditing can reveal new redirects on a site; if any of these are malicious in nature, the sooner a user knows about them, the better. As with most SEO attacks, awareness is the best defense.

Hacking a Site

Just because a black-hat operator does not have an SEO-specific endgame does not mean they cannot affect a site’s SEO. Search engines seek to protect their users from viruses, malware, spyware, and any other malicious code and take a dim view of sites that propagate these, intentionally or not. If a site gets hacked, and a search engine like Google realizes it, they can de-rank the site, or tack a “hacked site” warning onto the SERPs. Either of these will adversely affect a site’s ranking.
Worse still, if a hacker manages to install malware on a site, it can expose every user who visits. Because this is often the most sophisticated and damaging type of attack, defending against it requires extensive preparation. Defensive options include keeping software up to date, guarding against SQL injection and cross-site scripting (XSS), limiting what error messages reveal, and using HTTPS. The snippet below illustrates one of these habits.
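
As one small example, this sketch shows the parameterized-query style that blunts SQL injection, using Python’s built-in sqlite3 module; the table and the hostile input are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # throwaway database for the demo
    cur = conn.cursor()
    cur.execute("CREATE TABLE users (name TEXT)")

    user_input = "'; DROP TABLE users; --"  # hostile input

    # Unsafe: building SQL by string concatenation invites injection:
    #   cur.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

    # Safe: the ? placeholder makes the driver treat input strictly as data
    cur.execute("SELECT * FROM users WHERE name = ?", (user_input,))
    print(cur.fetchall())  # [] because the hostile string matched nothing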

Changing the Robot

Altering a site’s robots.txt file has wider implications than just de-indexing. The Robots Exclusion Protocol—the formal name for the robots.txt file—is what tells web crawlers and other web robots how to use a site. One malicious alteration of the protocol is the de-indexing example mentioned above, but there are other possibilities. Rather than disallowing an entire site, hackers could do the reverse and open an entire site to robots, including pages that are under construction. This can expose content that a user was not ready to reveal, allowing the hacker to copy it and publish it first.

If a hacker is very well-versed in modifying the Protocol, they can potentially override a user’s instructions for specific web robots. While reprogramming the Protocol may not directly affect a site’s SEO on its own, it can open the door for a number of other attacks. This is why maintaining awareness of every file a website uses is so crucial. If a user spots a change in the robots.txt file, it may be a prelude to an attack, or a sign that one is underway. A simple illustration of such a change follows.
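
For illustration only, here is what such a change might look like in the file itself; the path is hypothetical. The original rule shields an unfinished section, while the tampered version quietly opens it to every crawler:

    # Original: keep an unfinished section away from crawlers
    User-agent: *
    Disallow: /drafts/

    # After tampering: an empty Disallow permits everything,
    # so /drafts/ can be crawled, scraped, and republished early
    User-agent: *
    Disallow: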


Preventing Negative SEO

It has been wisely said that an ounce of prevention is worth a pound of cure. Since it is easier and ultimately more cost-effective to avoid an SEO attack, users should take as many precautions as possible. To that end, some of the best preventive actions are listed below.

Set Up Email Alerts with Google Webmaster Tools

Google wants to deter negative SEO attacks as much as any user, if not more so. To help users help themselves, Google provides a number of tools to monitor site activity. Google can alert users to the following:

  • Their site is under attack by malware
  • A site’s pages are not indexed
  • The server has connectivity problems
  • Google has imposed a manual penalty

In order to make use of these tools, however, users need to set them up manually. Doing this will help foster the situational awareness needed to spot an SEO attack in its infancy, and then stop it.

Track Your Backlinks Profile

This is by far the best use of a user’s attention when it comes to halting attacks. Since backlinks and redirects are the most common line of attack for spammers, keeping aware of a site’s backlink profile can stop many threats before they start. There are many tools available to help with this—too many to list in this article—but users should consider tools that can alert them when backlinks are gained or lost. Performing manual checks of one’s backlink profile is another excellent practice, as in the sketch below; doing so will help maintain the awareness needed to spot attacks.
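
A minimal sketch of that habit, assuming one saves a daily export of linking URLs (one per line; real exports from backlink tools vary), might look like this:

    # Compare yesterday's backlink export with today's to spot changes
    def load_links(path):
        with open(path) as f:
            return set(line.strip() for line in f if line.strip())

    old = load_links("backlinks_yesterday.txt")  # hypothetical exports
    new = load_links("backlinks_today.txt")

    print("Gained:", sorted(new - old))  # new links worth vetting
    print("Lost:  ", sorted(old - new))  # removals worth investigating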

Safeguard Your Best Backlinks

Oftentimes, a site will obtain one or two backlinks that are a cut above the rest—links that direct more traffic to the site than any other. These are prime targets for black-hats; their tactic of choice is to impersonate the site owner and contact the webmaster to request removal of the backlink. This kind of attack can be avoided with a few simple steps. First, users should confer with the webmaster of a prime backlink and agree on what a real removal request will look like.

Users should specify that a request will only come from one email address. Smart users might advise that the request will include a specific phrase—something that nobody else would know to say. It is also a good practice to organize all of one’s best backlinks together, to check them all at once. This way, it is easier to spot any changes that a user will need to ask a webmaster about.

Secure Your Site

With malware—and the hackers that insert it—being one of the gravest of online threats, securing a site from unauthorized access is a top priority. Depending on what type of host service a user has, there may be a number of add-ons or plugins available to bolster the security of an administrator’s login. Ideally, one should have something akin to the Google Authenticator Plugin, which requires a random code sent to a smartphone in addition to a username and password.

Another option is having strong passwords with numbers, symbols, and specific spelling that hackers are unlikely to guess. Having regular backups of one’s files and database can help minimize damage from an attack. Finally, if a site allows uploads, one should confer with the host about how to install antivirus software.

Check for Copycats

As we have discussed, content duplication is another ready-to-go tool in the black-hat’s toolkit. Given how rapidly scraped content can be disseminated, users would be hard-pressed to track down every unauthorized posting by themselves. Fortunately, there are services in place that can readily handle this kind of electronic legwork. One such service is offered by Copyscape; all a user need do is insert either a URL or a batch of text, and Copyscape checks it against the entire web.
To use this service, users must create a profile and purchase credits, but this is hardly prohibitive; Copyscape’s fees are very reasonable for the service provided. When facing an electronic copycat, time is of the essence: the sooner one detects the forgery and alerts Google, the better off they will be. For a rough do-it-yourself spot check against a single suspected copycat, see the sketch below.
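
Copyscape does this at web scale; as a crude spot check against one suspected copycat, a script can compare overlapping word sequences (“shingles”) between two pages. The URLs below are placeholders, and the tag stripping is deliberately naive.

    import re
    from urllib.request import urlopen

    def shingles(url, size=8):
        # Fetch a page, strip tags crudely, and collect 8-word sequences
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        text = re.sub(r"<[^>]+>", " ", html)
        words = re.findall(r"[a-z']+", text.lower())
        return set(tuple(words[i:i + size]) for i in range(len(words) - size + 1))

    mine = shingles("https://www.example.com/article")        # placeholder
    theirs = shingles("https://copycat.example.net/article")  # placeholder

    overlap = len(mine & theirs) / max(len(mine), 1)
    print(f"{overlap:.0%} of this article's shingles appear on the other page")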

Monitor Social Media

Sometimes black-hats go one step beyond duplicating content and instead create fake accounts on social media with another user’s brand name. This kind of attack can become a problem very quickly if the faker gains enough followers. In addition to monitoring site activity, users should monitor their brand through social media. There are a number of tools available to help with this; users will need to create an alert that searches social media regularly for their name. If an alert detects a fraudulent account, users should report it as soon as possible. Even one faker can give a user a bad reputation.

Watch Site Speed

It is impossible for any site to maintain the highest possible speed indefinitely. That said, if a site’s loading time suddenly increases beyond the norm, it may be due to an increase in server requests. This can signify that spammers are attempting to shut down a site by knocking out the server. To combat this, users must be able to monitor their server uptime and loading time, with whatever tools are available to show when a site is down; a bare-bones example follows. If an attack is detected, users will need to contact their host, alert them to the situation, and request assistance.
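
A simple monitor in that spirit, with a placeholder URL and an assumed two-second budget, could look like the following; real monitoring services add scheduling, history, and alerting on top of this idea.

    import time
    from urllib.request import urlopen
    from urllib.error import URLError

    URL = "https://www.example.com/"  # placeholder homepage
    BUDGET = 2.0                      # assumed acceptable load time, in seconds

    start = time.monotonic()
    try:
        urlopen(URL, timeout=10)
        elapsed = time.monotonic() - start
        if elapsed > BUDGET:
            print(f"Slow response: {elapsed:.2f}s (budget {BUDGET}s)")
        else:
            print(f"OK in {elapsed:.2f}s")
    except URLError as exc:
        print("Site appears down:", exc)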

Don’t be Your Own Worst Enemy

Although our focus here is on intentional actions by hostile operators, we will briefly outline how users can hurt their own site ranking. Users new to SEO can end up shooting themselves in the foot by using practices that Google frowns upon. These include using links to penalized sites, buying links, publishing excessive low-quality guest posts, and using too many “money keywords.” If a user is new to SEO, it can be easy to make these kinds of mistakes. The best way to avoid this kind of problem is to either hire an SEO expert or take a training course in SEO.

Don’t Make Enemies

This point should go without saying, but the easiest way to avoid negative SEO is to avoid making enemies online. Profit is only one of the three main reasons spammers attack a site; revenge and sadistic joy round out the other two spots. When conducting themselves online, users should be courteous and respectful. Arguments should be avoided at all costs; one never knows exactly who is on the other side of the screen. If one has a choice between avoiding a comment-war and upsetting a possible psychopath with SEO capabilities, one would think the choice is obvious.


Counteracting Negative SEO

Despite one’s best efforts, one can still be the recipient of negative SEO. When an attack is underway, the following actions can repair or mitigate the damage.

List the Backlinks to Remove

By checking a site’s backlink profile regularly, a user can see not only the growth patterns of the profile, but also which links have been acquired. This way, when the profile suddenly takes a dip, users can tell which links were acquired just before it happened. In cases like this, the first step is to make a list of all the potentially bad backlinks and go through them one by one. If the links look spammy, they need to be removed.

Try to Remove Bad Links

Once bad links have been identified, users will need to get rid of them. The first step in this process is to contact the webmaster of the other site and ask that the link be removed. This can be tricky if the webmaster has no listed contact information; users may need to use a service such as Whois to find them, as in the sketch below. If, after contacting the webmaster, users get no response, the next step is to contact the hosting service of the other site. Web hosts want to curtail spam activity as much as anyone, if not more, and they are often cooperative and can contact the webmaster on a user’s behalf.
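
As a sketch of that lookup, assuming the third-party python-whois package is installed (pip install python-whois), one can pull registrar and contact details for an offending domain; note that privacy-protected registrations will return little.

    import whois  # third-party package: python-whois

    record = whois.whois("example.com")  # placeholder offending domain
    print("Registrar:     ", record.registrar)
    print("Contact emails:", record.emails)  # may be None if privacy-protected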

Disavow Lists

If neither a user nor the host service has any luck, the absolute last resort is to disavow the offending backlinks. Google’s Disavow Tool is very effective for dealing with links that cause penalties. Once a user makes a list of bad backlinks that cannot be removed by other means, he can send it to Google via the Webmaster Tools. After review, Google will disavow the spammy links, removing the effect they had on a site’s ranking. Again, this is only a last resort, to be used only when the offending site’s webmaster is uncooperative. A small helper for producing the file in the expected format follows.
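
The sketch below writes such a list in the format Google’s tool expects: one URL or domain: rule per line, with # marking comments. The entries are hypothetical.

    # Turn a list of unremovable bad links into a disavow file
    bad_links = [
        "http://spam-site-one.example/page.html",  # a single bad page
        "domain:linkfarm.example",                 # disavow a whole domain
    ]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Links disavowed after failed removal requests\n")
        for entry in bad_links:
            f.write(entry + "\n")

    print("Wrote disavow.txt; upload it via Google's Disavow Links tool")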

We hope that the preceding points will help inform users about the nature of negative SEO campaigns, and what can be done about them. As mentioned before, this has only been an overview; users can and should perform their own research, in order to be fully aware of existing threats. Awareness is a user’s best defense against negative SEO or any type of online attack. It is our hope that this information be used as little as possible, and that users stay safe online.

Author: Garenne Bigby
Website: http://garennebigby.com
Founder of DYNO Mapper and former Advisory Committee Representative at the W3C.
