WHEN LINK AUDITS GO WRONG!

Over the last year, since the launch of Penguin, it has become fashionable to conduct link audits on a domain with the goal of identifying and removing toxic and harmful links. Until Penguin, the world at large was blissfully unaware that bad links could hurt ranks. That worldview was largely supported by Google, who espoused the general propaganda that they simply ignored bad links, which nicely suited some rather gung-ho SEO 'experts' who couldn't care less about the scary mess they created in their wake. The theory went like this: if bad links were harmful, anyone could harm your ranks with negative SEO, so it can't be true!

Alas, when Google broke the spell they had cast upon the SEO world there was uproar, for company bosses started getting notifications telling them their SEO campaign manager was directly responsible for their domain being removed from the search results, and heads started to roll. Then the whole negative SEO question reared its head again, and Google eventually had to address the issue with the launch of the Disavow tool.

This put the whole Link Audit concept at the forefront of the search industry, and conducting Link Audits went from being largely considered a waste of time to being a valuable 'new' skill which commanded specialist knowledge and insight. However, there were still a few hurdles to overcome, as then came a reluctant little group of 'wedontlikechange' vocalists who spent their time bemoaning the whole link audit concept as an out-of-hand scam being promoted by a group of snake oil stalkers preying upon others' grief (for which there was some truth).

This is because there was indeed a bunch who were once largely responsible for acquiring and placing such links, and they knew exactly where some of the problems lay because they had actually created them. They then jumped on this bandwagon, purported to offer link audit services and claimed to have the skills to analyze backlink profiles and carefully find those toxic links. These thrill seekers worked in tandem with other daredevils selling the whole 'we can get your links removed for five pounds a month' routine.

After some further ado it became clear that this was a bunch of entrepreneurial directory owners who, having recently been banned from Google, were no longer able to sell links, but realized there was a quick buck to be had by charging those who were negatively affected to remove each link they had once been paid to put up. This lot had now banded together to 'help' companies with this problem!

The scariest of all these changes, though, has been the articles on the respectable blogs advising you on 'HOW YOU CAN CARRY OUT A FULL BACK LINK AUDIT!'. There's been a ton of these over the last year, and they seem to be copying and pasting from the first diatribe ever posted on the subject, which royally painted all the wrong colors in the numbered boxes. If it was a straw house you were looking to build, these surely are the architects for the job.

Now I've got to say this: there are some who are qualified to discuss this subject with more weight than others, not least myself. Why? Because I've been building link auditing tools for over 5 years. These tools had their origins in penalties I was seeing in the field, where companies would fall out of the SERPs 'for no apparent reason' and where the bosses were fed the impression that some wild update at Google was responsible. This was back when the whole black art of SEO was often kept hidden from the business owner's view, and SEO companies worked in a cloak-and-dagger environment, often not revealing their wonderful work.

When things went wrong in those days it was Google which got the bad press. The losers would hit the forums in droves and proclaim that Google's days were numbered, that the search results were now so bad (with their site missing) that no one would use the super mega scary monopoly anymore. However, since that time the world has continued to flock to Google, and the vast majority of SEO firms have seen their victims abandon them with as much flux as their former clients saw in the search results.

Most former SEO companies have even changed industries to flee the sinking ships in their wake. ONE now likes to refer to themselves as InBOUND Marketers, such is the shame associated with their former activities. With the name change there has been a game change: the WHITE HATs have been dug out of the closets (again), and there's been some leapfrogging onto the now well-known phrase GUEST BLOG POST and the infamous INFOGRAPHICs, while everyone catches their breath for the next few months until Google plays catch-up once again.

Now no one likes to hit a man when he's down, but where have all the experts gone? We might get the odd cheeky spammer putting the finger up to Cutts, and you might get the odd insightful whisper from Google, but for the most part this is an industry filled with snake oil sellers who have little understanding of the Google algorithm, nor of the technical challenges Google are grappling with, and as a result YOU are getting spoon-fed yet more bolognese, only this time it's out of a different packet.

There are great chasms between the reality on the ground and the belief systems within this industry. Many of these beliefs are reinforced by Google, who have for the most part managed to wage an extraordinary propaganda coup, pushing most of the self-proclaimed 'SEO experts' in this field into some form of embarrassing submission.

The essence of the story so far is this: SEO companies have been feeding Google with bolognese. Google swallowed it until they puked, and when they puked all the chiefs were fired. Then, in order to make sense of all the puke, Google told everyone in the puke to clean up the mess for themselves, which may or may not remove them from the puke. Many are reluctant to touch it, and most don't know the difference between puke and bolognese. That's where we are right now, with some new 'false prophets' coming along to tell you what is puke and what is not. But unfortunately this has to be said: it's mostly the blind leading the blind out there.

Take the latest article on one of the authority blogs, 'Search Engine Journal', posted June 21, 2013:

http://www.searchenginejournal.com/how-to-remove-unnatural-links-to-your-site-choosing-the-best-solution-after-penguin-2-0/

This annoyed me so much that I was compelled to write this article in response and clarify the nonsense which is commonly being spat out and accepted by these so-called authority blogs.

Now, while I applaud that general awareness around this subject is growing, I have to call into question much of what is being suggested as best practice in that article, and others like it, when looking for bad or toxic links.

Take the methods suggested in that article for identifying the bad links. They suggest removing any links which:

  1. Come from PR-n/a or PR0 sites
  2. Are site wide
  3. Come from very new domains
  4. Come from domains with little traffic
  5. Come from sites with identical C class
  6. Are from pages with a big number of external links

Now let's break this down a bit and cast a skeptical eye over some of the bolognese being written here!

  • Links that come from PR-n/a or PR0 sites. REALLY – ARE YOU SURE?

First off the bat, if you discount all those links which are on PR N/A or PR0 pages, you will be discriminating against around 90% of the pages on the Internet! I'm sorry, but just because a page is currently PR0 or PR N/A doesn't mean it's toxic or bad IN ANY WAY, and when has anyone from Google ever said it was? In fact this is not the first time a so-called authority has suggested such links are bad; it is a common misconception which I've seen espoused in pretty much every major article on this subject in the last year.

Moreover, this is probably the quickest way to build a list of FALSE POSITIVES, which will have negative value and end up causing even more anguish, confusion and harm to a domain's ranks if you used this as a signal in your LINK AUDIT and subsequent link removal campaign.

AND THE REASON? It could simply be a newly found page (within the last 4 months) which has not yet been updated as part of Google's PageRank updates (which take place roughly 4 times per year), or it may just be a page on a large site which doesn't have enough PageRank (Juice) to pass around all of its wonderful pages and is thus unable to push those pages into the so-called SAFE ZONE of PR1 or above…

This is one of those myths that seriously needs to be put to bed, and I welcome the day when someone asks one of the vocalists at Google for clarification or advice on this much-lauded white elephant in the room. If you've been sold on this as a method to locate your bad links, you will have wasted all subsequent effort in getting those links removed and no doubt done yourself a disservice in the process.

And what about all the other RED flags raised by the so-called experts?

  • Links that are site wide

This one is undoubtedly true, and the only question left is how to most effectively find such links from the data you have. These links are normally located in the sidebar (blog roll) or footer of shitty domains, often alongside a bunch of other similarly smelly links, all with clear money-term anchor texts.

  • Links which come from very new domains. REALLY – HERE WE GO AGAIN – MORE BOLOGNESE!

New domains typically have no PageRank or Juice value, and since new domains are sandboxed (Google's penalty on new domains which have not yet earned trust), Google sandboxes the outbound links on new domains in the same manner; hence anyone trying to manipulate the SERPs with such techniques fails. This is not some widespread problem that Google are still grappling with, so singling out the new domains you happen to have a link from is just going to add more white noise to any data you end up with.

  • Links which come from domains with little traffic. OMG WHO COMES UP WITH THIS RUBBISH?

This just shows a complete misunderstanding of, firstly, the nature of the Internet and, secondly, the problems that Google are grappling with. If Google were to use such a metric or signal for improving the SERPs, they would end up with so much false data that they would be totally lost and have no idea what was an authority or why. Most articles have a spike of traffic when first published, then that traffic dwindles as the article is buried away. When you build a module into an algorithm you want data which means something significant, not something that will result in pure white noise. There has NEVER been anything said or even suggested by Google that they use such a signal to lower ranks, and in all my years in SEARCH I've never seen any evidence that such a concept is being used to qualify the value of links, despite beliefs to the contrary.

You can put this one into the quack box along with another similar signal I've seen used to mark links as suspicious, one called Low Link Velocity: apparently, when a domain or page is no longer attracting backlinks as quickly as it once did, it is likely to be a domain which has been sold to a link spammer and is now being used as a link farm. This again could never be used as a signal by Google to qualify links, as many domains and pages attract a natural spike of backlinks when first published, and interest generally diminishes as time goes by. That's pretty normal and certainly not a signal which could be relied upon to qualify a link!

  • Links which come from sites with identical C class. FACT OR FICTION?

In general there is some truth to this one; however, it's not a simple case of black and white. It's certainly a useful signal for Google during a manual inspection, to see what's connected once they have smelt a rat, but in the SERPs it is most likely treated as a SOFT signal: they would most likely limit the Juice being passed from one IP to another. These links should most certainly be avoided, as they help Google find networks. That said, there's another problem I've seen here: I have seen historical data being used in some link audit reports whereby the hosting was changed years ago, so the IP data being supplied is historic and hence misleading. It's a good signal ONLY if the data is reliable and fresh!

  • Links from pages with a big number of external links. ANOTHER FALSE ASSUMPTION?

I'd tend to agree that if a page has lots of outbound links it's most likely of low value, though in some instances there are pages that contain value and still have lots of outbound links. Typically, where all the links are natural and relevant to each other and where the anchors are not money terms, you could safely assume the page was worthy. So crudely treating all pages as bad based on the number of outbound links alone would undoubtedly raise false positives, and there is no evidence to suggest Google is that crude.

So where does that leave us? Pretty much in a mess, based on some of these very unsound signals. Raising unsafe flags results in huge numbers of 'possibly' suspicious links which then have to be reviewed again manually, which totally defeats the whole purpose of having a link audit in the first place!

WHAT YOU REALLY NEED FROM A LINK AUDIT SERVICE!

Firstly and most importantly, you need confidence that any links flagged for removal have been flagged for valid reasons. Ironically, Google is our friend in this task! We look for actual evidence and signals directly from Google, which have their origins in past penalties I have analyzed and learned from. Specifically, we check for:

LINKS WITH OVER OPTIMIZED ANCHOR TEXTS

Prior to Google launching Penguin I used to refer to this penalty as the BackLink Over Optimization Penalty, or BLOOP, following a post on SERoundTable (Oct 13, 2011) which named it; before that I just called it the Over Optimization penalty. I first saw such a penalty in late 2009, targeted at specific anchor texts that were over optimized, and following an investigation and analysis of someone who was penalized I built our infamous DENSITY TOOL, which I launched in early 2010. Back then we considered any anchor text with a density greater than 20% to be over optimized. Since that time I have been steadily decreasing the tolerance threshold and now consider any anchor text density over 5% to be over optimized. This is pretty much in accordance with on-page optimization (best to keep that below 3%), where Google dealt a blow to sites who abused this as far back as 2005!
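
To make the check concrete, here is a minimal sketch of the density calculation, assuming you have a backlink export as (anchor text, source URL) pairs. The function name is illustrative, not our tool's; the 5% threshold is the figure quoted above.

    from collections import Counter

    OVER_OPTIMIZED_THRESHOLD = 0.05  # 5% of the total link profile

    def over_optimized_anchors(backlinks):
        """backlinks: list of (anchor_text, source_url) tuples."""
        total = len(backlinks)
        counts = Counter(anchor.strip().lower() for anchor, _ in backlinks)
        # density = share of the whole profile using this exact anchor
        return {anchor: count / total
                for anchor, count in counts.items()
                if count / total > OVER_OPTIMIZED_THRESHOLD}

Feed it your full export and anything it returns (a money term sitting at 15% of your profile, say) is a removal candidate.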

LINKS ON BANNED PAGES

These are your bad neighborhood links. To locate these we have to directly scrape Google and check each and every page where you have a link to see if it is indexed or not. Google are masters at indexing; normally they are greedy and will index every page they can find, even if it's poor quality, so when they decide not to index a page it is a clear statement and signal from them that the page was de-indexed for bad behavior. These are ALWAYS pages with suspicious outbound links, and in this case the links point directly to your target domain, which they have discovered and don't like! This tool has a long history: following Google's many past wars aimed at removing manipulative links, I first launched it in 2009, around the time these attacks by Google started to become evident. It has been a long-standing signal; we have seen bad neighborhood links have a negative effect on ranks going back as far as 2008, and possibly before that too.
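
For illustration only, here is the crude shape of such an indexation check using a site: query. Google rate-limits and blocks automated scraping, and the no-results marker string can change, so treat this as a sketch of the logic rather than the production tool:

    import urllib.parse
    import requests

    def appears_indexed(page_url):
        query = urllib.parse.quote(f"site:{page_url}")
        resp = requests.get(
            f"https://www.google.com/search?q={query}",
            headers={"User-Agent": "Mozilla/5.0"},
            timeout=10,
        )
        # If Google returns no results for site:<url>, the page has been
        # dropped from the index -- the bad-neighborhood signal described above.
        return "did not match any documents" not in resp.text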

LINKS FROM LOW TRUST OR BANNED DOMAINS

These are also bad neighborhood links, but different from the above. I am looking here specifically at the TRUST Google have placed in a domain. We can see that by looking at how well they have indexed the site: typically Google will index every page they find, but when they instead decide that most of a domain's pages are not worthy of indexing, we can see they have lost trust in that domain. We typically flag domains when more than 90% of their pages are not being indexed by Google. This UNIQUE TOOL (no one else has this) remains one of our most powerful tools and is fundamental to any link audit; it also digs out domains which have been totally banned. I first launched it in 2009 following a penalty I analyzed where most of the domain's links were from low trust sites, and it became one of our most important and trusted tools very early on.
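
As a sketch of that trust heuristic: crawl the linking domain, test each discovered page against the index, and flag the domain when the unindexed share passes the 90% figure quoted above. Here crawl_site() and appears_indexed() are assumed helpers (the latter sketched earlier):

    def low_trust(domain, crawl_site, appears_indexed, threshold=0.90):
        pages = crawl_site(domain)   # all discoverable URLs on the domain
        if not pages:
            return True              # nothing crawlable: treat as banned
        unindexed = sum(1 for url in pages if not appears_indexed(url))
        return unindexed / len(pages) > threshold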

SITE WIDE LINKS

Following an analysis of a domain which had a penalty back in June 2010, I noticed that site wide links were no longer having a positive effect. Until that time these links had been much coveted by SEO people, because Google often gave sites a real boost from them. It was always a dangerous strategy though, as you could end up with hundreds of thousands of links with exact match anchor texts. That said, it was common for web design companies to have these types of links, as they often stuck a sneaky link at the bottom of every page on sites they made for clients. Those web design types of domains often still get away with these links if they're for the brand only, but everyone else wants to avoid sitewide links like the plague. One of the early problems back then was working out how to find such links, as there were no easy data sources, and hence I built our Sitewide tool. This tool has been through many revisions over the years as certain techniques stopped working and data centers closed.
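
The core detection idea can be sketched simply: group your backlink export by linking domain and flag any domain that links to you from a large number of its pages, which is the signature of a footer or sidebar link. The threshold here is illustrative, not the tool's:

    from collections import defaultdict
    from urllib.parse import urlparse

    def sitewide_domains(linking_page_urls, min_pages=50):
        pages_per_domain = defaultdict(set)
        for url in linking_page_urls:
            pages_per_domain[urlparse(url).netloc].add(url)
        # A domain linking from dozens or hundreds of distinct pages is
        # almost certainly carrying a template (site-wide) link.
        return [d for d, pages in pages_per_domain.items()
                if len(pages) >= min_pages]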

LINKS ON THE SAME IP

Rather than classify everything on the same C class as toxic, I specifically filter out links from the same IP only. I am looking for clear signals which we know Google use, and I want to avoid white noise which is speculative.
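
A minimal sketch of that filter: resolve each linking domain today and keep only exact IP matches with the target, which also sidesteps the stale-IP problem noted earlier. Function names are illustrative:

    import socket

    def links_on_same_ip(target_domain, linking_domains):
        target_ip = socket.gethostbyname(target_domain)
        flagged = []
        for domain in linking_domains:
            try:
                if socket.gethostbyname(domain) == target_ip:
                    flagged.append(domain)
            except socket.gaierror:
                pass  # domain no longer resolves; skip it
        return flagged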

INTERNAL CHECKS

When we carry out backlink audits there are two on-site checks we also run. One looks at the internal URLs on a target site to make sure there are no issues with the wrong types of redirects being used, which often results in no Juice being passed around the site; this is called our Unclaimed Juice Finder. The other check makes sure all the URLs on a target site are being indexed correctly; occasionally this flags up weird issues where Google have decided to remove pages they consider were created to capture traffic only.
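
A sketch of the redirect half of this, assuming the concern is 302 (temporary) hops, which historically did not pass Juice the way a 301 does; the function name is illustrative:

    import requests

    def bad_redirects(internal_urls):
        flagged = []
        for url in internal_urls:
            resp = requests.get(url, allow_redirects=True, timeout=10)
            # resp.history holds each intermediate redirect response
            hops = [(r.status_code, r.url) for r in resp.history]
            if any(status == 302 for status, _ in hops):
                flagged.append((url, hops))
        return flagged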

LINK REMOVAL TOOLS

When we carry out link audits, the last thing we want to do is hand over data that our clients cannot easily act upon, and simply disavowing your bad links is NOT ENOUGH to get you out of a penalty, despite what some people would like to believe! Google have continuously stated that they want you to make every effort to remove any manipulative links before you use the Disavow Tool. We have also found that removing harmful links results in better success when dealing with both manual reconsideration requests and algorithmic penalties. But getting links removed is a hard and arduous challenge: you have to contact everyone who links to you from these bad resources, which demands that you find their contact information, and sometimes the only form of communication is through an online contact form, making the process extremely painful if it all has to be done manually.

LINK REMOVAL CMS

We have a LINK REMOVAL CMS which is the perfect place to manage your link removal projects. This tool automatically hunts out all email addresses associated with a domain, whether an address is hidden away somewhere on the site or tucked away in the whois information. Our LINK REMOVAL CMS also detects all the contact form URLs associated with these domains.
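
The on-site half of that contact discovery can be sketched as a regex sweep over a domain's key pages; the whois lookup and contact-form detection our CMS performs are omitted here, and the paths checked are assumptions:

    import re
    import requests

    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def harvest_emails(domain, paths=("/", "/contact", "/about")):
        found = set()
        for path in paths:
            try:
                html = requests.get(f"http://{domain}{path}", timeout=10).text
                found.update(EMAIL_RE.findall(html))
            except requests.RequestException:
                continue  # page unreachable; try the next path
        return found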

LINK REMOVAL CONTACT TOOL

There are some other services out there which claim they can email people automatically on your behalf requesting that links be removed; however, they ONLY look at the whois information for email addresses and miss all the email addresses actually on the sites. Obviously this is suboptimal and guaranteed to leave most of the sites uncontacted. To do this difficult task properly you need to contact EVERY email address associated with a domain, and MOST IMPORTANTLY it is imperative that the contact forms are posted to as well.

Automatically posting messages through online contact forms is one of the toughest technical challenges I have ever undertaken to solve. Building a tool which can not only automatically email these domains requesting that links be removed, but also POST to contact forms, where you are often faced with Captcha challenges and have to make sense of arbitrary coding conventions, odd mistakes and idiosyncratic forms with all manner of weird quirks, was a HUGE challenge. It took me almost 3 months of uphill battle to build this tool, and it is totally unique in this space. It works amazingly well, coupled with a very powerfully written email which has an extremely good success rate at getting links removed. The best part of this all being automated now is that we can send 3-4 follow-up messages with no hassle at all.

I also have another powerful tool which is generally used after a few weeks of trying to contact the domain holders themselves. This tool emails the hosting companies demanding DMCA takedowns of content hosted by them, which provides an effective and immediate response and, together with all the other techniques listed above, helps to ensure EVERY effort has been made when locating and removing toxic links.

Any links which are still not removed after a few weeks of trying are then disavowed. In our link removal CMS there is a button which will allow you to download a complete list of links that need to be disavowed, including comments explaining the number of times you have contacted the companies, or noting where no contact information was found.
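
Google's disavow file format uses '#' comment lines plus either full URLs or 'domain:example.com' entries, so the export step looks roughly like the sketch below; the record structure is an assumption for illustration, not our CMS's internals:

    def write_disavow(records, path="disavow.txt"):
        """records: list of dicts like
        {"domain": "spammy.example", "attempts": 3, "contact_found": True}"""
        with open(path, "w") as f:
            for r in records:
                if r["contact_found"]:
                    f.write(f"# contacted {r['attempts']} times, no removal\n")
                else:
                    f.write("# no contact information found\n")
                f.write(f"domain:{r['domain']}\n")

The resulting disavow.txt can be uploaded directly via Google's Disavow Tool.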

SUMMARY

On one hand we have come a long way in the Link Audit industry, which has developed gradually over the years, but we have to be mindful and avoid the mistakes of the past, where ignorance and mistaken belief systems resulted in poor service and industry apathy. These were the same ailments which backfired on the SEO industry and forced those companies to sidestep their way into a reincarnation after the legacy they had created imploded behind them.

We are living in a PENALTY STRICKEN world, and companies are in the midst of a minefield where the wrong moves have resulted in many losing their businesses and livelihoods. There are NOW clear messages being sent from GOOGLE, and that message is loud and clear: you will suffer great loss if you mismanage your backlinks. You need to constantly monitor your backlink profile and have an expert understanding of what is out there linking back to you. You cannot allow SEO companies to work clandestinely on your behalf; transparency and risk awareness are the most important qualities you should look for in your advisors.

With regard to link audits, you need to obtain clear, actionable data and have access to the best tools out there for the job. These tools are bespoke and have their origins in historical penalties which were analyzed and which inspired the tools I later developed. Don't be misled by false prophets who have jumped on this bandwagon but who lack the knowledge and clarity that is fundamentally required when carrying out such an important and technically challenging task!

UPDATE

As I was writing this article, Julie Joyce at Search Engine Land was writing an article (published 25 June 2013) about the poor quality of links flagged by other link auditing tools from two companies in this space, Link Detox and Link Risk:

http://searchengineland.com/the-problem-with-identifying-problem-links-163602

I personally did not want to call out any competitors in this field and point out that their results were erroneous, as obviously that would be personal and biased. Instead I reacted to their published philosophy of detecting poor links (as the SEJournal article was actually written by one of the Link Detox crew).

However, now that the cat is out of the bag and someone has published a detrimental article about their LINK AUDITING QUALITY, a grey cloud has been cast over the industry in general, and I have an obligation to protect the reputation of the sector I have been developing in for the last five years.

I have seen this happen in many industries I have been involved in over the years: groundbreaking concepts are developed by one company, and others quickly jump on the bandwagon and copy or try to emulate the features of the industry leader, but in doing so produce totally inferior products and services which are ultimately cheaper and better marketed. Then, when the general populace try these inferior products or services, often for the first time, they are disheartened or disillusioned about the whole industry and revolted by the entire concept or business model.

This is precisely what has happened in the SEO industry, where many companies are now revolted by the puke their former SEO companies created for them and are stuck in the midst of these nightmare scenarios, being forced to clean up the mess those cowboys created.

The whole SEO industry has pretty much imploded as a result of bad logic, gung-ho high-risk behavior, false assumptions, poor companies and a general apathy towards providing quality services in this sector. These are the same issues now being exposed in the link auditing industry, and it needs to be made clear that NOT ALL COMPANIES ARE THE SAME! I may not have been as vocal as our counterparts in the past, but that is because I have been too busy developing groundbreaking technology in this space, not happily selling services that were knowingly based on fuzzy logic, fraught with misinformation and highly risky to the health of those using them.

This article was written by Steve Carol, lead developer at The Link Auditors. He has over 10 years' experience coding and has been developing solutions in the SEARCH space since 2008.
