Google Distorting Reality

The first result on Google for “Dr Andrew Kaufman” is a debunking hit piece by skeptic Jonathan Jarry, who claims that Kaufman is a misguided quack who doesn’t know his left from his right. Even more interesting than that is the question of who exactly Mr. Jonathan Jarry is and, more importantly, who is paying his wages. The character-assassination piece was published by the “McGill Office for Science and Society”. On the surface, McGill claims noble goals, but on closer inspection it has some very close ties to Andrew Kaufman’s stated intellectual adversary, the population-control freak Bill Gates.

It should also be noted that the McGill Office for Science and Society endorses some very bogus claims on its website; for example, one article claims that polyunsaturated fats are not as bad as some people claim. (WTF?)

McGill features another article criticizing Sweden’s anti-lockdown approach, claiming:

And now, months after this grand, reckless experiment on the Swedish population, deaths had gone through the roof, proving that the Scandinavian country best known for ABBA and IKEA had failed to assemble a rational response to the virus. 

Yes, that same Sweden that now has fewer cases than any other country in Europe according to official Google statistics, while Portugal, which has the same population as Sweden and had one of the lowest death rates in Europe, is now imposing lockdowns while its death rate is simultaneously on the rise, proving that the lockdowns and forced masks are making the situation much worse, in a classic case of the so-called cure killing the patient.


McGill also features another publicity piece claiming that the COVID-19 PCR test is very accurate (the same test whose scientific basis Dr Kaufman so eloquently refutes). They claim:

The current standard for COVID-19 testing is nasopharyngeal swab and polymerase chain reaction (PCR) testing to detect genetic traces of the virus in the nasal cavity. The test is not perfect, and may come back falsely negative during the first few days of the infection, but it remains the benchmark by which active cases of COVID-19 are diagnosed and, when done properly, seems to be very effective.

Meanwhile, the judge at the center of the now world-famous illegal-lockdown case that ended up in a Portuguese courtroom threw out the government’s appeal and stated the tests were “more than debatable”. In fact, the ruling stated:

Indeed, they cite a study that suggests only 3% of positive PCR tests declared by health authorities may be ‘true positives’.

Like Karl Popper with falsification, Kaufman (in the video below) destroys the foundational evidence put forward as proof that Covid-19 exists. It is now up to real scientists (not hitmen) to prove Kaufman wrong, instead of engaging in petty claims that Kaufman should not be trusted because he is just a naturopath. That claim is clearly libelous: according to Kaufman’s incredible BIO & CREDENTIALS, he is a lot more than just a naturopath, and McGill is currently open to being sued for libel!



In fact, Jonathan Jarry is a paid contributor and intellectual assassin working for McGill University with the sole financial backing of a shadowy organization called the Trottier Foundation, which claims to promote sound scientific information and has a strong leaning towards climate awareness, carbon taxes, and all things good and scientific.

The Trottier Foundation (a private Canadian charitable foundation established under the Canada Not-for-profit Corporations Act) was created off the back of money earned from Matrox, a company founded by Trottier that makes graphics cards and sold them to IBM and Microsoft, and which obviously has direct connections with one very well-known family of eugenicists: Bill Gates.

Our mission is to support organizations that work towards the advancement of scientific inquiry, the promotion of education, fostering better health, protecting the environment and mitigating climate change.

Along with holding an honorary Doctorate of Science from McGill University, Lorne Trottier was also made a Member of the Order of Canada and later promoted to Officer by the Governor General of Canada, Her Excellency the Right Honourable Julie Payette, who is a 33rd-degree member of the Illuminati and the ACCEPTED SCOTTISH RITE OF FREEMASONRY OF CANADA!

The facts are clear that Google, in association with other eugenicists, is hand-editing ‘their’ search results and YouTube content to remove and character-assassinate valid scientific dissenting debate, while attempting to illegitimately make the case that the science is clear: that lockdowns are good, masks are good, the Covid-19 PCR test is good, vaccines are great, and the people who question any of that fake science are bad!

Nothing is further from the truth. All that Google and their counterparts are succeeding in is waking up more people to how grossly underhanded and interconnected these masonic NWO Great Reset Technocrats have become!

Want more scientific connections, Google? It is noteworthy that McGill is closely affiliated with

As evidenced here:



Google Takes Position In Controversial Vaccine Safety Debate

Google has recently (June 2019) used its power to manipulate public opinion in favor of the pro-vaccine argument! In the first-ever attempt to influence public opinion using the powers at its disposal, Google “search” has decided to actively manipulate available information by removing the controversial online discourse raging over vaccine safety from the results it provides. Websites that promote or provide a platform for critical perspectives on vaccine safety, questioning the lack of safety studies or vaccine efficacy, found that Google unilaterally removed the search traffic they once enjoyed, without engaging in any debate, providing any notice, or giving any explanation!

‘Don’t be evil’, Google’s motto during their growth phase, different from ‘Do no evil’, was “chosen because it leaves room for honest disagreements, but still encourages Google to strive to make the world better“. But are Google still striving to make the world better by removing ‘honest disagreements’ or has there been a change of policy? Are mottos like ‘don’t be evil’ or ‘make the world better’ compatible with unilaterally and secretly deciding what specific information is made available to the public by removing honest disagreements and controversial sources of information (claimed to be misinformation), while quietly choosing which narrative will prevail?

Incentives to manipulate:

Has Google ever profiteered in the past from removing controversial information from the results they provide? To my knowledge, the answer is NO! Google has never before removed controversial content, or had an agenda to create profit directly or indirectly, by removing honest arguments from their results. But it has been common practice for Google to profit while removing commercial companies because they allegedly violated Google’s ‘terms of service’ or allegedly failed to meet their ‘quality guidelines‘.

In fact, Google has a long history of doing so, and has traveled down a path starting with profiting from removing ‘webspam’, to profiting by removing companies whose page load speed or technology was not up to the standards Google imposed. It has been a slippery moral slope whereby ethical standards have slowly eroded, to the point that Google now sees no hindrance in actively, aggressively, and secretly enhancing profits in one industry by using its power to censor access and dispel negative information about certain products.

Public Relations & Reputation Management

Typically, reputation management has been the realm of public relations companies, who would engage in this arena to manipulate public opinion or preserve the reputation of individuals, companies, or products, in the interests of preserving profits, a company’s reputation, or someone’s dignity. But when you remove negative reviews, personal opinions, product complaints, honest scientific discourse, and intellectual debate because they interfere with the profit gained from selling products, and because those products have a reputation for killing or injuring consumers, that could be an act of EVIL, if it finally turns out there is truth in the arguments that the current vaccine schedule is severely misreporting the cases of brain damage and death caused by adverse reactions.

Google’s own reputation management department could frame this as an impartial update whose bias was merely collateral damage, a phenomenon seen in previous Google ‘updates’ that were intended to remove low-quality results but incorrectly impacted high-quality sites. Most fittingly, though, this will be framed as Google taking a stance against evil vaccine skeptics who put lives at risk by exposing people to preventable illnesses through misinformation about vaccine safety.

When we take into consideration the wider picture and this apparent statement by Google:

“Google, which owns YouTube, promised to deprioritize antivaccine and cancer quackery in its search results and to ban people and companies posting such content from monetizing it through its ad-serving system. Indeed, at the time, a number of prominent purveyors of medical misinformation started complaining loud and long about this, which is not surprising given that running YouTube ads on videos hosted on the platform can generate a considerable amount of income when the videos are viewed thousands—or even millions—of times.”

While I cannot find a direct quote from the Google ‘search’ department, this is what YouTube/Google had to say:

“To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways”

This is in line with a stance recently adopted by other social media tech giants, who have also announced that they too are deprioritizing misinformation about vaccines:

Facebook, Amazon, YouTube and Pinterest are all taking various actions to hide, downgrade or otherwise control the spread of anti-vax content.

The companies are responding to the uptick in public attention that stems from certain lawmakers and the news outlets covering them.

“We are on the verge of a public health crisis,” New York Assemblywoman Patricia Fahy said, following her sponsorship of a bill to allow minors the chance to consent to vaccinations, regardless of their parents’ opinions.

In addition, California representative Adam Schiff sent letters to CEOs of Google, Facebook, and Amazon, urging they “consider additional steps [they] can take to address this growing problem.”

The pro-vaccine vs anti-vaccine debate has reached an all-time high, with Google and the big tech companies now taking an active stance on the subject while siding with the pro-vaccine side of the argument. This is significant because it is the first time (as far as I am aware) that Google has ever picked a side in a controversial public debate by editing its results in favor of one side of the argument. Sites that provide a platform for the anti-vaccine perspective were obviously penalized (removed) in the June 2019 update.

For those not informed about this particular controversy, the debate toils over aluminum, which is added to vaccines to extend their potency. Aluminum is a powerful neurotoxin that has been widely studied and is proven to be dangerous to human health; the controversy is that the micro amounts used in vaccines (believed to be harmless) are causing brain damage in a small subset of susceptible individuals whose blood-brain barrier is not adequately developed, which can result in autism and death, or other conditions such as Alzheimer’s disease.

Within this controversy, one side claims that anti-vaxxers are dangerous to the wider population and are responsible for a resurgence of measles epidemics and other preventable illnesses because herd immunity is not reaching high enough proportions (>90%), while mandating and insisting that governments make it a legal requirement for populations to conform to the full vaccine schedule, which has risen in recent years to nearly 100 individual vaccine doses before age 20.

Both sides have strong arguments that divide communities and families. Ironically, though, no official studies have been conducted to date to prove scientifically that the amount of aluminum collectively used in the recommended vaccine schedule is harmless, and without such science, the pro-vaccine side resorts to downplaying the known risks, airbrushing the evidence to paint vaccines as safe, and continuously pointing to evidence that is inconclusive, inadequate, or already shown by the anti-vax side of the debate to be flawed.

Whichever side of the debate you are on, depending on your level of knowledge of the subject and how likely you are to follow orders from men in white coats, you will be torn between the accepted known risks of vaccine adverse reactions and the risks of preventable illnesses. But are the risks being downplayed, as the anti-vax side of the debate claims?

What is not yet known is which side will win this debate, as it is yet to be scientifically settled. Until studies are conducted (and repeated) proving there is no statistically significant increase in brain damage (autism) following the approved vaccine schedule, this debate will rage on, dividing communities, families, and nations. Ironically, it seems to be generally accepted that all vaccines increase autism risks; we just don’t know the numbers.

Moreover, this debate/divide is getting very serious, with propaganda being used on both sides, social media activists becoming increasingly connected, louder, and more effective, and social media and big tech companies being drawn into the debate and starting to take sides. Lives are at risk, profits are at risk, and both political and economic power are being used to apply leverage in favor of the pro-vaccine argument.

With that background, Google has now decided to enter this debate and help settle the argument once and for all. Not by funding scientific studies to prove the amount of aluminum used is safe, or by promoting studies proving vaccines do not increase the risks of autism, death, and Alzheimer’s disease; instead, Google has decided to remove (hide) from its search results websites that provide a platform for anti-vaxxers, and sites that provide information in the public interest containing evidence about vaccine risks.

Google Core Updates Explained

For the last 15 years, Google has been in a constant war against what it terms web spam: websites that manipulate Google search results using ‘tricks’ that are against Google’s terms of service and that distort the natural search results in favor of commercial entities profiting from search traffic they ‘stole’. Skilled manipulators have been able to decode the algorithm Google has developed and have taken over search positions they wouldn’t normally be given.

Until now, Google Updates have always been a black box. Google is under no requirement to reveal their inner workings, and fearful that such information would assist manipulators, Google keeps more secrecy around its search algorithm than Coca-Cola keeps around its secret recipe. They do not give away any information and actively try to confuse the SEO community as to what is going on behind the scenes. With that said, it is not impossible to prove the arguments and claims made in this article.

We have exclusive software, unique in its ability to decode Google’s ‘Updates’, and we are able to categorically assert that this update is different from all other Google updates ever performed. All previous Google updates have targeted so-called ‘manipulation’ or quality issues, with a traceable cause and effect.

The prior logic would be: Google search results are manipulated for X reason, and Google can fix this by applying penalties to websites that engage in X. Normally X would be something like ‘keyword stuffing’, ‘paid links’, ‘over-optimized keywords’, ‘various link schemes’, duplicate content, ‘page load speed’, or ‘mobile responsiveness’. This would involve isolating sites with particular features that were not providing quality content, or a quality experience, for users. The difference with this update is that Google has now targeted sites that don’t have the ‘right argument’, or content that Google classifies as misinformation.
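That rule-per-signal logic can be sketched in a few lines. This is a hypothetical illustration only: the rule names echo the examples above, but every threshold and field here is an invented assumption, not Google's actual formula.

```python
# Hypothetical sketch of the old "cause and effect" penalty logic:
# each rule isolates one manipulation signal (X) and flags sites that trip it.
# All thresholds and fields are illustrative assumptions.

def keyword_density(text, keyword):
    """Fraction of words in the text that are the target keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def flag_site(site):
    """Return the list of rules a site trips, old-update style."""
    flags = []
    if keyword_density(site["body"], site["target_keyword"]) > 0.05:
        flags.append("keyword stuffing")
    if site["paid_link_ratio"] > 0.3:
        flags.append("paid links")
    if site["load_time_ms"] > 3000:
        flags.append("page load speed")
    return flags

site = {
    "body": "cheap widgets " * 10 + "buy our cheap widgets today",
    "target_keyword": "widgets",
    "paid_link_ratio": 0.4,
    "load_time_ms": 1200,
}
print(flag_site(site))  # trips "keyword stuffing" and "paid links"
```

Each rule is independent and traceable: a flagged site knows exactly which signal X it tripped, which is the "cause and effect" property the update described in this section abandons.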

Evidence for an official site with a skeptical perspective about vaccines:

Evidence for GreenMedInfo, a site with skeptical vaccine perspective:

Evidence for a pro-vaccine portal, intellectually and ideologically connected to the above, sharing the same IP!


Should this be a public debate? Should the public even be aware that certain information is being actively censored by Google, or is it right that Google operates a black box, deciding and editing a narrative of the world to suit its own biases? Is that fair? Morally right, or wrong? Or even evil? Which agendas are being satisfied? Are profits being gained from such manipulation? Should Google quietly and secretly censor information that derives from honest but controversial disagreements? Can we agree that secret information censorship by one of the largest tech companies in the world, the arbitrators of information dissemination, is a good use of their power, and that Google is making the world a better place by doing so?

Can you trust Google?


This is a controversial debate. No one can argue that vaccines save millions of lives each year and that, on balance, they are an amazing scientific discovery, but there are known risks in using them, which are controversially debated, with arguments raging over how risky they are. The anti-vax group believes the risks are much higher than claimed, and that autism, HIV, and other harm is being caused by the vaccine schedule. Because it has not been proven scientifically that there are no extra risks, the only way to settle this debate is to do the science, not to compel large tech companies to do the bidding of the vaccine companies. This cannot bode well for Google’s reputation, which is already on shaky ground, especially if future scientific studies end up proving the risks are higher than currently accepted.

Interconnected Networks & Bipartite Graphs – Penguin4

As Google continues to improve its ability to detect manipulation of the link graph, and henceforth penalise yet more subsets of the Internet, this recurring theme over the years has led those reliant on search traffic to take an increased interest in understanding more specifically the nuances that may affect their rankings and, subsequently, their businesses.

We noticed this latest (Penguin4) update is better able to understand and target websites that have unnatural linking patterns, which can be detected by looking for domains linking to a target that also commonly interlink to the same other resources; that is, where some domains link out to many targets, and other domains replicate or have similar outbound links. By looking at the correlations between different domains through their outbound links, and applying further filters to that data, it is possible to discover link networks such as directory networks, PBNs (private blog networks) and other link schemes where manipulation of the link graph has ‘more likely’ been an intentional aspect of the linking practices.
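The overlap detection described above can be sketched as a toy script: represent each linking domain by its set of outbound targets, then flag pairs whose sets are nearly identical. All domain names and the similarity threshold are illustrative assumptions; the real signals Google uses are unknown.

```python
# Crude co-citation signal: domains whose outbound link sets overlap
# heavily are more likely part of the same (possibly manipulative) scheme.
# Domains and the 0.8 threshold are made-up illustrations.

from itertools import combinations

outbound = {
    "blog-a.example": {"target.example", "shop1.example", "shop2.example"},
    "blog-b.example": {"target.example", "shop1.example", "shop2.example"},
    "news.example":   {"target.example", "paper.example"},
}

def jaccard(a, b):
    """Jaccard similarity between two sets of outbound targets."""
    return len(a & b) / len(a | b)

def suspicious_pairs(graph, threshold=0.8):
    """Return pairs of domains whose outbound sets overlap above threshold."""
    pairs = []
    for d1, d2 in combinations(sorted(graph), 2):
        if jaccard(graph[d1], graph[d2]) >= threshold:
            pairs.append((d1, d2))
    return pairs

print(suspicious_pairs(outbound))
# blog-a and blog-b share identical outbound sets, so only that pair is flagged
```

In practice further filters (link age, anchor text, hosting signals) would be layered on top, as the paragraph above suggests, to separate genuinely manipulative networks from natural co-citation.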

Interconnected Network

Following Google’s Penguin 4.0 update of 23/09/2016, I hypothesised that one specific improvement Google made to the algorithm was the ability to use bipartite graphs (dense networks), through co-citation combined with other manipulation signals, to detect isolated networks that were more likely created to manipulate the search results.

Naturally, Google has a patent on this concept dating back to 2007, but what we are observing with Penguin4 suggests that Google’s implementation of the concept has recently been improved and is now better suited to distinguishing naturally occurring dense networks from isolated manipulative networks.
This aspect of manipulation detection via dense network comparison is not widely understood within the SEO community, and discussion of the concept is thin on the ground, save for two very interesting resources that have covered these concepts in more detail. One is Bill Slawski of SEOBYTHESEA, who has talked about this area ever since Google published their patent, which Bill covered in this article; the other is an excellent article by Dan Petrovic which goes into even more detail about link graphs and how co-citation provides insights into relevancy, and which he expanded upon in his research paper titled Relationships in Large-Scale Graph Computing. I am sure there are more relevant articles from the SEO community that I have missed, and I will continue to post any relevant information I come across retrospectively in the footer of this article.

Outside of Google, there were no commercially available tools (we have now developed one) that examined the link graph of a domain and allowed you to visualise or extract these dense interconnected networks. So, over the last three months, I developed a tool for The Link Auditors known internally as the Interconnected Network Tool, which is now offered as one of several link auditing tools that make up our link audit package. Developing this tool has been an incredible journey; thinking through the concepts and methods needed to extract this data and make sense of it, in the same way as previous academics and researchers within organisations such as Google, has been extremely stimulating and has taught me many important lessons and concepts relevant to link graphs and SEO manipulation. As I develop the overlaying filtering and sorting algorithms further, these insights continue to evolve.

Our Interconnected Network Tool


STEP 6 Interconnected Networks
NEW TOOL: Penguin4, launched 23/09/2016, targeted domains that have interconnected networks, detected via co-citation and bipartite graph analysis. This new tool traces these dense networks and applies further filtering to separate natural authorities from manipulative link networks.

Visualising the link graph of your domain is not just beautiful but also incredibly useful, both for competitive analysis and for understanding why Google promotes and penalises domains based on link graph anomalies. Comparing different domains by visually examining their link graphs can provide huge insights that are otherwise hidden. Uncovering relationships between entities is almost impossible using any other method, and large data sets allow the detection of subtle anomalies that can only be noticed at scale; no doubt removing manipulation based on these signals can have a huge impact on search results.

Clearly this is where Google has been focusing for a long time (especially noticeably in the last Penguin4 update), and given the lack of insight about these concepts within the general SEO community, it is no wonder a chasm has opened between the SEO community and Google. Most people fail to grasp what is going on, and many others have given up entirely on trying to manipulate Google with SEO, while Google has seemingly become resistant to many forms of manipulation. That’s not to say Google is winning outright; there is still a large amount of manipulation evident for all to see. But the bar has been raised consistently, and most people who formerly engaged in this art are without the tools or the insights to understand the science of this endeavour. For every one that we now see successfully managing to manipulate Google, we can see another 50 that have been thwarted. Those getting good results now are either lucky or have a deeper insight into these concepts and can join the art and science together to make manipulation work.

Our Subsequent Findings

We are actively reverse engineering the results and establishing a pattern consistent with these assertions by comparing domains that we know have manipulative links pointing to them. In this case study we are looking at two domains (RED and BLUE) that differ in how their links are intertwined: the BLUE domain has hardly any interconnected links, whereas the RED domain has many of its links shared with other interconnected domains. Both domains have purely manipulative links, and yet with the launch of the Penguin 4 update the RED domain was immediately negatively impacted, whereas the BLUE domain steadily rose to the first position for a highly profitable search term!

BLUE domain interconnected links:

This image clearly shows only a handful of interconnected links despite a relatively large link profile, while the RED domain (below) has many more interconnected links and a significant section of an isolated network.

Isolated Network

This image (below) shows the isolated section of the RED domain, with only the isolated manipulative links extracted. These links would be considered to be linking in a way that is statistically unlikely, given the overall signals obtained by comparing how they differ from other, more natural hubs and nodes.
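A minimal sketch of how such an isolated section could be extracted from a link graph: remove the target node and compute connected components, so a tightly interlinked clique that only touches the rest of the web through the target falls out as its own component. The domain names are invented and this illustrates only the general idea, not our actual tool.

```python
# Extract "isolated network" candidates: after removing the target node,
# any component made only of domains that interlink with each other is a
# candidate manipulative clique. All names below are illustrative.

from collections import defaultdict, deque

edges = [  # (source, destination) links
    ("pbn1", "pbn2"), ("pbn2", "pbn3"), ("pbn3", "pbn1"),  # interlinked trio
    ("pbn1", "target"), ("pbn2", "target"), ("pbn3", "target"),
    ("bigsite", "target"), ("bigsite", "unrelated"),
]

def components_without(edges, removed):
    """Connected components of the undirected graph with one node removed."""
    graph = defaultdict(set)
    for a, b in edges:
        if removed in (a, b):
            continue
        graph[a].add(b)
        graph[b].add(a)
    seen, comps = set(), []
    for node in graph:
        if node in seen:
            continue
        comp, queue = set(), deque([node])
        while queue:  # breadth-first walk of one component
            n = queue.popleft()
            if n in seen:
                continue
            seen.add(n)
            comp.add(n)
            queue.extend(graph[n] - seen)
        comps.append(comp)
    return comps

print(components_without(edges, "target"))
# the pbn trio falls out as its own component, separate from bigsite
```

A real implementation would then score each component by size, interlink density, and how many edges leave the component, but the component extraction itself is the core step.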

I am slightly surprised that it has taken us in the community so long to emulate what Google is doing in these areas. Bill Slawski has been almost alone in deciphering what they are doing from their patents, and maybe only a handful of us have been trying to decode Google by building tools to unravel their inner workings. These challenges are fairly difficult given the hurdles that need to be overcome; having the history, knowledge, insight and resources to develop such tools probably falls to only a handful of people or organisations, but I would have thought some of the bigger outfits would have been all over these concepts and built tools specifically targeting such areas. Clearly, though, having unlimited resources to employ academics and smart programmers pays off over time, and that is what is making the difference between winning and losing this battle.

Google claims that people who try to manipulate their search positions hurt everyone who uses the search engine, and that much of the manipulation is pure spam or relates to porn or scams. This premise is used to justify the continuous efforts to weed out artificial linking practices that aim to manipulate the search results. Yet anyone who has a website has a legitimate interest in being found for relevant search terms, and that drive is ultimately what has led to the commercial success of Google’s AdWords program (i.e. paid placements) and the success of both Google and Alphabet Inc. Maybe if Google were only penalising porn sites, scam sites and spammy sites, there would be some understanding for Google’s sanctimonious, righteous crusade; but while the majority of sites being penalised by these updates belong to normal small business owners simply trying to improve their traffic, it is very difficult to maintain the facade that these penalties are not in some way driven by increased profits for the giant pay-to-play information highway.


Further links to relevant resources and discussion from the SEO community:

Search Quality: The Link Graph Theory by Dan Petrovic
Relationships in Large-Scale Graph Computing Article by Dan Petrovic
Google Patent on Web Spam, Doorway Pages, and Manipulative Articles 11/27/2007 by Bill Slawski

Further links to relevant resources and discussion from the academic community:

Authoritative Sources in a Hyperlinked Environment Jon M. Kleinberg

What happens to your Google Rankings when you disallow all in robots.txt?


Something Not Good


Though it would be worse if it were also applied to the dot-com version – wait, it is…


But what sort of organic traffic loss would we be expecting here?

Wait, that’s thousands of lost business leads or sales PER DAY.

There are over 200 people with an interest in SEO working at STUBHUB (stubborn hub?), so one of them must be interested, I guess:


Would someone please let them know…
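For anyone who wants to reproduce the basic check, Python's standard library can parse a disallow-all robots.txt and report whether Googlebot may fetch a page. The robots.txt content below mirrors the blanket block discussed, not StubHub's actual file, and the URLs are placeholders.

```python
# Check what a disallow-all robots.txt does to crawler access.
# The rules and URLs here are illustrative, not StubHub's real file.

from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /" for all user agents, nothing on the site is fetchable.
print(parser.can_fetch("Googlebot", "https://www.example.com/"))         # False
print(parser.can_fetch("Googlebot", "https://www.example.com/tickets"))  # False
```

Once Googlebot is blocked like this, pages can be dropped from or demoted in the index, which is exactly the kind of organic traffic collapse shown above.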

Penguin 4 Will Not Take Effect For Months Yet!

Google's New Penguin Pussy cat

Reading between the lines and looking at the data reveals some further interesting insights (beyond our Penguin 4 review) about the nature of the current implementation of Google Penguin 4 and the removal of Google Penguin 3. Let me jump in and make some statements:

1) Penguin 4 is no longer a penalty, instead it's as soft as this pussy cat.
2) Google's manual spam team has been largely designed out with Penguin 4!
3) Google are implying that disavow files are now largely redundant (do they finally get it?)
4) Google's new granular formula takes a long time to fully compute, and it will take months to come into full effect...
5) Penguin 3 was largely removed around 2nd September to make way for Penguin 4, introduced 23rd Sept

Understanding the changes

In the old format of detecting link spam (even before Penguin), most spam was detected with algorithms looking for suspicious activity: multiple links from one IP to another, gaining too many links too quickly, too many (spammy) anchor texts on a domain, spam reports by competitors, etc. Once Google's suspicions had been aroused, the spam team might take a physical look, then kill the linking-out domain and leave notes against any penalised target domain; all of this would be totally invisible to the public, apart from the ranking changes they could see. They might also add a penalty score to the guilty parties that would automatically expire after some variable time.

Google's New Pussy Cat

The new method of detecting link spam improves on these systems and introduces a few massive changes: comparing every new link they find against the entire link graph of that domain, and also against all the other sites that are linked to, looking for unnatural patterns. Iterating through arrays of links in such a way is slow and time-consuming; it's a massive task to do at any scale, no matter how much computing power you have! BUT this is the most compelling and intelligent method of finding unnatural links, because domains that spam have one thing in common: they don't do it in isolation, they do it as a common practice, and these practices can be detected by comparing everything against everything else (read: slowly).
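The "compare every new link against the whole profile" idea can be illustrated with a toy suspicion score: how closely does a new linking domain's outbound set match the outbound sets of domains already linking to the target? Every domain name and the scoring rule here are invented for illustration.

```python
# Toy suspicion score for a newly discovered link: the more the new linker's
# outbound set resembles an existing linker's outbound set, the more it looks
# like part of the same scheme. All domains are illustrative placeholders.

existing_linkers = {
    "dir1.example":  {"target.example", "casino.example", "pills.example"},
    "dir2.example":  {"target.example", "casino.example", "pills.example"},
    "forum.example": {"target.example", "wiki.example"},
}

def overlap(a, b):
    """Jaccard similarity of two outbound link sets."""
    return len(a & b) / len(a | b)

def suspicion(new_outbound, linkers):
    """Maximum overlap between the new linker and any existing linker."""
    return max(overlap(new_outbound, out) for out in linkers.values())

new_link = {"target.example", "casino.example", "pills.example"}
score = suspicion(new_link, existing_linkers)
print(round(score, 2))  # 1.0: identical to the existing directory pattern
```

This also makes the cost argument in the paragraph concrete: scoring one new link means touching every existing linker's outbound set, so the work grows with the size of the whole link graph, not with the single link being evaluated.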

Google's new method includes a function to classify networks owned by single entities. If they discover that one entity owns more than one domain, that group of domains will be grouped and classified as a network. These groups will be the key to understanding the value of a link, and the system will be able to distinguish natural links within networks from unnatural links, according to the commonalities between the domains linked within the networks (read: granular) and the accumulated level of suspicion between the two entities (groups).
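Grouping domains into single-entity networks can be sketched with a union-find structure: whenever two domains share an ownership signal, merge them into one group. The signal used here (a shared registrant email) is an illustrative stand-in; we do not know which signals Google actually uses.

```python
# Group domains into single-entity networks with union-find.
# The registrant email is a hypothetical ownership signal for illustration.

parent = {}

def find(x):
    """Root of x's group, with path halving for efficiency."""
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    """Merge the groups containing a and b."""
    parent[find(a)] = find(b)

ownership = [  # (domain, ownership signal) pairs, all invented
    ("siteA.example", "owner1@example.com"),
    ("siteB.example", "owner1@example.com"),
    ("siteC.example", "owner2@example.com"),
]

by_signal = {}
for domain, signal in ownership:
    if signal in by_signal:
        union(domain, by_signal[signal])  # same signal: same entity
    else:
        by_signal[signal] = domain

print(find("siteA.example") == find("siteB.example"))  # True: one entity
print(find("siteA.example") == find("siteC.example"))  # False
```

Once domains are merged into groups like this, links between two groups can be judged collectively, which is what makes a cumulative, entity-level penalty possible.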

That means you will be judged on your total transgressions across multiple domains; it's a cumulative penalty, not one isolated to each domain. The focus is now on all domains owned by one entity, and once suspicion has been aroused they will compare other related domains and look for patterns. But all of this focus is more specific to the target domains (domains higher in the ranks) than to the general link graph. Because Penguin 4 is about manipulation of search ranks, the whole focus is now on removing manipulation algorithmically; this approach brings more white noise, or collateral damage, so it has been necessary simply to remove the positive effects of suspect links rather than actually penalise domains with any certainty.

How accurate can this new method be? (Read: an aggressive tiger or a soft Penguin?)
Because this is a highly intelligent (processing-heavy) method of eliminating the effects of manipulation (spam), the effects of this new system are going to be much more granular and as a result much more effective (read laser-precision weapons instead of carpet-bombing towns). So much so that Google's confidence in this new method has allowed them to retire the old spam team and their old methods, no doubt with huge savings on manual labour, offset by the cost of much more processing power now being applied to the problem. This also means the effects on the SERPs will be significantly greater than anything seen in the past, so my estimates of the noticeable effects are:

Penguin 3: percentage of the SERPS affected 6%
Penguin 4: percentage of the SERPS affected 18%

However, the changes will not come into full effect in one huge update; instead it should take many months to fully recompute the current link graph and then reclassify the link graph as they know it!

Google Penguin 4 Review

Analysing Google’s Penguin 4 update: What do we see and think so far!

Reviewing Google Penguins Vs the Porkies

1) The update actually took place on the 2nd of September, NOT the Friday 23rd when it was announced!

That’s been a typical trait of Google over the years with other major changes, such as Hummingbird (which we noticed before anyone else) and the first wave of Penguin: announcing some weeks after the actual launch. This allows Google to roll back without informing the SEO community of what’s going on if things don’t seem right or it all goes wrong.

2) This update has been embedded into the core algorithm.

This has been a massive internal change to Google's engine. Why? First, it took Google over two years to implement the latest Penguin update, which is a hell of a long time given the gravity of the situation: many affected sites were left in limbo during this time. So why did it take so long to implement this as a core, ongoing, realtime penalty? The fact is that the entire engine we know as the core algorithm would have had to be substantially rewritten to accommodate this change. Look at the evidence: when Penguin was first announced it looked for any signs of manipulation on specific keywords, and while the negative effects were sometimes applied only to specific terms, this was most likely implemented by hacking other existing features in the core algorithm. That was not an ideal solution (limited), and moreover it would place sites into limbo by not giving them their due rewards upon receiving further positive signals (links); once trust was lost, domains could get away with a lot less, making subsequent gains almost impossible. Thus, affected (caught) domains would often be unworkable from an SEO point of view, rendering all SEO work almost worthless, and moreover calling into question any resources used to link to already-penalised domains, as well as the efforts of the SEO agencies tasked with helping to recover or improve such projects.

When Google talk about making the change to a realtime 'granular' penalty, they have conceded that in its prior implementations Penguin was far from ideal. While better than nothing, it was no doubt difficult to rerun, missed many instances of actual link manipulation, and relied on hacking or limiting other aspects of the core algorithm, to the point that an entire rewrite of the code was necessary. During this process it seems as if all efforts were focused on the rewrite, to the detriment of many other tasks, including the general work that would normally occupy the engineers and spam team in spotting and removing spam.

What effect is this Granular thing they talk about going to have?

Google seem to have paused manual spam detection since about the end of November 2015. This is also evidenced by the lack of link networks reportedly discovered over the last year by blogs such as Seroundtable. To me it’s as if the focus has been placed on higher-level projects (such as this rewrite). Hunting for link networks by hand using spam teams would be something to design out of the algorithm ASAP, especially if the benefits of such work would be short-lived or no longer used. Any rewrite of the core algorithm would change almost everything that had come before, and the reliance on manual IP blocks (blacklists) or hand-compiled data about domains would probably be superseded by new ideas figured out and implemented with machine-learning techniques, which could subsequently be applied granularly.

Rather than relying solely on spam reports (from your competitors) and manual reviews to detect link networks or paid links, one of the most interesting methods is to build analytical tools that work through the link graph looking for unnatural patterns of links (whereby domains link to uncommon resources in unnatural ways), such as domains A and B both linking to domains C and D. While such correlations can happen in the wild (without manipulation intended), they have been of interest for spam detection for some time, although the sheer size of the link graph and the way Google's systems have developed make this incredibly challenging. If Google were to analyse the whole link graph looking for such anomalies every time they found a new link it would be too slow, and anything that's too slow is unworkable over such large datasets. Though with recent changes, especially going back about a year and following the Hummingbird engine update, Google have been able to find these needles in haystacks more efficiently.
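The "A and B both link to C and D" pattern can be illustrated as a co-citation count: for every pair of link targets, count how many distinct domains link to both. Target pairs co-cited by many of the same sources stand out. This is a toy sketch under illustrative assumptions, not any real detection system:

```python
from itertools import combinations
from collections import Counter

def cocitation_counts(link_graph):
    """link_graph maps a source domain to the set of targets it links to.

    Returns a Counter over target pairs: how many distinct sources
    link to both members of the pair.
    """
    counts = Counter()
    for targets in link_graph.values():
        # sort so each unordered pair is counted under one canonical key
        for pair in combinations(sorted(targets), 2):
            counts[pair] += 1
    return counts

graph = {
    "a.com": {"c.com", "d.com"},
    "b.com": {"c.com", "d.com"},
    "e.com": {"c.com", "news.org"},
}
# ("c.com", "d.com") is co-cited by two sources, a hint worth a closer look.
print(cocitation_counts(graph).most_common(1))
```

In the wild, the interesting signal would be pairs whose co-citation count is far above what their individual popularity predicts, which is also why a naive whole-graph version of this is too slow at web scale.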

Getting back to what has happened in this update, Google have taken the above concept of using the link graph to find spam and applied it in another interesting way. They have recognised that entities who own more than one domain often engage in similar practices. As such, if a penalty situation occurs on one domain, or if the practices across a group of domains owned by one entity amount to manipulation, then by grouping all the domains owned by that entity they can detect a whole lot more potential Penguin-like activity and target specific keywords across multiple domains. So if you own a bunch of domains all trying to rank for the same or similar terms, then in this update you can expect targeted effects whereby all of those domains are hit on those specific keywords. It's not like a duplicate-content filter where one page wins; it seems to have a negative total effect.

3) No link graph refreshes since December 2015.

The fact that Google have not performed any link graph data refreshes for the last year is odd. Typically we would see 1 to 4 such refreshes every year. A refresh is when Google reanalyse pages whose links they had already factored into the SERPs. So, for example, if you have a page with a link on it and you later edit the anchor text or remove the link, Google would not notice this change until they did another link graph refresh. These rare refreshes sometimes resulted in huge upheaval in the SERPs as Google reflected the changes that had actually happened. Contrary to what most people in SEO think, Google do not have the resources to continually reanalyse pages (as they crawl) looking for minor edits once they have already factored a page in. So why no new refreshes? This could very well be connected to the current rolling Penguin update: maybe the teams were too busy building this implementation of the algorithm, or something about the last Penguin 3 update rendered a refresh impossible or not worth doing. That said, despite the new rolling Penguin 4 update going live, a data refresh still has not happened. Again this is odd, since previously big updates would be preceded by such link graph data refreshes. I suspect this is yet to happen and may well be affected by the rolling nature of Google's core algorithm. Certainly something is conflicting with Google's ability to run either of these previously manual events, and most likely the data refreshing has also been baked into this current implementation of Penguin.

4) The unannounced Panda?

At the start of June, Google implemented what appeared to be the start of this rolling-update engine change. That's when I first noticed something significant (Panda-like) had changed at Google; following a long hiatus whereby Google seemed to be asleep, this update signalled the start of this wave of changes. But it was a significant change unlike any singular Penguin or Panda update: it was new, connected to both, and had some unique traits (such as results disappearing off the radar without trace). At some point Google will have to do another data refresh, but given the recent changes it's highly likely that it will be a granular event from now on, whereby Google are planning to recrawl and refactor what they know and understand about all pages in isolation from others, as they go. So no doubt there will be no more noticeable major link graph refreshes; instead we will see rolling effects that find and refactor link data according to Panda and Penguin in granular portions.

5) The Effects Are Still To Come…

Given there has been no sign of the link graph data refreshing, the Penguin changes Google implemented on the 2nd of September would date back to link changes (edits/removals) made before December 2015, plus more current disavow files. That means if you did cleanup work prior to 2016, you should have seen the effects in the 2nd September 2016 update, whereas if your cleanup work was done during 2016, you will still be waiting for Google to reflect those changes (your link removal work), which they will do when they start refactoring pages that were already analysed and factored into the results but have changed or been removed during 2016.

Google Penalty Recovery Service

It’s often thought that recovering from a penalty consists of inspecting the backlinks, finding the toxic ones and then disavowing them. On the contrary, this simplistic approach has been shown time and again to delay the recovery process, so much so that companies sometimes waste years in a helpless state, thinking they have done everything they can according to Google and that somehow Google is victimising them unfairly. Whilst that viewpoint is often the most comfortable one for the professionals working in this space to espouse, we at the Link Auditors have a very different perspective.

Companies working to remove penalties have an obligation to undo the manipulation in order to show good faith to Google and satisfy their demand for toxic links to be removed. That means undertaking the difficult work of reaching out to the linking sites and asking them to remove any links placed with the intention of manipulating Google.

That often turns out to be a difficult conversation, sometimes involving payments and always a lot of work on both sides of the fence, for a bunch of people who are otherwise not interested in your company's plight.

Having such a conversation demands one difficult task first: making contact, and thus finding the contact details of all said parties. That may be an email address, a contact form on their website, or a whois page, which amounts to a massive task when you are faced with sometimes hundreds of webmasters linking to your site.

Toxic Link Removal

One of the most important tools we have developed here at the Link Auditors is our own in-house outreach tool that not only finds email addresses and sends polite link removal requests, but also locates contact forms and actually posts the same polite message into those forms, breaking any Captcha challenges placed to prevent the forms being posted to by spam robots. Evading such technological barriers demands some very smart technology, and this was one of the most difficult challenges we had to overcome when we set out to make such a tool. Not only was there the problem of breaking the Captcha challenges; you wouldn't believe it, but actually finding these images on the page was also extremely difficult from a technical point of view.

Of course humans can easily distinguish what is a Captcha challenge, but if you are a computer program looking at a web page, understanding where on the page it is located so it can then be broken is actually really hard. One might assume it would be easy, as the input field may be called 'captcha' or something similar, but disparate forms have all manner of naming conventions for the input fields that must be understood; some may be named just with numbers, and so the challenge becomes making sense of all the input fields in order to put the message into the message field and the subject into the subject field.

On top of this you sometimes have two such forms on one page (one may be a search box). Another problem is unexpected questions that might be asked: what is your message about, etc. Also perplexing is interpreting the HTML itself; some forms conform to no standards, with poor code and bugs, which makes them even harder to decipher, while others post into new pages with no standard use of relative or absolute paths, meaning that just understanding where the form is supposed to be posted to (/sendmessage.php vs ../ etc.) can take quite some interpretation, with many potential pathways having to be considered and much work to make sense of whatever you come across.
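The field-naming problem described above can be illustrated with a toy classifier that guesses which form input is the email, subject, or message box from its name or id. Real forms are far messier, and the keyword lists here are illustrative assumptions, not the actual tool:

```python
# Map semantic roles to substrings commonly seen in input names.
FIELD_HINTS = {
    "email":   ("email", "mail", "e-mail"),
    "subject": ("subject", "title", "topic"),
    "message": ("message", "comment", "body", "enquiry", "inquiry"),
}

def classify_field(name):
    """Guess the semantic role of a raw input name like 'contact_email_2'."""
    lowered = name.lower()
    for role, hints in FIELD_HINTS.items():
        if any(h in lowered for h in hints):
            return role
    return "unknown"  # e.g. fields named only with numbers

print(classify_field("contact_email_2"))  # email
print(classify_field("msg_body"))         # message
print(classify_field("field_7"))          # unknown
```

The "unknown" case is exactly where heuristics run out and the harder work begins: inferring a field's role from its position, label text, or surrounding markup rather than its name.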

Lastly there is the issue of cookies and sessions, which have to be managed effectively to emulate the path a human user would take. It is no wonder we are the only company in the world in this sector who can boast such a feature, and it is by this measure that any company in this space should be judged: if they cannot remove the toxic links, they are wasting someone else's time and money!

Let's take a look at how this works.

Contact Us to get your toxic links found and removed TODAY!

Long Anchor Text, Avoiding Keyword Stuffing

Since Penguin, people understand that over-optimisation of anchor texts is bad; they know they should rotate their keywords and avoid using the same anchors over and over. What people often don't know about is the potential problem caused by keyword stuffing within long anchor texts. Example: "shipping household contents to Sweden with cargo containers" may be a long-tail keyword anchor text (and seem unsuspicious), but it also contains six potential key phrases and many combinations therein. This will typically not be tolerated by Google's algorithms, which understand the general patterns of natural anchor text and have little tolerance for the unnatural.

Whilst it is perfectly acceptable to use long-tail keywords as anchor texts, one must be careful to ensure they are not overstuffed with individual keywords and money terms; instead, use more stop words (words with little meaning, such as 'here', 'more info', 'this link', 'it', 'was', 'then', 'when'). As a general rule of thumb, if you have more than six words in your link anchor text, make sure no more than two of them are money terms!
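The rule of thumb above can be sketched as a simple check: in an anchor text of more than six words, flag it if more than two words are money terms. The stop-word list here is a small illustrative assumption; a real audit would use a much larger vocabulary:

```python
STOP_WORDS = {"here", "more", "info", "this", "link", "it", "was",
              "then", "when", "to", "with", "the", "a", "and", "about"}

def is_overstuffed(anchor, max_money_terms=2, long_anchor=6):
    """Flag long anchor texts carrying too many money terms.

    Anything not in STOP_WORDS is treated as a potential money term,
    which is a crude assumption kept for illustration.
    """
    words = anchor.lower().split()
    if len(words) <= long_anchor:
        return False  # short anchors are judged by other rules
    money_terms = [w for w in words if w not in STOP_WORDS]
    return len(money_terms) > max_money_terms

print(is_overstuffed("shipping household contents to Sweden with cargo containers"))  # True
print(is_overstuffed("more info about shipping to Sweden here"))  # False
```

The first anchor carries six money terms in eight words and trips the check; the second dilutes its two money terms with stop words and passes.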

For more insight into your website's links and anchor-text makeup, we invite you to do a full link audit, where we can check many other aspects of the health and status of your links and help you improve the natural appearance of your site in Google's search engine.

Avoiding the Google Panda Penalty Case Study

Recently The Link Auditors were employed by the popular documentary film website Doc-film-net, which was under a site-wide penalty for all the search terms it used to rank for, such as 'download documentaries' and 'free download documentaries'. These terms are highly trafficked, and the penalty resulted in a significant loss of traffic for the site.

After a full link audit and investigation of the site, we discovered many pages being indexed incorrectly by Google. These were typically comment pages that added no real value to the site and were being produced 'incorrectly' by the bespoke content management system. This is a typical problem of poorly designed CMSs, or of any CMS whose developers have not envisaged the problem of surplus pages being spat out unnecessarily by the system. AN EXTREMELY EASY MISTAKE TO MAKE!

As a result the site had over 20K pages indexed by Google when in reality it should have been fewer than 300. With careful use of the noindex meta tag and some very minor on-page changes, we watched the number of pages indexed by Google reduce to around the correct value, and with this the positions in Google's search results returned naturally, without any reconsideration request being necessary; nor were there any notifications in Google Webmaster Tools.
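The fix itself is small: surplus pages (comment pages and the like) get a robots noindex meta tag so Google drops them from the index. A minimal sketch of checking for that directive in a page's HTML, with illustrative page contents:

```python
NOINDEX_TAG = '<meta name="robots" content="noindex">'

def should_be_indexed(html):
    """True if the page does NOT carry a robots noindex directive."""
    lowered = html.lower()
    return 'name="robots"' not in lowered or "noindex" not in lowered

# A surplus comment page carries the tag; a real article page does not.
comment_page = f"<html><head>{NOINDEX_TAG}</head><body>...</body></html>"
article_page = "<html><head><title>Doc</title></head><body>...</body></html>"
print(should_be_indexed(comment_page))  # False
print(should_be_indexed(article_page))  # True
```

Running a crawl of your own site through a check like this is a quick way to confirm the CMS is emitting the tag on every surplus page before waiting for Google to recrawl.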

As the number of indexed pages reduced, the positions in Google returned, and along with them the traffic to the site. We can now monitor the traffic with our new Google position report service, and you can see the position report:

So Be Warned!

Google Panda is often de-indexing sites and lowering their positions simply because your CMS is spitting out too many junk pages, which Google perceive as poor-quality content and penalize you for as a result!

How to do Due Diligence when buying a domain
(or new online company!)

Doing due diligence on a domain name before you buy it is obviously highly recommended; otherwise you face years of trouble trying to remove toxic links and a multitude of Google penalties, both algorithmic and manual. Online companies have one massive weak link: they all rely on search engines for traffic, and at the whim of Google, in one stroke, they can be set up for years of pain and anguish. There is nothing more dreaded than a notice in Google Webmaster Tools informing you of a penalty.

Penalties are sometimes even dished out by mistake. Companies who have never engaged in 'SEO' and never tried to manipulate Google have contacted us in the past with clear penalties, hit because of others' mistakes: hosted content that seemed manipulative to Google but was simply a hosting quirk (lots of different domain names (news sites) with similar content), or payday loan sites linking into official sites (debt advice companies) to try to pass themselves off as legitimate.

There’s a long list of companies hit by collateral damage as Google have rolled out various ‘updates’ over the years, updates that have pushed companies over the edge and into the cesspit of unknown or known penalties and ranking difficulties.

So that leaves potential buyers in a somewhat precarious situation when considering buying a business someone else has built from the ground up. Chances are that along the way many mistakes have been made across all sorts of aspects of the technology while the developers grappled with the challenges that come with online businesses, not least from a technical point of view with particular functions of the service, but also with prior SEO marketing.

SEO marketing has evolved massively over recent years, and what was once considered by 'most' of the 'so-called' professionals in this sector to be 'white-hat' or acceptable practice has often later become the target of Google's wrath and been specifically targeted as spam. In practice this means that the majority of domains over 5 years old that have engaged with the murky world of SEO, or (more likely and more dangerously) have engaged so-called professionals on their behalf (that means most of the smaller companies over 5 years old who rely on Google), have most likely made such mistakes and have at some point been the target of a Google penalty, or will likely be targeted at some point in the future…

That acknowledged, the most important due diligence a prospective buyer can perform on an online business is a detailed and professional link audit on the domain, to understand exactly how it has been linked to over the years and the extent of the SEO work that will most likely need to be cleaned up as soon as the domain has been bought.

Despite what some similar articles might say, glancing at the backlink report from link graph companies such as Majestic SEO, Moz, or Ahrefs will yield little real insight into these types of issues. While all of these providers, including Google Webmaster Tools, show backlink data, much of it will be nofollow or no longer in place, along with thousands of site-wide links and other erroneous lines of data which will need to be painstakingly and manually crawled through to get any idea of what's really going on… The potential for mistakes is high if you don't know what you're looking for.

The best aged domain is one that is 10 years old with hardly any links. Of course that's very unlikely, but essentially that would be the best aged domain: something that has never engaged with SEO and whose only links are completely natural, whatever their perceived quality (it's actually normal that low-quality directories link out, along with many other low-quality resources which scrape the internet looking for companies to list alongside data such as name, address, and phone number). Such links are generally not a problem for your potential new business. What matters is the way in which they link (the types of anchor texts used) and the quantity of toxic links found, which is what may end up impacting your new domain.
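A first cleanup pass over a raw backlink export can be sketched as follows, under illustrative assumptions about the export format (each row has a source URL, a rel attribute, and a live/dead flag; real exports from the tools mentioned above differ):

```python
from urllib.parse import urlparse

def prefilter_backlinks(rows):
    """rows: list of dicts with 'source_url', 'rel', 'live' keys.

    Drops dead and nofollow links, then collapses site-wide links to
    one row per linking domain, leaving a shortlist for manual review.
    """
    seen_domains = set()
    kept = []
    for row in rows:
        if not row["live"] or "nofollow" in row["rel"]:
            continue  # dead or nofollow rows only add noise to the audit
        domain = urlparse(row["source_url"]).netloc
        if domain in seen_domains:
            continue  # collapse site-wide links to one per domain
        seen_domains.add(domain)
        kept.append(row)
    return kept

rows = [
    {"source_url": "http://blog.com/p1", "rel": "", "live": True},
    {"source_url": "http://blog.com/p2", "rel": "", "live": True},
    {"source_url": "http://dir.com/x", "rel": "nofollow", "live": True},
    {"source_url": "http://gone.com/y", "rel": "", "live": False},
]
print(len(prefilter_backlinks(rows)))  # 1
```

Even this crude pass typically shrinks thousands of raw rows to a reviewable shortlist, which is the point: the manual judgment about anchor texts and toxicity only starts after the noise is gone.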

Of course there are other potential issues to be wary of, such as negative SEO. This is the practice some rogue competitors have engaged in to harm their adversaries: initiating toxic links which have the appearance of being gained for SEO purposes, and which, when found by Google's sharp teeth, have the effect of sending the targeted companies into the doldrums, never to be found again.

To avoid making the wrong decision when buying an online business, a professional full Link Audit should be the obvious first step if you don't want to risk losing your investment, coupled with a full analysis of the current traffic coming from Google and a look at the current position reports for the domain (the keywords you would want or expect traffic from, and the current positions in Google for those terms). Finally you would want to do an acid test. This is a process we often use to check that a domain is responding correctly to new links: we link into various pages within the site for various keyword terms to understand how those pages respond to healthy links, and to better understand how the site is actually responding to a range of keyword terms, both competitive and non-competitive.

While such tests are not straightforward checks you can carry out within a few hours (they take some time to perform and for the results to be seen), if you're investing $100k(s) into a domain, taking the time (a few weeks) to do such tests and reports would be time and money well invested (these tests / reports can be done for less than $350).

Twitter For Business

Hacking Twitter for profit and ‘viral’ growth

Here at The Link Auditors we play with lots of data, and this gives us insights into business opportunities that others might not see. One of the latest toys we have been playing with is hacking Twitter to bypass their payment gateway while still getting through to new customers.

Imagine if you could have access to all of Twitter's user data, to then understand and drill down through all the businesses listed on Twitter, filter out the ones relevant to your business, then send them all a direct email introducing your relevant service! Useful, huh?

Finding new clients online is the key to new business; doing it cheaply is the secret. Sure, we can all pay Twitter $1 for every tweet that gets some engagement, or pay Google $1 every time someone clicks on your ad, or the same to Facebook, all of which dictate that you can only use about 150 characters, and none of which give you the data outside their platform so you can try different messages. Ultimately they become the gatekeepers to new customers, and businesses are left reliant on paying them and adhering to their rules within their systems.

As a result, if we want any autonomy we are left to our own devices against these big guys' systems, and one such method, SEO, has been a valid way to cheat the 'Google' system of AdWords. Though with Google's recent wars on websites 'cheating them' and their link penalties, companies are looking for alternative methods of getting new customers. It's not wrong, is it? Well, only if you do so and don't pay the old gate-masters, according to them of course…

Anyway, moving on to harvesting Twitter data for profit and bypassing their payment gateway. The great thing about this big-data stuff is that it gives us access to data compiled in ways never seen before. Imagine you want a list of all the Spanish restaurants in the world. How much would that cost? The traditional directory companies such as Thomson Local et al. are normally unable to give you such specific data sets; they might have a list of restaurants, but you won't get a Spanish subset from them. This is where Twitter comes into its own. Using big data and the information companies have shared about themselves, we can scan for and group the specific companies we're looking for like never before.

Once we have their Twitter profiles, where they share such information, we can scrape their websites to ensure they are a good match for our criteria, then get their contact details, such as email addresses and contact forms, ready to send them all messages, automatically even… cool, huh? Well, not if you're Twitter and trying to charge businesses to use your platform for the pleasure of sending 140 characters while 'reaching new customers' with a tweet. Anyway, we're not, so I guess it's still cool.
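The filtering step described above can be sketched as follows, assuming already-harvested profile records in a hypothetical layout (a bio and a website per record, not Twitter's real API shape): keep only businesses matching a niche keyword and collect their websites for the later contact-scraping stage.

```python
def filter_profiles(profiles, keyword):
    """profiles: list of dicts with 'bio' and 'website' keys.

    Returns the websites of profiles whose bio mentions the keyword,
    skipping profiles with no website to scrape for contact details.
    """
    keyword = keyword.lower()
    return [p["website"] for p in profiles
            if keyword in p["bio"].lower() and p["website"]]

profiles = [
    {"bio": "Authentic Spanish restaurant in Leeds", "website": "tapas.example"},
    {"bio": "Italian pizzeria", "website": "pizza.example"},
    {"bio": "Spanish tapas bar", "website": ""},
]
print(filter_profiles(profiles, "spanish"))  # ['tapas.example']
```

The output of this pass feeds directly into the outreach pipeline described earlier: each website gets crawled for an email address or contact form.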

Looking at the future of this big-data stuff and the technology needed to harvest, filter, scrape, and then reach out to extremely relevant people in your target sector, bypassing all the middle men, I can see the attraction for many companies. Right now we're beta testing the service and welcome inquiries about testing and using this new method of reaching out to your specific target markets. SEE TWITTER FOR BUSINESS for more information.

Start Up Advice – Recent Resources from different perspectives

I believe that being involved in startups and starting new companies, delving into the unknown, can be one of the most challenging and rewarding ventures we can embark upon in our lives; startups can be more rewarding than actually becoming rich!

Over the years, however, many great insights have been shared by entrepreneurs who have traveled these routes (we all like to help others by sharing our perspectives and knowledge), and these are constantly advancing.

Often enough, however, those lauded as successful are also plagued on the inside with doubts, depression, chronic stress, regrets and sadness, while at the same time others may believe those same people are living the dream.

One such case is Rand Fishkin, founder of Moz, who recently posted on his blog a harrowing account of his own depression: a most interesting read, caused no doubt by years of struggling with the sometimes impossible task of keeping the machine expanding, all the while promising staff that they will keep their jobs and investors that they will get a healthy return. Such a burden for an entrepreneur at the center of a vortex that sometimes takes on a life of its own and becomes unmanageable or perilously stressful.

Another recent and very interesting resource is the set of lectures being put out by Y Combinator (a fantastic incubator of entrepreneurs), who are currently releasing a series of lectures with Sam Altman (among others) in an attempt to spread some of the wisdom they pass on to the young startups that go through their programs as they tread tentatively along the unknown path towards riches and building a startup.

All the while, there is some fantastic advice being shared, and those serious about running startups should sign up to get the lectures as they come out!

But in contrast to Y Combinator's basic stance (that to be successful, startups need investment), this video, also from Stanford, throws some cold water over the 'investment' method of getting to the 'top' and examines a fallacy wrongly accepted as correct by many: namely, that in order to succeed you need to raise capital through angel investments or venture capital. Instead, David Heinemeier Hansson of 37signals argues that lifestyle businesses can be enjoyed more, and that entrepreneurs can become wealthy and successful without touting for investment, by bootstrapping, personally building their startups, and focusing on profits rather than eyeballs and investments.

And then there’s the recent book by Gabriel Weinberg and Justin Mares called Traction, in which they interview other illuminated entrepreneurs who talk about and examine how they got traction and growth. This concept has recently been referred to as Growth Hacking, in which new technology and methods of using big data and social connections literally allow developers to hack growth into their company's success. Many of the ideas that first work have something of a shelf life, as they sometimes exploit new technology, such as Hotmail or Facebook, resulting in very early growth that then gets closed down by the companies facilitating it, since they too wish to exploit and capitalize on these sometimes great but exploitative business models.

Hacking growth is something every business would like to do. Most of us can build a business, and anyone can advertise on Google, Facebook, or now even Twitter to try to gain new business growth, but who wants to pay these Goliaths, whose aim is to suck all the available profits out of your business? If you want to learn about the latest growth-hacking possibilities we are exploring, you might want to sign up to our email list, as we will be sharing some further insights in our next email post, entitled Hacking Twitter For Fun and Profit!

Subscribe for updates:

Hacking Growth is what SEO is all about, and what most of our customers have been involved with. That's the model of hacking Google's natural results and getting your business in front of the multitude of people searching Google for the products you sell, without paying Google enormous amounts for the privilege. Sure, Google go to great lengths to demonize the crowd they catch hacking Google for growth, and while many have become fearful of the Goliath's long arms and lack of leniency or mercy when dealing with growth hackers, gaining traction for your business remains the most important challenge for any startup or entrepreneur, who should never be ashamed of getting caught trying to do the best thing for their business in a rough world where the Goliaths are big enough to eat any competitor in their way.

Edit update

Check out this other book, also called Traction.

Moving forward after a Google penalty

One of the big questions people have is: what should you do following a penalty?

The first step is to recognise exactly how and why you went wrong in the first place. You will often hear people who thought they knew a bit about ‘SEO’ arguing that the quality of their own links was OK, justifying them as good links when they are clearly manipulative and, most importantly, very easily detected as such.

Google have watched every trick in the book and they know all of the easy, lazy tricks people have used. Believing it’s OK to interlink between your own sites, or to use ‘high quality’ directory links like Dmoz or Yahoo (both of which can be considered spam in today’s world), are just some of the mistakes people made which they have a hard time understanding.

In truth these are easy mistakes for Google to detect and not the sort of authentic links that Google reward.

To achieve an authentic link profile in today’s environment is harder than it’s ever been, but at the same time ‘going straight’ (giving up on SEO) is like throwing out the baby with the bath water and is not always practical.

The key to winning in today’s market involves recognising a number of truths:

  • 1) you should always be playing with multiple domains.
  • 2) you should be gaining links that appear to be completely authentic.

    What does an authentic link look like?

  • 1) links should be embedded within articles
  • 2) there should be other outbound links going to relevant and 100% authentic resources
  • 3) big money anchor texts shouldn’t be used; instead use brand and white noise as anchors
  • 4) the blog where the article is placed should be relevant to your
    target site

  • 5) other articles on the blog should be authentic and link out in an
    authentic way to sites which are generally healthy

  • 6) articles should be well written and visually stimulating, including
    images and good formatting
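To make the checklist concrete, here is a toy scoring function. The attribute names and the idea of a simple pass count are my own illustrative assumptions, not anything Google publishes:

```python
# Sketch: score a candidate link placement against the six-point checklist
# above. Attribute names and the scoring scheme are illustrative assumptions.

def score_placement(placement: dict) -> int:
    """Return how many of the six 'authentic link' criteria a placement meets."""
    checks = [
        placement.get("embedded_in_article", False),
        placement.get("has_other_authentic_outbound_links", False),
        placement.get("anchor_type") in ("brand", "white_noise"),  # no money anchors
        placement.get("blog_relevant_to_target", False),
        placement.get("other_articles_look_authentic", False),
        placement.get("well_written_with_images", False),
    ]
    return sum(checks)

candidate = {
    "embedded_in_article": True,
    "has_other_authentic_outbound_links": True,
    "anchor_type": "brand",
    "blog_relevant_to_target": True,
    "other_articles_look_authentic": False,
    "well_written_with_images": True,
}
print(score_placement(candidate))  # 5 of the 6 criteria met
```

In practice you would want this as a checklist for a human reviewer rather than an automated gate, but it shows how few boxes most bought links actually tick.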

    How do you find such links?

    In an ideal world, we are told, writing great content will attract great links. In reality, unless you’re writing about very desirable subjects or giving something expensive away for free, you will struggle to gain any authentic links. That’s the harsh reality in which we live, which in turn means that without manipulation you will flounder in the ‘organic’ results, no doubt losing out to others who are successfully getting away with blatant spam, and thus you will be forced into paying for Adwords (just where Google want you).

    If you pay for Adwords you SHOULD also be paying for SEO and have a strategy in place (you have nothing else to lose).

    Of course SEO has earned itself a bad rep, as Google have gone to great efforts to punish everyone they catch up to mischief. That said, while the goal posts have moved, the goal will always be there.

    Decoding Google has been a long-standing challenge and goal for The Link Auditors, but our knowledge and understanding of Google grew out of our own SEO service, which today is one of the most authentic, elegant and effective SEO services in the world.

    If you want to discover a completely authentic and elegant article based link building service then get in touch with us and let us impress you with the most authentic links you have ever seen!

  • Should You Edit Your Over Optimised Links?

    Something that has come out recently on SERoundTable is that Google
    most likely look at minor edits to links as a signal when evaluating
    whether a link is suspicious.

    It’s often been observed that Google will lower a site’s rank when it
    has just gained a new link and they have suspicions over the site
    being linked to. In this case, if someone panics and edits the link,
    Google can confirm that it was suspicious and the site stays lower; if
    however the link stays the same, Google will often give the benefit of
    the doubt and the ranks return higher (don’t edit links if ranks
    drop).

    HOWEVER, in a twist, it is thought that Google will consider links
    that have been edited as lower trust if the edits are to anchor texts
    that were not previously suspicious. So an anchor text such as ‘click
    here’ which had a positive effect on the ranks and was later edited
    to ‘cheap ***spam*’ would be highly suspicious.

    But that’s not going to be the normal case, since where edits take
    place, people most likely edit things that are already suspicious
    (trying to fix mistakes):

    Example: Someone places a link into an article, then a few weeks later
    notices a mistake (over-optimization) and changes the anchor text to
    something less obvious. In such a case Google probably never trusted
    the link in the first place anyway, and so these minor edits only
    confirm their suspicions.

    However, there is another general consideration. From a technical
    point of view, there is a cost to Google to analyze pages; obviously
    they don’t want to reanalyze pages which have already been analyzed
    and the data factored into their results (they wouldn’t get anywhere
    if they did that). Therefore it is thought (by me) that pages have
    fingerprints, and when Google see the page again, if there is only a
    small change to the fingerprint, they won’t bother trying to work out
    what’s changed; any minor edits will therefore only be picked up a
    few months later when the next big update happens and they
    recalculate everything again.
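Nobody outside Google knows how such fingerprints actually work, but the idea can be sketched with hashed word shingles and a similarity threshold. Everything below is my own illustration, not Google’s mechanism:

```python
# Toy sketch of the 'page fingerprint' idea: hash the page's word shingles
# and only flag a page for re-analysis when enough shingles have changed.
# This is an illustration of the concept, not Google's actual system.

def fingerprint(text: str, k: int = 3) -> set:
    """Fingerprint a page as the set of hashed k-word shingles."""
    words = text.lower().split()
    return {hash(tuple(words[i:i + k])) for i in range(len(words) - k + 1)}

def needs_reanalysis(old_fp: set, new_fp: set, threshold: float = 0.9) -> bool:
    """Re-analyze only if Jaccard similarity drops below the threshold."""
    if not old_fp and not new_fp:
        return False
    similarity = len(old_fp & new_fp) / len(old_fp | new_fp)
    return similarity < threshold

page_v1 = " ".join(f"word{i}" for i in range(200))   # original article
page_v2 = page_v1.replace("word50", "edited", 1)     # one-word anchor tweak
page_v3 = " ".join(f"new{i}" for i in range(200))    # complete rewrite

fp1 = fingerprint(page_v1)
print(needs_reanalysis(fp1, fingerprint(page_v2)))   # minor edit: skipped
print(needs_reanalysis(fp1, fingerprint(page_v3)))   # rewrite: re-analyzed
```

Under this model a one-word anchor edit barely moves the fingerprint, which is consistent with minor edits only being picked up at the next big recalculation.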

    The result of this is that (1) minor edits take a long time to be
    noticed anyway, and (2) coupled with the likelihood that Google may
    consider them an indication of spam, it is probably even less
    attractive to make minor edits to anchor texts.

    My recommendation would be that instead of making small edits to
    links when mistakes such as over-optimization have occurred, it is
    better to write a completely new article, scrapping the old article
    to start again.

    However, that does not mean no edits should take place. You can and
    should still edit articles and links to improve them, especially
    before Google have seen them; just bear in mind that when editing
    links, Google can use this as an indication of spam, so try not to
    make mistakes in the first place!

    Recovering from making the wrong SEO Choice!

    There are many factors to consider when choosing an SEO company to help you navigate the terrain in today’s hostile environment, especially when Google is dishing out penalties every day, for one thing or another, to manipulators of their search engine.

    The history in this sector is catastrophic; as a result, many are looking to change their SEO company after being penalized by the work that their former SEO agency performed. Never in any other industry I’ve been involved with have I seen so much collateral damage caused by such piss-poor work. Looking at what these so-called ‘experts’ have done for their former clients, I’ve found a complete lack of understanding and sheer incompetence, all coupled with an unhealthy dollop of arrogance.

    At The Link Auditors, we are at the revival end of such projects gone sour. People often come to us only when they have literally ruined their domains; the penalties they are plagued with are often difficult to remove and sometimes impossible, which is not surprising really, given the sheer number of toxic links they have.

    The typical defense some SEO ‘experts’ give is: ‘that was what used to work in Google’. This is not only untrue but thoroughly ignorant; there is no justification for their lack of understanding, they were simply incompetent. Google have for many years (since 2006+) been rewarding relevancy and competency, and little of what we see in these reports is that.

    But why do companies make the wrong decisions and employ these incompetents in the first place? It’s an interesting phenomenon. I’ve watched companies go down the wrong road despite there being pretty obvious signs that should have alerted them. So what were the ingredients that led them down the wrong road in the first place?

    From what I can see, it’s a combination of a lack of knowledge on their side (but of course that’s why they employ ‘experts’ in the first place), coupled with a few other elements that all add up to the fatal choices that one day result in these harsh Google penalties.

    An industry full of charlatans: as a result it’s pretty hard to distinguish companies from one another, since they often emulate each other. That said, you can often get a clue by looking at their work, and the first place to look would be their own rankings and their own website’s backlinks. Sure, not everyone can do a full analysis of a company’s backlinks, and such reports do cost money, but not a lot, and it might well be the best £200 you ever spend! The best place to start, though, would be a full report on your own website; in that process you will learn more about poor links and SEO than you will by listening to ten such experts!

    Lack of knowledge: not knowing or having the right data to make the right decision. This can only be solved by spending some money, or by having the time to really understand your own site’s backlinks and other issues.

    Not recognizing competency: not being able to tell the difference between companies when they sound very similar.

    Focusing on the wrong signals of competency: when a company puts together fancy-looking sales pitches with a lot more waffle than substance, one should be on guard. This is not about promises; it is about finding the right partner who will stick with you for years, rather than comparing companies on cost or short-term goals instead of core competencies.

    Not calculating the risk/reward relationship correctly: people become a lot more skeptical after receiving penalties, and once they fully grasp the gravity of the risks, this often paralyses them from taking any further steps. However, it’s healthy to take some risks; the key is to be fully aware of them!

    Being married to one main domain: limiting your online presence to one main domain (your favorite) is a recipe for disaster. It takes time and energy to learn this process and mistakes always happen; most often the big mistakes are made on the main domain, plaguing it with issues for years into the future. Then, instead of pivoting to a new domain or having a bunch in the pipeline, they get suffocated by the issues they are facing and see no way out. The most common reason for this is that they spent too much money on the first mistakes, but the hardest part with any site is learning what you need to learn. That said, this is the internet we’re talking about, and the second and third sites are often not much more difficult than copy and paste (don’t do that with the content, mind).


    All in all, these are a few of the ingredients which lead to mistakes and ultimately penalties. Any company who relies on the natural search results for traffic has to make critically good choices. If you want to talk to someone decent in the SEO world, get in touch with The Link Auditors; our tools speak for themselves and came out of the SEO work we do. No one else in this sector can boast the same knowledge, tools, or skills!

    Understanding the softer Panda

    Every year Google set off alarm bells, sirens and roadside bombs, disrupting the search landscape. This disruption is a concerted effort to confuse, disorientate and devastate those who they believe have in some way benefited ‘unfairly’ in their search engine (and not paid them for the privilege), regardless of whether those affected are the best matches for the search queries or offer the best services in their sector. Nor do they seemingly care much about collateral damage, or about domains affected by negative SEO, which is Much Easier Now!

    This righteousness is enforced for the most part algorithmically and refactored in a major way about once every year. At the time of each major update, they normally throw in a few other minor changes, so as to be sure no one affected really knows what exactly happened, why, or when! It’s been an effective model for challenging and taming the psychology of the people tasked by companies to effect some positive change in this competitive arena.

    With jobs, companies and livelihoods on the line, the last thing anyone wants is to have fingers pointed at them, accusing them of falling afoul of Google’s terms of service. As a result, most have adopted a defensive approach, where first and foremost they think of their own jobs and play it safe; this results in mundane efforts achieving mediocre results. Better for the company to die slowly than for anyone to be able to accuse them of being underhand and manipulating Google.

    When companies get ‘hit’ they can’t sue Google, so they occasionally resort to suing the SEO agency. But when Google get it wrong and hit people unfairly, they simply retract some time later (16% Claim Recovery After Google’s Panda 4.0 Update), if you’re lucky, without any apologies, and for the most part the people WRONGLY affected simply thank them for the recovery (even if the penalty was unfair). Google won’t admit applying algorithmic penalties despite everyone wanting to know (70%+ Want Public Disclosure Of Google Penalties), because they would be inundated with complaints and reconsideration requests! It has come to this quite simply because most domains have some sort of penalty, because that’s just the way Google’s algorithm works!


    In this new era of SEO, we have seen the focus change away from inbound links (PENGUIN) back to the realm of on-page content (PANDA). But before you rewrite all your on-page content, know this important but largely ignored or unknown facet: it’s not about the content so much as it’s about the outbound links within that content!

    For many years I have been telling my clients they can gain a couple of positions simply by adding some decent, relevant outbound links to the content and pages they are competing with; these subtle and easy changes can gain you a couple of positions when you are already on the first page of Google! But this is almost counter-intuitive to most people, because they are still conditioned to think that Pagerank (the outbound flow of juice) would be lost as a result of adding links. Not only that, but they consider outbound links as helping others (if they paid for the juice, there is no point giving it away for free), and lastly they know they can get penalized for linking out, as it could appear that they are ‘selling links’ (Google forbid).
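If you want to check this on your own pages, a few lines of standard-library Python are enough to count the external outbound links in a page’s HTML. The sample HTML and domain below are made up for illustration:

```python
# Minimal sketch: count external outbound links on a page using only the
# standard library. Competing pages could be compared the same way.
from html.parser import HTMLParser
from urllib.parse import urlparse

class OutboundLinkCounter(HTMLParser):
    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.outbound = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        host = urlparse(href).netloc
        if host and host != self.own_domain:  # external links only
            self.outbound.append(href)

sample_html = """
<p>See <a href="https://example.org/guide">this guide</a> and our
<a href="/about">about page</a>, plus
<a href="https://another-site.net/resource">a relevant resource</a>.</p>
"""
counter = OutboundLinkCounter(own_domain="mysite.com")
counter.feed(sample_html)
print(len(counter.outbound))  # 2 external outbound links
```

Internal links (like the relative `/about` above) are ignored, since the point here is outbound links to other sites.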

    So what is Soft Panda?

    Is it about ‘thin content’, specifically content that’s not very original or where there’s not very much of it, or is it caused by a lack of outbound links to other relevant resources? Let’s ask Matt Cutts. Matt, what do you think of content without outbound links?

    Matt Cutts: “Google trusts sites less when they link to spammy sites or bad neighborhoods, but parts of our system encourage links to good sites”.

    Listen carefully to this small but telling nugget of information. However, it does not tell us what sort of places you should be linking to which can actually benefit search positions! Most people wrongly assume that when linking out, one should be linking to other authority sites: the easiest-to-find, most often linked-to pages on the net, prominent government sites, Wikipedia, etc. However, these pages add virtually no value in Google’s “add value to the debate” algorithm. Instead, those types of links could be seen as lazy linking, adding nothing new to the debate!


    Their algorithm is looking for interesting and relevant places which it hasn’t often seen linked to before: pages with the fewest number of links, certainly pages that have fewer than 100 or so other inbound links, and pages which have no other links are even better (provided they are not spam).

    And what is spam?

    Spam is indicated first and foremost by the anchor text: the higher the monetary value an anchor text has according to Adwords, the more likely it is to be classified as spam! When linking out, natural links should not use high-money terms in the anchors, and of course when linking to a page, make sure that page itself has outbound links that make its content valuable too!
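As a rough sketch of that test, you could score each anchor against estimated Adwords CPC. The CPC figures below are invented placeholders; in practice you would pull estimates from a keyword tool:

```python
# Sketch: flag anchors whose estimated CPC is high as 'money' anchors.
# The CPC values are made-up placeholders, not real Adwords data.
HYPOTHETICAL_CPC = {          # assumed example values in GBP
    "payday loans": 12.50,
    "cheap car insurance": 9.80,
    "click here": 0.0,        # white-noise anchor
    "example ltd": 0.0,       # brand anchor
}

def looks_like_money_anchor(anchor: str, cpc_threshold: float = 2.0) -> bool:
    """Treat an anchor as spam-prone if its estimated CPC is high."""
    return HYPOTHETICAL_CPC.get(anchor.lower(), 0.0) >= cpc_threshold

for anchor in ["Payday Loans", "click here", "Example Ltd"]:
    print(anchor, looks_like_money_anchor(anchor))
```

The threshold is arbitrary here; the point is simply that the commercial value of the phrase, not the phrase itself, is the signal.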


    The changing role of the ‘SEO engineer’ has become as complicated as the role of the best engineers working at Google, who look at, de-compile, analyze and build systems to protect against every form of manipulation they can scale against.


    If your SEO agency has not performed a FULL and extensive analysis of your backlinks then I would seriously question their ability to advise you on anything (as they simply do not know themselves).

    If your SEO agency has not performed any link removal work on your behalf and simply instead disavowed a few bad links then again they are most likely performing a mediocre ‘minimum viable service’.

    If your SEO agency refuses to engage in link building on your behalf, but has instead offered to build you ‘natural’ social media links with the aim of helping your ranking positions, then you need not wonder why they have not helped your ranks. Ensuring you have good natural-looking links today is as integral as at any time in Google’s history.


    This is a risky terrain, you’re playing against many other people who are also managing the same risks. But more importantly, you are playing against Google (the house) who always win regardless and who could not care less.

    There are companies being penalized who have never bought a single link or manipulated anything; instead they are caught up in Google’s war on spam. They suffer for many months while they negotiate with Google, sometimes spending thousands of pounds trying to understand the struggles they face.

    There are also companies who invest everything they have into their main ‘brand’ domain and then get hit with a penalty, but are unable to shake it loose nor shake loose their attachment to that domain, hence they go down with the ship.

    Risk has always been at the threshold of page one results, but today the job of the SEO engineer is to understand that risk and not be paralyzed by it. To know what the real risks are, where they are located and know how to manage those risks, to know what is acceptable risk, to know when to give up on a project, to know how to revive a project, and to know how to win in today’s terrain by manipulating Google in the most elegant and safest way possible!

    Softer Panda – Harder Penguin?

    It seems from all reports so far that those who recovered this last
    week (Softer Panda Update) were, for the most part, originally
    suffering from Penguin 2; following its introduction on May 22 last
    year, they first started to see drops, which continued until this
    update.

    So it’s confusing. What was the cause? Penguin or Panda? Or are they
    connected?

    Matt recently said about the Softer Panda:

    ‘We are looking at Panda and seeing if we can find some additional
    signals, and we think we’ve got some to help refine things for sites
    that are kind of in the border zone, the gray area a little bit. And
    so if we can soften the effect a little bit for those sites that we
    believe have got some additional signals of quality, that will help
    sites that were previously affected – to some degree.’

    It is hypothesized that the ‘additional signals’ are from Penguin
    (backlinks), and that if sites have been flagged as having low
    quality links, then the quality of the content would also be seen as
    suspicious. That leaves the only logical question: is Penguin
    affecting Panda, or is Panda affecting Penguin?

    The next question is: what will happen when Penguin 3 arrives?

    Something is brewing!

    There were lots of reports over the last few days of people talking about penalties finally being lifted; it seems, at least from first reports, that all the Disavow Lists people have been uploading are finally being noticed and considered.

    It’s early days, and reports so far seem to claim it’s Penguin related. HOWEVER, it’s about time for Google to make one of their major changes, as they normally shake up the SEO world at least once every year, with their ongoing efforts to confuse and create fear among the players involved, making sure no one becomes too complacent thinking they have beaten Google just yet.

    Given Google are not making any announcement (typical of when they do something NEW), it could be… something new. So what could it be? There was some talk from Matt recently suggesting that they may introduce a ‘SOFTER PANDA’.

    OR, it could be something new, but there seem to be more reports of recoveries after a long hiatus following penalties than of people reporting they have just been smashed, so it sounds more like Google have just connected the disavow cable to the results.

    Let me know what you’re seeing…


    Disavow tool: only 13% reported it helped


    A poll from SERoundTable suggests strongly that the disavow tool is
    useless for the majority of people suffering with penalties:

    Of course anyone on this email list will be only too aware of that
    since we have been saying for many years that the disavow tool is only
    successful when negative SEO is involved and generally instead bad
    links need to be actually removed.

    DID YOU KNOW? We have the best bad link removal tool in the world!

    Not only will our tools find the bad links for you, but they will
    also find the contact details, then post messages to the webmasters
    using the contact forms, demanding action. This tool has to be seen
    to be believed, and every good SEO agency worth their salt will use
    our Automatic Link Removal Tool if they have any sense!

    Getting a manual penalty removed is not the point!

    Getting your traffic back is the point!

    Many ‘professionals’ in this sector chalk up their successes based on the number of penalties they get removed; the criterion being that when Google write to the site owner telling them they have lifted the manual penalty, they call this a success and the end of the task!

    But getting a manual penalty removed means nothing in this game! Getting a domain’s traffic back is the only metric that affected companies should be concerned with!

    We know many agencies often use minimum effort and techniques to achieve penalty removal and ‘success’, but they don’t care about the aftermath of their work, nor do they care about the health of the site following the removal of the penalty! Instead, regaining traffic is considered a separate, unrelated task, for which they then duly charge as another service.

    The Problems They Create:

    Firstly, getting the penalty removed by disavowing almost everything is one of the biggest mistakes they make. This creates a situation where all the healthy links are also disavowed, damaging the site in the process; they use a machete instead of a well-tuned laser to find the bad links and disavow them specifically.

    Secondly, they don’t do the hard work of actually removing the links; instead they rely mainly on disavowing them. Removing only a small portion of the bad links will result in further problems down the line and will not help with algorithmic penalties (of which there could be any number in the pipeline ahead).

    Google Do NOT TREAT Disavowed links as removed links!

    Thirdly, algorithmic penalties will not be lifted as part of a manual penalty removal. If you don’t remove the problem links and deal with the problem properly, you will still be suffering from the automatic algorithmic penalties, which do not show up as a manual penalty but have the deadly effect of making sure you never rank for your most highly prized search terms, and you won’t even be notified about the problem!
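For reference, a targeted disavow file (the laser, not the machete) is just a plain text file of specific domains and URLs; the entries below are placeholders:

```text
# disavow.txt - targeted: only confirmed bad sources, never everything
# One entry per line; domain: entries cover a whole site
domain:spam-directory.example
domain:paid-links-blog.example
https://example.com/exact-bad-page.html
```

A machete-style file would instead list nearly every referring domain, taking the healthy links down with the toxic ones.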

    Regaining traffic is the only criterion for measuring success!

    When clients come to us, there are many instances where we will deal with penalties by not even doing a reconsideration request! Getting a manual penalty removed is not our end goal, we instead want traffic recovery but by using the safest options for that client!

    Important things we consider first:

    We first look at the other problems a manual inspection might create. Manual reviewers will consider other sites connected by IP, or that use shared Webmaster Tools accounts, or that share the same company details or address; these other sites will then be looked into, and penalties can often spread across other domains as a result of the manual inspections triggered by reconsideration requests.

    If a client has nothing to lose then we will do a reconsideration request, but if it would shine a light on a network of other sites, we will instead opt for getting the penalties lifted by simply removing all the negative links manually. This is a very quiet process that involves only us and the link resources, and when the Google bots see these links removed, the penalties will be lifted automatically anyway. (Ranking penalties often expire after a set time, if the links are removed, and are not lifetime penalties!) This may seem to take a bit longer than simply getting a notice from Google saying the penalty has been lifted, but it’s often the quickest way to regain traffic, with or without a successful penalty removal notice!

    Successfully recovering a domain’s traffic is the main purpose of conducting a Link Audit and going through the link removal process!

    “myblogguest” Just Hit By a Google Penalty

    It’s breaking out on Twitter now that MyBlogGuest, owned and run by Ann Smarty, has been hit with a penalty and is no longer ranking for its own name. This service has been growing over the years, and must have been very nervous following the recent announcements by Matt Cutts (and us) about the precarious nature of the Guest Blogging ‘Industry’.

    So far it’s unclear if Google have also penalized the sites within the network who were using the platform to connect with ‘publishers’ and SEO companies, otherwise known as ‘inbound marketers’, who coined the term in a quest to move away from basic link building to a more ‘advanced’, white-hat-friendly service.


    However, whoever honestly believed that ‘Guest Blogging’ would not turn sour had their head in the sand. Clearly it is manipulation according to Google, and Google’s war on spam, as they term it, has been ramped up over recent years, as well as being made public. According to Joshua Geary, “MyBlogGuest was selling links in the open on the lowest of the low quality blog sites. This hit by Google is just the beginning and is designed to break the spirit of guest blogger spam networks around the web.” Prior to this, Google was engaged in a silent war where they penalized companies in the dark, not telling them that they were penalized because of their link profile.

    Writing On The Wall

    As long ago as January, some were predicting the death of MyBlogGuest and pulling them up on their policy of insisting that ALL participants use FOLLOW links (else they would be blocked), despite their public announcements to the contrary!


    Right now, following these public declarations by Matt Cutts that guest blogging is no longer acceptable, if you (OR YOUR SEO COMPANY) have engaged in guest blogging, you will need to take a close look at your backlink profile and closely monitor the health of the links pointing to your company’s websites, now and going forward.

    Conducting a Back Link Audit

    We have a bunch of tools designed to find links that have recently gone sour according to Google. These tools have been developed over the last 6 years, each based on specific penalties that Google have developed and implemented over the years. If you’re suffering from a penalty now, or have been engaged in some of these practices that have now gone sour, then clearly ACTING FAST will help prevent and limit any damage to your business before it’s too late and Google block all your traffic too! Sign up and do your own link audit for just £200!

    More rubbish from Link Research Tools

    There’s a big backlash brewing against ‘Link Research Tools’ (LRT), and it is tarnishing the link auditing industry in general. In defense of the ‘Link Audit industry’, I’m going to give some background and then argue why NOT ALL LINK AUDIT SERVICES ARE THE SAME.

    Recently, LRT conducted an UNREQUESTED case study into a PAY DAY LOAN company that made the SEO headlines because Matt Cutts was involved. LRT spent 2 days compiling a ‘deep dive’ into the company and argued that with their tools they found more negative links, which had been missed by the woman doing the audit, who they called ‘some random SEO who had just given up doing link wheels’. Arseholes!

    An argument ensued in the comments, with many people questioning their data and findings, in which the director Chris Cemper had to get involved. But instead of admitting the mistake, he only made it worse, confirming their own ignorance when he defended the report and again claimed that nofollow links were correctly flagged as causing the site’s penalty.

    This was called out as nonsense by numerous people, who pointed out that Google have repeatedly been clear that nofollows are NOT harmful. Then LRT started deleting comments. Arseholes.

    Flagging safe links as suspicious, false positives, is what LRT are guilty of, and this causes so much noise when using their tools that it makes them valueless; not least because all the data has to be treated as suspect and checked again manually, thus defeating the object of using their tools. Worse yet, when people hear this they assume all automatic tools are useless!

    In most cases a domain’s backlink profile is simply too big to inspect manually; it’s not impossible, but it can take teams of people, who often get it all wrong anyway. We have all seen letters from people requesting to have links removed because they think they are harmful, when they are perfectly safe and natural. Well, this is what happens when people use the services of those who don’t understand what Google is trying to tackle and the measures they take, such as LRT’s stance on nofollow links and scraped links.

    Doing analytical research to find bad links by hand is very time consuming and costly, hence the natural tendency for companies like ours to automate the process to save our clients time, money and hassle. The bottom line is that this can be got wrong both manually and algorithmically, but some companies get it wrong so badly that they damage the perception and reputation of the whole industry.

    Removing links is hard, finding bad links is hard, dealing with negative SEO is hard. But there are good companies out there who are not rude or arrogant, who understand the real problems companies face, and who do not hold mistaken beliefs or offer valueless services, namely ourselves. I implore all companies to try our services at least once (tip: some of our tools are free), and not to judge all automatic link auditing tools with the same brush that LRT are painting themselves with!

    What’s a bad link?

    Google have been at war with paid links and unnatural links for the last 2 years; as a result, we all live in fear of bad links.

    • Could a link be part of a negative SEO attack?
    • Could they be harmful links considered as manipulative?
    • Could they be holding back search ranks?

    These are the legitimate questions that company bosses need to ask when dealing with backlinks to their sites. As such, EVERY company should monitor their backlinks regularly, to avoid false penalties costing their companies serious lost business for months on end while any penalty due to mistake or negative SEO is straightened out.

    What’s a good link?

    When we got an email from asking us to remove a NATURAL and HEALTHY link to them from our site, I was not only surprised but shocked, since this is a company in the business of ‘online management’ which claims it can “Improves How You Look Online” and “Clears Negatives. Enhances Positives”!

    What’s a false positive?

    WOW. So what’s happened here is that a company whose mission is to improve how companies look online (I would suppose), including their search positions (I would hope), have clearly misclassified a natural and positive link to their site as harmful and requested its removal (fearing it may harm their SEARCH ranks) when it’s actually helping them. They are not only wasting their own time but, more importantly, ours, along with their SEARCH ranks, and they are clearly employing people to monitor their backlinks, and no doubt their clients’ backlinks, who JUST DON’T GET IT!

    From: Legal Dept []
    Sent: 11 March 2014 20:38
    Subject: Link Removal for

    Dear Webmaster,

    I hope this email finds you well. As the webmaster of, I appreciate you linking to our site. Unfortunately Google does not feel the same. Therefore, I am working towards cleaning up the links in an effort to adhere to the Google Guidelines more closely. That includes links that have been created on your website

    Please remove all links, pointing to any page on; specific pages that we have found linking to are listed below.
    I would appreciate your immediate attention on this matter. Please update me as soon upon completion so I can update my records.
    Thank you and please respond if you have any additional changes. Webmaster/legal

    Percentages of Brand Links, white noise, keywords


    Be careful who you employ to monitor your backlink profile; they, or the tools they’re using, clearly don’t get it. Can I suggest instead you use OUR tools and services, which are designed to highlight REAL bad links and manage the task of cleaning up your BAD links only, elegantly and smoothly. In fact I will offer to do it all for NO MORE THAN £200 (that’s our standard price for everyone who wishes to use our tools).

    Updates and Negative SEO tool launched

    There have been a couple of interesting posts recently about Google and spam.

    Firstly, their announcement that spam action can follow you around onto new sites: if you try to get out of a penalty, the Nazis at Google will try to slap any other sites you launch.

    In related news Google suggest that they will also look at your other sites when applying penalties.

    The Google Nazis will look for any unique identifying numbers, such as company numbers or telephone numbers published on the sites, that they can use to tie domains together, along with any other signals such as AdSense codes, other ad platform codes, shared Webmaster Tools accounts and domains on shared IPs.

    When trying to sidestep a penalty on a fresh domain, it’s clearly important to remove all associating elements that can give the game away. It is also important to rewrite content and ensure that the content on the new domain is not a copy of content on a penalized site, which makes it difficult to take a penalized domain and switch to a new one without plenty of work.
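As a rough illustration of the kind of footprint check involved, here is a minimal Python sketch that scans two pages for shared identifiers such as AdSense publisher codes. The regex patterns and function names are our own illustrative assumptions, not any actual Google detection logic:

```python
import re

# Hypothetical footprint patterns: identifiers that can tie two sites
# together (AdSense publisher IDs, runs of phone-number-like digits).
PATTERNS = {
    "adsense": re.compile(r"ca-pub-\d{10,16}"),
    "phone": re.compile(r"\+?\d[\d\s\-()]{8,}\d"),
}

def footprints(html):
    """Return every identifier found in a page, keyed by pattern name."""
    return {name: set(rx.findall(html)) for name, rx in PATTERNS.items()}

def shared_footprints(html_a, html_b):
    """Identifiers appearing on BOTH pages -- the giveaways to remove."""
    a, b = footprints(html_a), footprints(html_b)
    return {name: a[name] & b[name] for name in PATTERNS if a[name] & b[name]}
```

Anything this kind of scan surfaces on both the penalized site and the fresh one is exactly the association element the paragraph above says must go.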

    None of this is news to us; we have been telling our clients this consistently for many years. The only thing that’s new is that it’s now confirmed by Google!

    If you want help to sidestep a penalty then get in touch. Don’t risk doing all the work of building a new website without discussing our aged domain finder service first; this is where we help clients bypass penalties using aged domains that can quickly recover lost traffic and bypass sandboxes.


    We have some amazing tools, and one of them is our new tool to monitor for negative SEO. Actually, this is an old tool for us, but we have not made it public until now.

    Our Negative SEO tool is the most powerful tool on the market to protect against NEGATIVE SEO. It will highlight any new links compared to when the tool was last run. It is also a fantastic tool for monitoring your backlinks, anchor text and the amount of juice pointing to a domain over time, keeping the most detailed and accurate historical record of links pointing to a domain, which can be correlated with changes in SEARCH POSITIONS. It’s also very useful for analyzing any problems that occur, so you know which links caused changes. This tool should be run every month, compiling detailed snapshots of your link profile. The cost to run this tool is £29.

    To use this ‘new’ tool log into your account here:

    Kind regards


    The disavow tool is only for negative SEO

    There has been a long-standing mistaken belief that it is possible to concentrate only on finding bad links and then disavowing them.

    Last year we posted an article on our blog, “to disavow or not to disavow, that is the question“. In that article we said the same thing: in all our experience, removing links is the only route to success.

    Recently Google have been sending more informative emails in response to unsuccessful reconsideration requests, and they are making it crystal clear that links need to be removed.

    Such as this example :

    Please correct or remove all artificial links, not just those provided in the examples above. You may need to contact the webmasters of the sites carrying these artificial links. If links to your website cannot be deleted, you can use the link disavow tool. Note that simply disavowing links is not sufficient for a reconsideration request to succeed; you will also need to demonstrate good faith and remove the artificial links wherever possible. Removing links takes time. Due to the high number of requests we receive, and to give your next reconsideration request the best chance of success, we will not process any request for this site over the coming weeks. We advise you to take the time necessary to remove artificial inbound links before sending any new request for reconsideration.

    Google are making it crystal clear that you need to remove all bad links. The disavow tool is ONLY FOR INSTANCES OF NEGATIVE SEO!

    Who’s vulnerable to being outed now – ‘Google Blowing’

    Google have created a new form of corporate threat! Companies are now vulnerable to being outed for GAMING GOOGLE. Gaming or manipulating Google’s search results can lead to a penalty from Google which will prevent the company in question from ranking in the top positions of the search results.

    Recently Expedia was outed for manipulating Google’s search results by a whistle blower [link] who was unheard of prior to his public outing of Expedia. He simply set up a blog and reported the tricks Expedia had been involved in to game Google. Whether he was an insider at the company is not known, but the story caused a scandal which affected the company’s share price and resulted in Google penalizing Expedia’s search rankings, causing a massive drop in search traffic to their website [reported by Searchmetrics].

    Clearly a harsh lesson for Expedia. But what about the many others who are vulnerable to the same type of whistle blowing (Google Blowing), where someone can examine publicly available data, find and compile evidence of a company manipulating GOOGLE, report it (even anonymously) on some random blog, and watch the firestorm wipe out the target’s organic Google traffic, resulting in massive losses for the company in question?


    This type of threat, which we are labeling ‘GOOGLE BLOWING’, is a new form of corporate sabotage that can be undertaken against any company that has previously been involved in ‘search engine optimization’ (SEO), the practice of trying to push up search rankings by placing artificial links to target sites.

    This ‘SEO’ has been typical practice for most online companies over the last few years, and more recently Google has been heavily penalizing companies caught out by its algorithms for having created an unnatural link profile.

    Clearly there is a high level of risk and paranoia, since the vast majority of online companies have engaged in this type of manipulation along the way, leaving them prone to being caught by Google’s algorithms, which are designed to automatically detect and penalize such behavior, or to having a third party with the knowledge and insight to investigate expose their dirty laundry to Google.

    In this environment, companies who understand these risks are going to great lengths to remove any harmful links that may have been engineered previously, before they get caught out by a dreaded Google penalty. So much so that specialists are being called in to detect the harmful links and request their removal before someone else finds them.

    It’s a huge race to protect one’s organic (free) search traffic, and those who are not proactive are wide open to this new form of corporate sabotage. There will no doubt be many more instances of Google Blowing over the next few years, and Expedia will not be the last Internet giant to succumb to such an attack.

    So which companies could be next to see their manipulation exposed by a Google Blower? It’s a relatively inexpensive process to algorithmically examine the links to a website (we charge £200 to locate the bad links), and there are many companies who have a vested interest in exposing wrongdoing by their competition, and who will be tempted to report it to Google and gain higher positions themselves.

    Negative SEO attack “LinkDetox” on

    The attack!

    On the 10th of January we discovered unnatural links pointing to us, mainly from Russian domains, all with the same keyword “link detox“. This looks like a deliberate attempt to get us a Google algorithmic penalty (Penguin) for over-optimization of the keyword ‘link detox’, so that we no longer rank for this search term (we are currently 3rd in the natural listings).

    The sites the negative SEO attack is coming from are typically on .bg, .su and .lv ccTLDs, but mainly .ru top level domains, and are extremely smelly forum comments on soon-to-be-banned domains.

    I wanted to make people aware that negative SEO is on the rise and that even our business is being targeted by some unsavory and very smelly characters. Normally negative SEO is carried out by one’s competitors, who try to damage your organic traffic by getting your site penalized for search terms you rank for, tripping Google’s algorithmic penalty (Penguin). It is unethical behavior bordering on illegality. In this case we can’t be sure it was carried out by our direct competition (Link Detox), but we have our suspicions! Obviously, with our tools and experience, we detected this negative SEO attack straight away and have elegantly and swiftly handled the situation, protecting ourselves from any Google penalty. Here’s what we found out during our investigations.

    Information about one of the IPs involved in creating the attack.

    How did we spot this negative attack so early?
    What are we doing about it?
    How to deal with the situation if you’re affected!

    Link showing the bad links in Google results:

    How we spotted this attack so early

    We have a tool that looks for negative SEO. It compiles a report every month warning us of new threats, and in our latest report we saw many new links that were clearly designed to cause penalties.

    Our negative SEO tool takes a complete snapshot of a domain’s backlinks. Every month the tool automatically takes another snapshot of the current backlinks; any links that have gone missing are reported, and any new links found are reported.
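The monthly comparison amounts to a set difference between snapshots. Here is a minimal Python sketch of the idea; the suspect ccTLD list echoes the attack described in this post and is an illustrative assumption, not our tool's actual logic:

```python
from urllib.parse import urlparse

# ccTLDs seen in the attack described in this post -- illustrative only
SUSPECT_TLDS = (".ru", ".su", ".bg", ".lv")

def diff_snapshots(previous, current):
    """Diff two monthly backlink snapshots (sets of linking URLs):
    report new links, missing links, and new links on suspicious
    ccTLDs that deserve a closer look."""
    new, missing = current - previous, previous - current
    flagged = {url for url in new
               if (urlparse(url).hostname or "").endswith(SUSPECT_TLDS)}
    return {"new": new, "missing": missing, "flagged": flagged}
```

Run monthly, the "new" set is what gets fed into further checks, and the "flagged" subset is the first place to look for an attack.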

    So what are we going to do about this attack?

    As soon as we spotted these foreign domains in our report, we ran some other tools, such as our Low Trust Tool, which confirmed these links were in fact from banned and low-trust sites. The tool then automatically uploaded those URLs to our link removal CMS, which crawls the sites to gather contact details for each domain.

    Then we ran our powerful link removal email tool, which automatically emailed stern warnings to these sites demanding immediate take-down of the content in question.

    Then we used our other email tool, which contacts the hosting companies, automatically raising abuse tickets with each host to complain about the behavior. At the same time we uploaded the disavow list to Google. Remember, this is what the disavow tool is actually designed for, negative SEO, not hand-picking your old links which are now working against you. Our Link Removal CMS generates a text file in the correct format to upload to Google’s disavow tool.
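For reference, the file Google's disavow tool accepts is plain text: one URL or `domain:example.com` entry per line, with `#` lines treated as comments. A minimal sketch of generating one (the helper name is ours, not part of any Google API):

```python
def build_disavow_file(bad_urls, bad_domains):
    """Build the plain-text body for Google's disavow tool: '#' comment
    lines, 'domain:' entries for whole domains, then individual URLs."""
    lines = ["# Links identified as part of a negative SEO attack"]
    lines += [f"domain:{d}" for d in sorted(bad_domains)]
    lines += sorted(bad_urls)
    return "\n".join(lines) + "\n"
```

A `domain:` entry disavows every link from that domain, so it is the right choice for throwaway attack domains, while individual URLs suit isolated bad pages on otherwise fine sites.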

    The list of the 323 URLs we have disavowed so far can be viewed HERE.

    What to do if you suspect a negative SEO attack!

    If you want to protect yourself from negative SEO by your competitors, it is imperative that any harmful links are found, removed and disavowed as quickly as possible (using our tools, if you want to manage the task elegantly).

    If you want access to OUR NEGATIVE SEO TOOL please get in touch for more details.

    The Link Auditors

    Social signals and SEO debunked finally

    For about four years we’ve been telling clients that social signals have no effect on the search results. In that time I’ve heard many so-called ‘SEO experts’ argue that they were having great success using social signals to improve their search results. Matt Cutts has finally spilled the beans and told everyone that social signals HAVE NO EFFECT ON SEARCH RESULTS.

    Predicting the beast


    Updated (see bottom): following a post by Matt Cutts.

    Well, a few weeks ago I put out an email forecasting that Google would be hitting guest blog posts and the SEO companies that had been using this technique. As everyone will know, one of the biggest proponents of this method has been Moz. Any time you visited the Moz blog in the last few years, you would have seen that every other article was about how you could use outreach and ‘content marketing’, otherwise known as ‘guest blogging’, to provide a white hat SEO solution for your clients. The fact is that Moz evangelized this so-called white hat method for years, until the other day, when the brains behind the scenes, Rand Fishkin (truly nice guy), exposed this white elephant in the room to all his followers:

    It’s ironic that after all these years promoting this method, he should only now come out and denounce the practice in this way (does he get my emails?). At the same time, he recently stepped down as CEO of Moz because he wasn’t well equipped. Moz have built a decent service in the SEO world, and their blog presence is one of their core marketing tools, but now they will have to come up with another ‘system’ to promote that will attract a new breed of subscribers!


    Google are the king, we know… they are our God, they are our rainmaker, and we know we have to abide by their rules and pay them money if they catch us cheating their system. It’s a great business model and racket they run, but what’s interesting is that there is a serious power law at the heart of their model, where 80% of their profits come from 20% of the SERPs. The point is there is a ton of money being made in a handful of sectors, and that’s making them enough money to pay for all the experiments and new business ideas that they explore:


    We’d all like to know where it’s all going, what is going to work as a business model in the next year, and how we can best make money by exploiting positions in the SERPs to bring in enough organic traffic to make a profit without losing it all to a penalty or update. But the bottom line is that while everyone has been looking for safe methods, and has hoped and prayed that the bandwagon they jumped on will continue unchecked, they are only now starting to realize that what they thought was safe ground is not.


    We accept the risks! We try to minimize them, but we accept them. The fact is, when you play against a Goliath you will take injuries and occasionally lose some blood; to put your head in the sand and pretend you are safe and able to avoid any hits in such a game is simply delusional. Most of the companies that have played this game have experienced some loss along the journey, but rather than adapting to that reality, it seems many prefer instead to try to please Google.

    You will never make everyone happy in business, and you will never make Google happy, no matter what you do. Google don’t make their money by trying to keep the companies they rely on happy; instead, Google actively go out of their way to inflict pain on companies close to them, then use any justification to demand payments. It’s an extortion racket, nothing more or less.


    There is something I see that many don’t: the bigger picture, a picture of who survives and prospers, and the strategies they adopt before their fortunes are made or lost in this business. I’ve always used my predictive powers to choose the path we take with regard to the tools I’ve developed and the services we’ve offered. Using that power I’ve also predicted many changes in this business well before they happened. One example was the density tool: launched years before the Penguin update, which focused on over-optimization, it highlighted with precision where Penguin would catch clients.

    Then of course there were all the other tools I launched over the years focused on other types of problem links, launched at a time when everyone still valued those types of links, which later turned sour (Panda).

    All the tools I developed were built at a time when they were considered novel but not essential; only later did they become seriously valuable, and the atmosphere changed so that they looked fortuitous…

    The same is true of the email tool I developed, which made contacting people to get toxic links removed a breeze and is still the best tool in this sector. It was developed at a time when everyone else was still evangelizing the disavow tool, which has turned out to be useless in most cases.

    Note that I always said Google would never let SEO companies hand-select the good links with the disavow tool! Contrary to some other ‘experts’, who insisted they were using it with success even before it was actually launched:


    Then there was one of the most interesting changes of last year, ‘Hummingbird’, which apparently no one even noticed.

    That said, two weeks before Hummingbird was launched I predicted it, with an eerily correct summary of what I thought was going on, and that email post, you might remember, was followed up with a “told you so” a day after Hummingbird was launched.


    My crystal ball is telling me that Google are going to make their most aggressive changes to the SERPs this year, even though they will claim only a small percentage of companies will be affected.

    See the companies that are currently getting away with manipulation using ‘content marketing’ and guest blog posts, including big players like who have been exposed recently:

    will start to see the effects and be caught out with serious surprises, as has happened every year since I’ve been involved in SEO. The thing is, afterwards everyone says they didn’t know, but the writing is always on the wall.

    In the past Google went after clearly spammy sites, and every year they moved on to slightly more legitimate targets, affecting the smaller, more decent companies that could afford a bit of SEO. Now they are creeping towards the bigger players, who have legitimate sites, have adopted these so-called ‘white hat’ techniques, and have run expensive SEO campaigns with so-called ‘top agencies’ who charged them fortunes for content marketing along with a bit of ‘social media marketing’.


    If I were a betting man I would take a position and stick to it even when the writing wasn’t clearly on the wall. Those who have been following my emails over the years will have seen the many emails where I mentioned the emergence of Bitcoin well before it was at 30 dollars, and will have seen that these predictions have often appeared fortuitous after the facts unraveled. In this world it is often said that someone is lucky, but true luck is often just the result of lots of hard work; it’s that which gives one the insights and inside knowledge that makes us all seem lucky.

    I have developed some of the best solutions in the SEO sector over the years, and over the last 4-5 months I have done more work than I would have liked building out an honestly authentic system of SEO. I believe it’s the most advanced solution of its kind, and over the next few months I will enjoy the fruits of that labour as we develop strategies for projects we see as worthy of investment, with partnerships I believe are worth developing.

    Got such a project? then get in touch…

    Kind Regards
    Steve Carroll


    I’ve just seen this: it’s Matt Cutts calling out guest blogs on the same day I sent my email predicting as much. Of course I hadn’t seen his post until after I wrote mine, but it’s eerie that it was being published at the same time as my own article was going out…

    Told you So

    “This article was first published on our private email list on the 16th of September 2013, 2 weeks before Hummingbird was officially launched, I am posting it now on our blog for posterity and so I can link out to it in future articles”.

    There is a new algorithm that has just been named, and it is claimed to be the biggest redesign of the algorithm since it was first launched in 2001. It’s called Hummingbird:

    “Why is it called Hummingbird?”

    “Google told us the name comes from being “precise and fast.”

    Now, if you look back to the email I put out on the 16th of September with the subject “another update and more confusion”, you will see that I predicted a NEW algo had been launched and talked about what I noticed, and many of the details I suggested were eerily 100% right.

    The following is reprinted from that email I sent on the 16th.

    Google Updates:

    Well, it’s clear now G are playing with the dials. On the 12th there was another mild update; what they are doing is somewhat of an enigma judging from the chatter out there, but this week we’ve seen some new light shed on the situation.

    Whatever they did on the 4th of September seemed odd and possibly wrong, i.e. not one of their best or clearest updates. From then on it looks like G rolled that update back on the 12th; we’ve certainly seen results that show that clearly (one site we monitor was at the top along with 4 others, all exactly the same in link profile), then this one site vanished for no real reason and has now been brought back to its former positions.

    It seems, if anything, that this one site was affected by some sensitivity to duplicate content issues, though that is a bit vague. The site did suffer a technical problem brought on by a silly web developer briefly uploading duplicate content a few months ago, until I noticed it.

    The other issue is twofold. First, there are all these “in depth articles taking up more real estate”, and clearly there needs to be a big push to understand what formula is being used to choose those selected. Secondly, there has certainly been another push in the middle ground between EMDs and BRANDS, with lower quality ‘brand’ domains seemingly being negatively affected, though it’s all link based, so companies that have worked hard on the brand citations in their link profile are doing better. I’ve long been saying the breakdown should be like this:

    33% BRAND links
    33% UNIQUE ANCHOR links
    33% white noise links (‘click here’, etc.)
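That 33/33/33 rule of thumb is easy to check against a real profile. Here is a minimal Python sketch; the bucket names, the white-noise phrase list and the matching logic are our illustrative assumptions, not a formula Google has published:

```python
from collections import Counter

# Phrases we treat as natural "white noise" anchors -- an assumption
WHITE_NOISE = {"click here", "here", "website", "read more", "www"}

def anchor_breakdown(anchors, brand_terms):
    """Classify anchor texts into brand / white noise / unique-keyword
    buckets and return each bucket's share of the profile as a percent."""
    buckets = Counter()
    for anchor in anchors:
        text = anchor.lower().strip()
        if any(term in text for term in brand_terms):
            buckets["brand"] += 1
        elif text in WHITE_NOISE:
            buckets["white_noise"] += 1
        else:
            buckets["keyword"] += 1
    total = sum(buckets.values())
    return {name: round(100 * n / total, 1) for name, n in buckets.items()}
```

A profile whose keyword bucket dwarfs the other two is exactly the over-optimized shape the breakdown above warns against.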

    The other thing is the speed of change from new links being found to actual changes in results.

    Maybe they have just done some deep crawls, but we are noticing two quicker events. One is recoveries from penalties: in years gone by they were slower, but now we sometimes see positive results within weeks of taking action (I’m not talking about reconsideration requests, though those are quicker too), I’m talking about removing toxic links. It seems that if you can find anything toxic immediately, you will not suffer any negative positions. The other is the speed at which they calculate Penguin (over-optimization of anchor texts); it seems this is becoming a constant flux rather than a specific update rolled out every so often.

    They are also being too quiet during this period. This is unlike them; normally updates are named a bit later and confirmed, but so far there is absolute silence from the beast. This indicates uncertainty about what they are playing with, not the typical updates we are used to. Something new and exotic, maybe, aimed at spammers but not quite perfected yet, so they are testing it live and will roll it out officially in a few more weeks. I suspect something like that is happening and we haven’t seen the last of this ‘thing’ yet.

    I suspect this will be made official in a few weeks, and they will claim something like 5% of the SERPs will be affected, when in reality it’s more like 40% of the commercial terms being fought over that are affected.

    Expedia Outed Google Goes Quiet, Somethings brewing!

    A pretty interesting post made the first page of Hacker News the other day regarding cheating by using ‘guest blog posts’ to game Google. The guy who outed them is a disgruntled SEO who’s got a bee in his bonnet over penalties he received and is pissed that Google lets big companies use the dangerous ‘white hat’ technique of “guest blogging”.

    It’s never nice to see people outed for using SEO, but these are emotional events, and at least he’s taken a shot at the big boys, who are fair game (it’s slippery at the top)…

    Well, well. Normally when these events occur there is an immediate response from Google by way of Matt Cutts, who hangs out on HN; when Rap Genius were outed a few days ago on HN, Cutts was all over the story, and shortly afterwards Rap Genius was penalized.

    This post on HN got nearly 500 votes, which is huge, and it has clearly caught everyone’s attention at Google. So what’s up? Why the silence on Expedia, who are still ranking top for the terms they were outed on?

    Is the reason for Google’s silence something to do with Expedia specifically, or is it something to do with the practice of ‘guest blogging’ in general? When Google go quiet they are normally brewing up a storm. They don’t like to be embarrassed, and they work hard to protect against the perception that manipulating their SERPs is easy, possible or a good idea. Everyone usually gets whacked when they are outed.

    In this case I’d suspect Google are taking a better look at the problem. See, they have now been challenged to a duel, and we know they have been watching the gravity of SEO shifting towards guest blogging over the last few years. They have warned that it will be bad when it gets whacked.

    Maybe Google are having trouble working out algorithmically what a guest post is (in which case they might do it manually). Maybe there is a fine line they can’t detect in many cases, or maybe they want the net to be totally full before they pull it in. This is my preferred answer. See, Google love penalties; this makes their psychological war impact the largest number of offenders, which also has the spin-off effect of pushing the largest amount of fresh blood into AdWords.

    If they had pulled this net in at the beginning, a few years ago, the payoff would have been much less; they generally wait until they have the biggest possible payoff. Every year they need something new to boost profits and keep the so-called war on spam in a constant state of flux, ‘dampening the spirits of the remaining SEO offenders’, and therefore they need these loopholes to stay open long enough to get as many people addicted to their form of crack as possible before Matt ‘cutts’ off their supply! Anyone still addicted with any amount of cash will then spend their profits with AdWords, and Google’s profits go even higher. Rinse and repeat!

    So what’s going to happen next? If Google penalize: 1) everyone will be outing others doing the same thing, and that’s an avalanche waiting to happen; 2) are Google ready for this? 3) more to the point, they will have to clarify the status of guest blog posts and have a penalty system in place to dish out to offenders.

    So what’s the ‘guest blog post’ update/penalty going to be called? Chameleon? And when is Chameleon going to be launched?

    February would be my guess… remember where you heard it first!

    Did Cringely Call Out Google as Robbers Unfairly?

    An interesting post and rant by Cringely, who has been posting on his blog for many years, has caused quite some debate. It’s about his sister’s site ( ), which sells ‘pillow quilts’ and has recently been banned from Google.

    Interestingly, the post made fairly big news and there’s been lots of debate, with all the other search engines coming out to dig into Google too: Blekko’s boss chimed in, as did someone from Bing, both calling out the differences in their search results and customer relations. That said, unusually, Matt Cutts didn’t wade into this debate to protect Google’s reputation and has so far refrained on this occasion (at least on Hacker News, where he hangs out a lot these days).

    Now it’s surely a hot subject, but did Google call it correctly? Was there manipulation behind this site’s ranks, and if so, is there anything we can learn from looking at its backlink audit?

    Taking a quick look, it seems like a standard case we often see with such actions. The site has a lot of links from poor quality directories and also suffers from over-optimization of anchor texts, as well as direct attempts to manipulate its search positions with such links.

    Now, Cringely thought he could help his sister by adding a few new links from his authority blog to her crippled site, using his basic SEO knowledge and anchor text tricks. He stuck a link at the top of his article with the main money-term anchor text “Portrait Quilts”, and there was no reason for that link; clearly he thought he might be able to give her a good link and help her recover her Google positions…

    Let’s be CRYSTAL clear! You can’t get out of a Google penalty by adding more links, and in this case he made even more trouble for her by adding yet another link with the already over-optimized anchor text ‘Portrait Quilts’. Bad idea, Cringely: she already has 10% of her anchor text distribution on that term, and it should not be higher than about 3-4% for any particular term. That’s unnatural, and Google can see the manipulation, as we all can.

    And you know what else Google can see? The term “photo pillows” accounts for over 23% of all the anchors the site has. That’s over-optimization on a grand scale!
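These percentages are simple to compute yourself. A minimal Python sketch that flags any anchor text exceeding the ~3-4% ceiling discussed above (the default threshold is our reading of that figure, not an official Google number):

```python
from collections import Counter

def over_optimized_anchors(anchors, threshold_pct=4.0):
    """Return {anchor: percentage} for anchor texts whose share of the
    backlink profile exceeds the threshold -- Penguin bait, per the post."""
    counts = Counter(anchor.lower().strip() for anchor in anchors)
    total = sum(counts.values())
    return {anchor: round(100 * n / total, 1)
            for anchor, n in counts.items()
            if 100 * n / total > threshold_pct}
```

Feed it the full anchor list from a backlink export and any money term it returns is a candidate for dilution or removal.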

    If you look at the links, you will see some of the worst types of paid links, with blatant manipulation from mostly paid link directories. For example:

    In fact, with so much manipulation going on, I wouldn't have called attention to this site at all. It will be a hard crawl back into Google's good books, and the first thing they will need to do is remove about 50% of the links, as they look like blatant spam, which our tools call out.

    So what does our Index Trust Tool think of the links?

    WOW, 78% of this domain's backlinks are in a bad neighborhood, THAT'S WHAT! Not very clean, huh? NO. Normal sites with no SEO manipulation tend to have less than about 20% of their links from a bad neighborhood!

    The interesting thing is the site was using many links from the same IP, which could have been easily detected, but the Index Trust Tool found another 17 links to add to the report that had not already been flagged for over-optimization or found by the FREE duplicate IP tool!

    OK, a few more tools left to run: our banned link tool (which found another 8 links) and the sitewide tool, and we have a complete list of bad links that should be removed. The next thing Cringely's sister will have to do is get these bad links removed. We've run all the tools free so far, Cringely, so you have all the data on these bad links now listed in our FREE LINK REMOVAL CMS here (this is the part that helps you manage link removal projects):

    If you want to use our nifty tools to get the links removed, all you have to do now is pay the £49 fee, sit back, and let our automatic contact tools do all the work for you. They post bespoke messages to the contact forms of these sites, cracking any captcha challenges they encounter, and this is the best way to get links removed elegantly and swiftly. They normally get more than 50% of the bad links removed and are a great way to manage such problems.

    If you're suffering from any penalties, or fear you are close to one after some questionable link practices in the past, then you should start your own project and see what we find wrong with your site!

    Simple Tips on Google Algorithms: How to Avoid Penalties

    Audit Your Links

    Google now endeavor to check a website's backlinks for black-hat techniques, one of which is webmasters acquiring poor-quality backlinks to their domain. To achieve this, Google have developed several algorithms, and they apply penalties to websites they feel have violated their guidelines.

    What Can Google Algorithms Find?

    – Websites having low quality content
    – Copied content from other sources
    – Manipulative links
    – Paid links
    – Keyword stuffing and over optimization of anchor texts

    Similar Anchor Text in Backlinks

    It is essential you use anchor texts sensibly; you should not over-optimize your backlinks any more than you would optimize your on-page keywords. Anything over 3-4% could be considered over-optimized. You must not add keywords that are irrelevant to the content of the website, page, blog, etc.

    Keyword Stuffing

    Keyword stuffing may be something you have not done purposefully, but your site could still be affected by Google's Penguin filter. The on-page keyword density should be below 3-4% for any search term.
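    As a rough illustration of what a density check looks like, here is a minimal sketch. It assumes the simple "phrase occurrences over total words" formula; the exact calculation Google uses, if any, is unknown:

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words taken up by the given phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = phrase.lower().split()
    if not words or not kw:
        return 0.0
    # Count non-overlapping-agnostic phrase matches over the word list.
    hits = sum(1 for i in range(len(words) - len(kw) + 1)
               if words[i:i + len(kw)] == kw)
    return 100.0 * hits * len(kw) / len(words)
```

    Anything this returns above the 3-4% range for a money term would be worth rewording, by the logic above.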

    Original Content

    Google seeks to ensure the content of a website or blog is original; it checks for spun content, poor-quality content, etc. If yours is not original, the chances are you will be impacted by Google's Panda algorithm.

    Simple tips to ensure you don’t become a Google Panda or Penguin victim:

    – Blog posts of over about 500 words may help improve quality
    – Make sure you link out naturally to other sources. These should not be the most common websites on the Internet (i.e. Wikipedia and Facebook); when linking out, make sure the pages you link to are relevant and not commonly linked to. That way you will be adding value to the debate!
    – Keep the theme of the content throughout your site relevant to one topic
    – Don’t use SEO and avoid the top positions in Google’s natural search results *
    – Pay Google (using Adwords) for the privilege of all your traffic *
    – Avoid ‘guest blog posts’ and anyone who offers them to you.
    – Avoid ‘Info Graphics’ and everyone who offers them to you.
    – Avoid ‘Inbound Marketeers’, just as much as you would ‘SEO GURUS’
    – If you are selling products, advertise in Google shopping pages (small fees may apply).
    – Avoid web developers who don’t understand 301 redirects and 410 header codes.
    – If you have any SEO problems Tweet Matt Cutts or John Mu so they may ignore you.
    – If you get a warning in Webmaster Tools, panic.
    – If you get a loss of traffic don’t ask your SEO expert why, just fire them.
    – If your SEO Guru tells you they don’t know why you’ve lost your traffic, it’s probably because of them.
    – There may be value in these tips, as well as some sarcasm too.
    – If someone else promises they can fix your Google Penalty, ask them how many false positives does it take to make a link audit. *
    – If you read on an authority SEO blog that you can fix your penalty by filtering low page rank links and disavowing them, make sure you Google plus one them.
    – If you read that you can simply disavow your links and your penalties will vanish, remember another option is to pray to Google too.

    All advice is the Link Auditors' own. If you are concerned about Google penalties or require assistance, please contact us for help today.


    Since the launch of Penguin over the last year, it has become fashionable to conduct link audits on a domain with the goal of identifying and removing toxic, harmful links. Until Penguin, the world at large was blissfully unaware that bad links could hurt ranks. That worldview was largely supported by Google, who espoused the general propaganda that they simply ignored bad links, which nicely fitted the worldview of some rather gung-ho SEO ‘experts’ who couldn't care less about the scary mess they created in their wake. The theory went like this: if bad links were harmful, anyone could harm your ranks with negative SEO, so it can't be true!

    Alas, when Google broke the spell they had cast upon the SEO world there was uproar, for company bosses started getting notifications telling them their SEO campaign manager was directly responsible for their domains being removed from the search results and heads started to roll. Then the whole negative SEO thing raised its head again and Google eventually had to address the issue down the line with their launch of the Disavow tool.

    This put the whole link audit concept at the forefront of the search industry, and conducting link audits went from being largely considered a waste of time to a valuable ‘new’ skill which commanded specialist knowledge and insight. However, there were still a few hurdles to overcome, as then came a reluctant little group of ‘wedontlikechange’ vocalists who spent their time bemoaning the whole link audit concept as an out-of-hand scam promoted by a group of snake oil stalkers preying upon others' grief (in which there was some truth).

    This is because there was indeed a bunch who were once largely responsible for acquiring and placing such links, and they knew exactly where some of the problems lay because they had actually created them. They then jumped on this bandwagon, purported to offer link audit services, and claimed to have the skills to analyze backlink profiles and carefully find those toxic links. These thrill seekers worked in tandem with other daredevils selling the whole ‘we can get your links removed for 5 pounds a month’ service.

    After some further ado it became clear that this was a bunch of entrepreneurial directory owners who, having recently been banned from Google, were no longer able to sell links, but realized there was a quick buck to be had by charging those who were negatively affected to remove each link they had once been paid to put up. This lot had now banded together to ‘help’ companies with the problem!

    The scariest of all these changes, though, has been the articles on the respectable blogs advising you on ‘HOW YOU CAN CARRY OUT A FULL BACKLINK AUDIT!’. There have been a ton of these over the last year, and they seem to be copying and pasting from the first diatribe ever posted on the subject, which royally painted all the wrong colors in the numbered boxes. If it was a straw house you were looking to build, these surely are the architects for the job.

    Now I've got to say this: there are some who are qualified to discuss this subject with more weight than others, not least myself. Why? Because I've been building link auditing tools for over 5 years. These tools had their origins in penalties I was seeing in the field, where companies would fall out of the SERPs ‘for no apparent reason’ and where the bosses were fed the impression that some wild update at Google was responsible. This was when the whole black art of SEO was often kept hidden from the business owner's view, and SEO companies worked in a cloak-and-dagger environment, often not revealing their wonderful work.

    When things went wrong in those days it was Google which got bad press, the losers would all hit the forums in droves and proclaim that Google’s days were numbered, that the search results were now so bad (with their site missing) that no one would use the super mega scary monopoly anymore. However, since that time the world has continued to flock to Google and the vast majority of SEO firms have seen their victims abandon them with as much flux as was seen in the search results by their former clients.

    Most former SEO companies have even changed industries to flee the sinking ships in their wake. One now likes to refer to themselves as InBOUND Marketers, such is the shame associated with their former activities. With the name change there has been a game change: the WHITE HATs have been dug out of the closets (again), and there has been some leapfrogging onto the now well-known phrase GUEST BLOG POST and the infamous INFO GRAPHICs, while everyone catches their breath for the next few months until Google plays catch-up once again.

    Now, no one likes to hit a man when he's down, but where have all the experts gone? We might get the odd cheeky spammer putting the finger up to Cutts, and you might get the odd insightful whisper from Google, but for the most part this is an industry filled with snake oil sellers who have little understanding of the Google algorithm, nor of the technical challenges Google are grappling with, and as a result YOU are being spoon-fed yet more bolognese, only this time it's out of a different packet.

    There are great chasms between the reality on the ground and the belief systems within this industry. Many of these beliefs are reinforced by Google, who have for the most part managed to wage an extraordinary propaganda coup, pushing most of the self-proclaimed ‘SEO experts’ in this field into some form of embarrassing submission.

    The essence of the story so far is this: SEO companies have been feeding Google with bolognese; Google swallowed it until they puked, and when they puked all the chiefs were fired. Then, in order to make sense of all the puke, Google told everyone in the puke to clean up the mess for themselves, which may or may not remove them from the puke. Many are reluctant to touch it, and most don't know the difference between puke and bolognese. That's where we are right now, with some new ‘false prophets’ coming along to tell you what is puke and what is not. But unfortunately this has to be said: it's mostly the blind leading the blind out there.

    Take the latest article on one of the authority blogs ‘Search Engine Journal’ posted (June 21, 2013):

    This just annoyed me so much that I was compelled to write this article in response and clarify the nonsense which is commonly being spat out and accepted by these so called authority blogs:

    Now, while I applaud the fact that general awareness of this subject is growing, I have to call into question much of what is being suggested as best practice in that article, and others like it, when looking for bad or toxic links.

    Take the methods suggested in that article for identifying the bad links, they suggest removing any links which:

    1. Come from PR-n/a or PR0 sites
    2. Are site wide
    3. Come from very new domains
    4. Come from domains with little traffic
    5. Come from sites with identical C class
    6. Are from pages with a big number of external links

    Now let's break this down a bit and cast a skeptical eye over some of the bolognese being written here!

    • Links that Come from PR-n/a or PR0 sites. REALLY – ARE YOU SURE?

    First off the bat, if you discount all links on PR N/A or PR0 pages, you will be discriminating against around 90% of the pages on the Internet! I'm sorry, but just because a page is currently PR0 or PR N/A doesn't mean it's toxic or bad IN ANY WAY, and when has anyone from Google ever said it was? In fact, this is not the first time a so-called authority has suggested such links are bad; it is a common misconception which I've seen espoused in pretty much every major article on this subject in the last year.

    Moreover, this is probably the quickest way to build a list of FALSE POSITIVES, which will have negative value and end up causing even more anguish, confusion, and harm to a domain's ranks if you use it as a signal in your LINK AUDIT and subsequent link removal campaign.

    AND THE REASON? It could simply be a newly found page (within the last 4 months) that has not yet been updated as part of Google's PageRank updates (which take place roughly 4 times per year), or it may just be a page on a large site which doesn't have enough PageRank (juice) to pass around all of its wonderful pages and thus is unable to push those pages into the so-called SAFE ZONE of PR1 or above…

    This is one of those myths that seriously needs to be put to bed, and I welcome the day when someone asks one of the vocalists at Google for clarification or advice on this much-lauded white elephant in the room. If you've been sold on this as a method to locate your bad links, you will have wasted all subsequent efforts in getting those links removed, and no doubt done yourself a disservice in the process.

    And what about all the other RED flags raised by the so-called experts?

    • Links that are site wide

    This is undoubtedly true, and the only question left is how to most effectively find such links in the data you have. These links are normally located in the sidebar (blogroll) or footer of shitty domains, often alongside a bunch of other similarly smelly links, all with clear money-term anchor texts.

    • Links which come from very new domains. REALLY – HERE WE GO AGAIN – MORE BOLOGNESE!

    New domains typically have no PageRank or juice value, and since new domains are sandboxed (Google's penalty on new domains which have not yet earned trust), Google sandbox the outbound links on new domains in the same manner; hence, if anyone tries to manipulate the SERPs with such techniques, they fail. This is not a widespread problem that Google are still grappling with, and so singling out the new domains where one has a link is just going to add more white noise to any data you end up with.

    • Links which come from domains with little traffic. OMG WHO COMES UP WITH THIS RUBBISH?

    This just shows a complete misunderstanding of, firstly, the nature of the Internet and, secondly, the problems Google are grappling with. If Google were to use such a metric or signal for improving the SERPs, they would end up with so much false data that they would be totally lost and have no idea what was an authority or why. Most articles have a spike of traffic when first published, then that traffic dwindles as the article is buried away. When you build a module into an algorithm, you want data which means something significant, not something that will result in pure white noise. There has NEVER been anything said or even suggested by Google that they use such a signal to lower ranks, and in all my years in SEARCH I've never seen any evidence that such a concept is being used to qualify the value of links, despite beliefs to the contrary.

    You can put this one into the quack box along with another similar signal I’ve seen used to mark links as suspicious, one called Low Link Velocity, which is apparently when a domain or page is no longer attracting backlinks as quickly as it once did and hence is therefore likely to be a domain which has been sold to a link spammer and is now being used as a link farm. This again could never be used as a signal by Google to qualify links as many domains and pages attract a natural spike of back links when they are first published and interest generally diminishes as time goes by. That’s pretty normal and certainly not a signal which could be relied upon to qualify a link!

    • Links which come from sites with identical C class. FACT OR FICTION?

    In general there is some truth to this one; however, it's not a simple case of black and white. It's certainly a useful signal for Google to see what's connected during a manual inspection, certainly once they have smelt a rat, but it is most likely treated as a SOFT signal in the SERPs. They would most likely limit the juice being passed from one IP to another, and these links should most certainly be avoided, as they would help Google find networks. That said, there's another problem I've seen here: I have seen historical data being used in some link audit reports whereby the hosting was changed years ago, so the IP data being supplied is historic and hence misleading. It's a good signal ONLY if the data is reliable and fresh!

    • Links from pages with a big number of external links. ANOTHER FALSE ASSUMPTION?

    I'd tend to agree that a page with lots of outbound links is most likely of low value, though in some instances pages with lots of outbound links do contain value. Typically, where all the links are natural and relevant to each other, and where the anchors are not money terms, you could safely assume the page is worthy. So crudely treating all pages as bad based on the number of outbound links alone would undoubtedly raise false positives, and there is no evidence to suggest Google is that crude.

    So where does that leave us? Pretty much in a mess, based on some of these very unsound signals. Raising flags which are unsafe results in huge amounts of ‘possibly’ suspicious links which then have to be reviewed again manually, which totally defeats the whole purpose of having a link audit in the first place!


    Firstly and most importantly you need confidence that any links which have been flagged to be removed have been flagged for valid reasons. Ironically Google is our friend in this task! We look for actual evidence and signals directly from Google which have their origins in past penalties I have analyzed and learned from. Specifically we check for:


    Prior to Google launching Penguin, I used to refer to this penalty as the BackLink Over Optimization Penalty, or BLOOP, following a post on Oct 13, 2011 on SERoundTable which named it; before that I just called it the Over Optimization penalty. I first saw such a penalty in late 2009, targeted at specific anchor texts that were over-optimized, and following an investigation and analysis of someone who was penalized, I built our infamous DENSITY TOOL, which I launched in early 2010. Back then we considered any anchor text with a density over 20% to be over-optimized. Since that time I have been steadily decreasing the tolerance threshold and now consider any anchor text density over 5% to be over-optimized. This is pretty much in accordance with on-page optimization (best to keep that below 3%), where Google dealt a blow to sites who abused it as far back as 2005!


    These are your bad neighborhood links. To locate these we have to directly scrape Google and check each and every page where you have a link to see if it is indexed or not. Google are masters at indexing; normally they are greedy and will index every page they can find, even if it's poor quality, so when they decide not to index a page it is a clear statement and signal telling us that page was de-indexed for bad behavior. These are ALWAYS pages with suspicious outbound links, and in this case the links point directly to your target domain, which they have discovered and don't like! This tool has a long history: following Google's many past wars aimed at removing manipulative links, I first launched it in 2009, around the time these attacks by Google started to become evident. It has been a long-standing signal, and we have seen bad neighborhood links have a negative effect on ranks going back as far as 2008, and possibly before that too.


    These are also bad neighborhood links, but different from the above. I am looking here specifically at the TRUST GOOGLE have placed in a domain. We can see that by looking at how well they have indexed the site. Typically Google will index every page they find, but when they instead decide that most of a domain's pages are not worthy of indexing, we can see they have lost trust in that domain; we typically flag domains when more than 90% of their pages are not being indexed by Google. This UNIQUE TOOL (no one else has this) remains one of our most powerful tools and is fundamental to any link audit. It also digs out domains which have been totally banned. I first launched it in 2009 following a penalty I analyzed where most of the domain's links were from low-trust sites, and this tool became one of our most important and trusted tools very early on.
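    The flagging rule itself is simple once the per-domain page counts have been gathered (the gathering is the hard, scraping-heavy part). A hypothetical sketch, with the counts passed in ready-made and the 90% cutoff taken from the description above:

```python
def low_trust_domains(index_stats, cutoff=0.90):
    """index_stats: {domain: (pages_found, pages_indexed)}.
    Flags domains where the unindexed share exceeds the cutoff."""
    flagged = []
    for domain, (found, indexed) in index_stats.items():
        if found > 0 and (found - indexed) / found > cutoff:
            flagged.append(domain)
    return sorted(flagged)
```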


    Following an analysis of a domain which had a penalty back in June 2010, I noticed that sitewide links were no longer having a positive effect; until that time these links had been much coveted by SEO people because Google often gave sites a real boost with them. It was always a dangerous strategy, though, as you could end up with hundreds of thousands of links with exact-match anchor texts. That said, it was common for web design companies to have these types of links, as they often stuck a sneaky link at the bottom of every page on sites they made for clients. Those web design domains often still get away with these links if they're for the brand only, but everyone else should avoid sitewide links like the plague. One of the early problems back then was working out how to find such links, as there were no easy data sources, and hence I built our Sitewide tool. This tool has been through many revisions over the years as certain techniques stopped working and data centers closed.
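    The core idea of sitewide detection is just counting how many distinct pages on one linking domain carry a link to you; template-level footer or blogroll links show up as one domain with hundreds of linking pages. A minimal sketch (the 50-page threshold is an arbitrary assumption for illustration):

```python
from collections import defaultdict
from urllib.parse import urlparse

def sitewide_links(backlink_urls, min_pages=50):
    """Group backlink source URLs by host; return hosts linking
    from at least min_pages distinct pages (likely template links)."""
    pages = defaultdict(set)
    for url in backlink_urls:
        pages[urlparse(url).netloc].add(url)
    return {host: len(urls) for host, urls in pages.items()
            if len(urls) >= min_pages}
```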


    Rather than classify everything on the same C class as toxic, I specifically filter out links from the same IP only. I am looking for clear signals which we know Google use and I want to avoid white noise which is speculative.
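    Once each linking domain has been resolved to an IP (in practice via something like `socket.gethostbyname`; here a ready-made `{domain: ip}` map is assumed so the sketch stays offline), the filter is a straightforward grouping:

```python
from collections import defaultdict

def duplicate_ip_groups(domain_ips):
    """domain_ips: {linking_domain: ip}. Returns only the IPs that
    host two or more of the linking domains -- the clear same-IP signal."""
    groups = defaultdict(list)
    for domain, ip in sorted(domain_ips.items()):
        groups[ip].append(domain)
    return {ip: domains for ip, domains in groups.items()
            if len(domains) > 1}
```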


    When we carry out backlink audits there are two on-site checks we also run. One looks at internal URLs on a target site to make sure there are no issues with the wrong types of redirects being used, which often results in no juice being passed around the site; this is called our Unclaimed Juice Finder. The other check makes sure all the URLs on a target site are being indexed correctly; occasionally this flags up weird issues where Google have decided to remove pages they consider were created only to capture traffic.
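    The redirect part of such an on-site check boils down to interpreting the status code each internal URL returns. This is a hypothetical classifier (the labels and the interpretations in the comments are this sketch's reading of common SEO practice, not Google's terminology):

```python
def audit_status(code):
    """Rough triage of an HTTP status code seen on an internal URL."""
    if code == 301:
        return "ok"      # permanent redirect: link equity consolidated
    if code in (302, 303, 307):
        return "fix"     # temporary redirect: may pass no juice at all
    if code == 410:
        return "gone"    # deliberate, permanent removal signal
    if code == 404:
        return "check"   # missing page; consider a 301 or 410 instead
    return "ok" if 200 <= code < 300 else "check"
```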


    When we carry out link audits, the last thing we want to do is pass over data to our clients which they cannot easily act upon, and simply disavowing your bad links is NOT ENOUGH to get you out of a penalty, despite what some people would like to believe! Google have continuously stated they want you to make every effort to remove any manipulative links before you use the Disavow Tool. We have also found that removing harmful links results in better success when dealing with both manual reconsideration requests and algorithmic penalties. But getting links removed is a hard and arduous challenge, requiring you to contact everyone who links to you from these bad resources. This demands that you find their contact information, and sometimes the only form of communication is through an online contact form, making the process extremely painful if it all has to be done manually.


    We have a LINK REMOVAL CMS which is the perfect place to manage your link removal projects. This tool automatically hunts out all email addresses associated with a domain, whether an address is hidden away somewhere on the site or tucked away in the whois information. Our LINK REMOVAL CMS also detects all the contact form URLs associated with these domains.


    There are some other services out there which claim they can email people automatically on your behalf requesting the links be removed; however, they ONLY look at the whois information for email addresses and miss all the email addresses actually on the sites. Obviously this is suboptimal and guaranteed to leave most of the sites uncontacted. To do this difficult task properly you need to contact EVERY email address associated with a domain, but MOST IMPORTANTLY it is imperative that the contact forms are posted to as well.
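    The "hunt every address" step is, at its simplest, a regex pass over both the page HTML and the whois text. This naive sketch ignores real-world obfuscation such as "name [at] domain", which a production harvester would also need to handle:

```python
import re

# Deliberately simple pattern; addresses obfuscated in the page
# text will slip through and need separate handling.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+(?:\.[\w-]+)+")

def find_emails(*texts):
    """Collect unique, lowercased email addresses from any number of
    text sources (page HTML, whois output, etc.)."""
    found = set()
    for text in texts:
        found.update(match.lower() for match in EMAIL_RE.findall(text))
    return sorted(found)
```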

    Automatically posting messages through online contact forms is one of the toughest technical challenges I have ever undertaken to solve. Building a tool which can not only automatically email these domains requesting the links be removed, but also POST to contact forms, where you are often faced with captcha challenges to solve and have to make sense of arbitrary coding conventions, odd mistakes, and idiosyncratic forms with all manner of weird quirks, was a HUGE challenge. It took me almost 3 months of uphill battle to build this tool, and it is totally unique in this space. It works amazingly well, coupled with a very powerfully written email which has an extremely good success rate at getting links removed. The best part of it all being automated now is that we can send 3-4 follow-up messages with no hassle at all.

    I also have another powerful tool which is generally used after a few weeks of trying to contact the domain holders themselves; this tool emails the hosting companies demanding DMCA takedowns of content hosted by them. This also provides an effective and immediate response and, together with all the other techniques listed above, helps ensure EVERY effort has been made in locating and removing toxic links.

    Any links which are still not removed after a few weeks of trying are then disavowed. In our link removal CMS there is a button which will allow you to download a complete list of links that need to be disavowed, including comments explaining the number of times you have contacted the companies or noting where no contact information was found.
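    The export step described above is easy to sketch: disavow files accept `#` comment lines and `domain:` entries, so each unremoved source gets a comment recording its contact history followed by its domain line. The `(domain, times_contacted)` input shape is this sketch's assumption:

```python
def build_disavow(entries):
    """entries: list of (domain, times_contacted) tuples.
    Returns the text of a disavow file with one commented entry each."""
    lines = []
    for domain, attempts in entries:
        if attempts > 0:
            lines.append(f"# contacted {attempts} time(s); link not removed")
        else:
            lines.append("# no contact information found")
        lines.append(f"domain:{domain}")
    return "\n".join(lines) + "\n"
```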


    On one hand, we have come a long way in the link audit industry, which has developed gradually over the years, but we have to be mindful and avoid the mistakes of the past, where ignorance and mistaken belief systems resulted in a poor service and industry apathy. These were the same ailments which backfired on the SEO industry and forced those companies to sidestep their way into a reincarnation after the legacy they left behind imploded.

    We are living in a PENALTY-STRICKEN world, and companies are in the midst of a minefield where the wrong moves have resulted in many losing their businesses and livelihoods. There are NOW clear messages being sent from GOOGLE, and the message is loud and clear: you will suffer great loss if you mismanage your backlinks. You need to constantly monitor your backlink profile and have an expert understanding of what is out there linking back to you. You cannot allow SEO companies to work clandestinely on your behalf; transparency and risk awareness are the most important qualities you should look for in your advisors.

    With regard to link audits, you need to obtain clear, actionable data and have access to the best tools out there for the job. These tools are bespoke and have their origins in historical penalties which were analyzed and which inspired the tools I later developed. Don't be misled by false prophets who have jumped on this bandwagon but who lack the knowledge and clarity that is fundamentally required when carrying out such an important and technically challenging task!


    As I was writing this article Julie Joyce from Search Engine Land was writing an article (published 25 June 2013) about the poor quality of links flagged when using other link auditing tools from two companies in this space, Link Detox and Link Risk:

    I personally did not want to call out any other competitors in this field and point out their results were erroneous as obviously that would be personal and biased. Instead I reacted to their published philosophy of detecting poor links (as the SEJournal article was actually written by one of the Link Detox crew).

    However, now that the cat is out of the bag and someone has published a detrimental article about their LINK AUDITING QUALITY, this has cast a grey cloud over the industry in general and I have an obligation to protect the reputation of the sector I have been developing in for the last five years.

    I have seen this happen in many industries I have been involved in: groundbreaking concepts are developed by one company, and others quickly jump on the bandwagon and copy or try to emulate the features of the industry leader, but in doing so produce totally inferior products and services which are ultimately cheaper and better marketed. Then, when the general populace try these inferior products or services, often for the first time, they are disheartened or disillusioned about the whole industry and revolted by the entire concept or business model.

    This is precisely what has happened in the SEO industry where now many companies are revolted by the puke their former SEO companies created for them and are stuck in the midst of these nightmare scenarios whereby they are being forced to clean up the mess these cowboys created.

    The whole SEO industry has pretty much imploded as a result of bad logic, gung-ho high-risk behavior, false assumptions, poor companies, and a general apathy towards providing quality services in this sector. These are the same issues we are now facing in the link auditing industry, which are being exposed, and it needs to be made clear that NOT ALL COMPANIES ARE THE SAME! I may not have been as vocal as our counterparts in the past, but that is because I have been too busy developing groundbreaking technology in this space, not happily selling services that were knowingly based on fuzzy logic, fraught with misinformation, and highly risky to the health of those using them.

    This article was written by Steve Carol, the lead developer at The Link Auditors. He has over 10 years' experience coding and has been developing solutions in the SEARCH space since 2008.

    To Disavow Or Not To Disavow, That is the question!

    There has been some debate over whether you should simply disavow bad links or instead go to extra lengths to ACTUALLY remove toxic links?

    Google's gunslingers Matt Cutts and John Mu have been pretty clear over the last year, suggesting that you should make every effort to actually remove bad links and not just rely on the disavow tool!

    The problem is they haven’t been crystal clear; in fact they get really ambiguous about how the disavow tool works: does it ALWAYS work in the same way as ACTUALLY removing bad links, or are there cases where simply using the disavow tool might not work and you’re better off actually removing those toxic links?

    Well, thankfully John Mu has recently made two videos which talk about exactly these issues and throw some light on the specific reasons why there’s ambiguity around this subject:

    Let’s take a look at this video where John Mu goes into some more detail:


    Q) What is a safe timetable for Google Webmaster Tools to get updated? For example, when a massive cleanup has been performed, how long should we wait for the link data to be updated?

    John Mu: “Erm, that’s hard to say, especially with regards to link data. We actually have to crawl both sides of the link before we can include that in Webmaster Tools, so if that’s a link on a page that’s really rarely crawled, then it might be several months before we actually crawl that link and discover that it’s removed or changed and reflect that in our algorithms. So that’s something where you might see delays from a couple of weeks all the way up to half a year or longer in some cases. It’s hard to say there’s a specific time frame where you can actually see these changes, because especially when you’re removing a lot of links from a lot of pages, that might include some links from home pages that we might crawl every couple of days, and it might include links from other pages on the same website that we might crawl maybe once every 4-5 months. So it’s not really the case that you’ll see one specific time frame where suddenly all the links are shifted to the new way that the web is; it’s something that’s more gradual and takes place over time.”

    Q) If the links are really spam, on forum pages from 2008 where nothing has changed on the page, is it safe to assume we should wait 6 months for those links to show as removed on your side?

    John Mu: “I can definitely see cases where it can take 6 months or longer for that to be reflected in Webmaster Tools. But especially when it comes to problematic links that you’re cleaning up, what you can also do is submit a disavow file for the moment, so that when someone manually checks, they see that these links are gone, from the pages themselves and via the disavow file, and then we can take those out of the equation for any manual changes.”

    Q) That means it’s safe, if we remove 1k links, to also include them in the disavow tool?

    John Mu: “Erm, it’s always safe to do that, but, er, let me just roll back a bit. Generally what happens with the disavow tool is that when we crawl that page again and we find that the link is also in the disavow file, we’ll drop that link to your site at the time that we crawl it. So from the algorithmic point of view it doesn’t change much; from a manual point of view it can make it easier when they see this is also in the disavow file. So that makes it a little bit easier from the manual point of view, but if you’re looking at something that’s purely based on algorithmic actions, that’s not going to really change much.”


    That’s pretty clear. In cases where a link is simply disavowed, they will ONLY drop it once those pages have actually been crawled again!

    So what’s the point of the disavow tool? It certainly helps with manual penalties, and specifically when dealing with Negative SEO (one of the main reasons it was launched), but the fact remains there are really different cases of how it is being used, and the messages coming out of Google suggest that EVERY EFFORT should still be taken to actually remove bad links…
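    For reference, a disavow file is just a plain-text list: one URL per line, a `domain:` prefix to disavow a whole domain, and `#` for comments. A minimal sketch of generating one after a cleanup (the URLs and domains below are made up for illustration):

```python
# Sketch: build a disavow.txt in the plain-text format the disavow tool expects
# (one URL per line, "domain:" prefix for whole domains, "#" for comments).

def build_disavow_file(urls, domains):
    """Return the text of a disavow file covering the given URLs and domains."""
    lines = ["# Disavow file generated after a link cleanup"]
    lines += [f"domain:{d}" for d in sorted(domains)]  # whole-domain entries
    lines += sorted(urls)                              # individual toxic URLs
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    urls={"http://spam-forum.example/thread?id=42"},
    domains={"link-farm.example"},
)
print(text)
```

The file is then uploaded once through the disavow tool in Webmaster Tools; as John Mu notes above, it only takes effect per-link as each page is re-crawled.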

    Take this question and answer; it starts at 3m16s in, but the juicy part is at 5m20s:

    Q) If you have a thousand links that you have DISAVOWED, and all other circumstances are equal, is there still a negative thing hanging around your neck because of those links?

    John Mu: “Generally, if you leave these issues there and you’re aware of them, that’s something that could potentially cause problems further down the line. So it’s not the case that you can say, ‘oh well, these 1k links are already ignored by Google’s algorithm, so I’ll just focus on getting users to link maybe 100 more times to my website and that might even that out again.’ THAT’S NOT SOMETHING THAT WORKS on that level. So if you’re aware of a problem, I’d always try to clean that up as much as possible, so that you don’t have to worry about whether or not this becomes a bigger problem in the future. Using the disavow tool, or having those links removed completely, is a great way to handle that, but I wouldn’t assume that these are always handled in a specific way, and I wouldn’t assume that if you leave these problems there, they will only be a problem for those specific issues and won’t cause any problems further down the line with the rest of your website. So it’s a tricky situation in that there’s no clear yes or no answer, because every website is different of course, and if you’re aware of a problem I would really work to have that cleaned up as much as possible.”


    It is apparent from the answers here that:

    1) not all sites are treated equally with regard to the disavow tool; depending on the site Google are looking at, they will treat the disavow files differently.

    2) even if someone disavows links and a specific penalty is removed, those disavowed links could still cause other issues with other types of penalties, possibly further down the line!

    SO WHAT DO WE DO? We ALWAYS remove all toxic links we find (not just disavow them), and we make every effort to do so!

    This tedious task used to be very difficult to manage until we built our AUTOMATIC LINK REMOVAL TOOL, which not only emails companies but also POSTS TO CONTACT FORMS and even solves CAPTCHA challenges, ensuring that our link take-down requests get seen.

    With this tool it becomes easy to press companies into taking action. Alongside it we have another very powerful tool which AUTOMATICALLY CONTACTS THE HOSTING companies, demanding action over copyright issues; this too is a very powerful way to get action taken, and our tools make these hard tasks easily manageable.

    Our automatic link removal tools, combined with our other dedicated tools for locating the specific links which cause penalties, make us the clear industry leader in our sector!

    Guest Blogs Under Scrutiny

    There’s been some recent noise about Guest Blogs from Google’s Matt Cutts and John Mu:

    I’ve been warning for 2 years now that Guest Blogs and infographics would turn out to be the directories of the future, and while many have jumped on that bandwagon believing this to be ‘White hat’, they are going to be in for a shock when they read the latest from the two Gunslingers at Google:




    HN Banned From Google

    There’s been talk this week that Google are doing a rolling update which should have finished by the 4th of July, Independence Day (and all that). Well, it’s been and gone and there’s still a bunch of churn-and-burn sites dominating the results. Not least, there are even more of those exploited .tripod domains, which still seem to be as prevalent as the good old Wikipedia. Basically it’s been a mess for a while, but come on, it’s summer and it’s hot in Palo Alto, so we can’t be expecting too much from the interns at the Plex who ate too many ice creams in the Google cafe. That said, WTF? You just threw the fucking baby out with the bathwater.


    Just for reference, I’m in the UK, and these 2 search terms have brought up the site consistently in the top 5 results over the last 4 years.

    Hacker News is:

    1. one of the top 500 most visited sites in the world (according to Alexa)
    2. one of the most influential start up incubators in the world
    3. probably one of the biggest pains in the arse for Google: when things go wrong, they are often exposed on HN, with pretty damaging public relations consequences.
    4. my favorite destination for news (I’m a bit of a fan)
    5. one of the most influential early adopter sites in the world
    6. that’s enough accolades…

    Anyway, if you’ve not heard of it before, you’re unlikely to ever find it now; basically it’s been pretty much de-listed by the dreaded maker of dreams and controller of the world’s moral high ground. Yes, that’s right, we have a war in progress, where one of the most influential sites in the world (among the hacking community) has now fallen foul of the megalithic Google, and in due course there will somehow be a winner. But who?

    Well, let’s put it this way: this won’t be good for Google’s public relations as they try desperately to improve the search results. It looks like in the last few months the dials have been turned up so far (in an attempt to tackle manipulation) that good old-fashioned sites which have done everything ‘right’ are now being caught in the crossfire, with devastating results for those who have dared to build a business on Google traffic.

    I’m not sure what will transpire from here, but these are some basic facts:

    1. HN has never undertaken any sort of manipulation of the results,
    2. Google are clearly penalizing them in the results at this time,
    3. HN have a history of causing Google some embarrassment when things go wrong, and this looks vindictive,
    4. this is not going to end well for Google, as they are clearly the aggressor and cannot hide behind ‘the algorithm is to blame’,
    5. one has to ask: if Google have made some error here, how many others are being unfairly affected by such collateral damage?

    In truth, Google’s war on manipulation has for the most part got the picture right: they generally do correctly remove sites which have attempted to manipulate the results using suspect techniques. That said, something is really wrong with the results when one of the most trusted sites on the net gets whacked for no apparent reason!




    Our visit to Google “Town” – literally, it’s bigger than most towns you come across.

    Percentages of Brand Links, white noise, keywords

    Something we have covered a few times in the past is the percentage split of your links between brand links, white noise, and keywords.

    What we see clearly working best in practice is to have a fairly high proportion of links for the brand. These should account for roughly 30-40 percent of all your links, for example:

    1. brand

    Then white-noise links should account for about 20 percent of your links, with anchors such as:

    1. click here
    2. more info.
    3. more
    4. great site
    5. spassing is great
    6. this (or that)

    Then actual combinations of keyword links should account for the rest, but we like to have a rule that you never use the same anchor text twice! So you end up with a long list of unique anchors:

    1. green widgets
    2. red widgets
    3. pink widgets
    4. round widgets
    5. square widgets
    6. square widgets with round corners
    7. blue boxes placed in a widget

    To see your link density you can use the “Keyword Density Tool” on your tools page. This is one of the most powerful tools on the market for analyzing Penguin and other over-optimization issues.
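    The rough targets above are easy to sanity-check yourself. A minimal sketch, assuming you already have your anchor texts in a list; the brand and white-noise terms below are placeholders you would replace with your own:

```python
# Classify anchor texts into brand / white-noise / keyword buckets and
# report each bucket's share, per the rough 30-40% / 20% / rest targets.
BRAND_TERMS = {"acme"}  # placeholder brand names
WHITE_NOISE = {"click here", "more info", "more", "great site", "this"}

def anchor_distribution(anchors):
    """Return each bucket's share of the anchor list, as percentages."""
    counts = {"brand": 0, "white_noise": 0, "keyword": 0}
    for anchor in anchors:
        text = anchor.strip().lower()
        if any(brand in text for brand in BRAND_TERMS):
            counts["brand"] += 1
        elif text in WHITE_NOISE:
            counts["white_noise"] += 1
        else:
            counts["keyword"] += 1
    total = len(anchors) or 1
    return {k: round(100 * v / total, 1) for k, v in counts.items()}

dist = anchor_distribution(
    ["Acme", "Acme widgets", "click here", "green widgets", "red widgets"]
)
print(dist)  # {'brand': 40.0, 'white_noise': 20.0, 'keyword': 40.0}
```

This only counts raw anchors; it doesn't check the never-repeat-an-anchor rule, which you could add by comparing `len(anchors)` against `len(set(anchors))`.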

    Kind Regards


    Geotargeting Locations and TLD

    There was an interesting post the other day on the Google forums mentioning that Google are now treating the ccTLD .io as an international TLD.

    Google treat some ccTLDs as international, meaning they can be geo-targeted to global locations. You could therefore have a .io domain ranking in the UK results, for example. The complete list of the ccTLDs they treat as international is laid out here:

    Another interesting tool, when you want to look at the Google results in different territories without going through proxies etc., is this geo-targeting tool, where you can set the location and language and see the results for those locations:

    There was recently a post on SELand covering this topic in more detail (though they missed the .io change along with other ccTLDs, and must be publishing from an old list, so don’t use their list; use the link above if looking for useful ccTLDs), and they also missed the useful geo-targeted search results tool above 🙂
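    If you just want a quick programmatic peek at geo-targeted results, Google's search URL accepts the gl (country) and hl (interface language) query parameters. A minimal sketch of building such a URL; note Google may still localize by your IP, so treat this as an approximation rather than a substitute for the tool above:

```python
# Build a Google search URL using the gl (country) and hl (language)
# query parameters to approximate geo-targeted results.
from urllib.parse import urlencode

def geo_search_url(query, country="GB", language="en"):
    """Return a Google search URL targeted at a country and language."""
    params = {"q": query, "gl": country, "hl": language}
    return "https://www.google.com/search?" + urlencode(params)

print(geo_search_url("link audit tools", country="GB", language="en"))
```

Swapping `country="PT", language="pt"` would approximate the Portuguese results, and so on for any territory you want to compare.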

    Kind Regards