Part of understanding Blackhat and Hacker Spam is to put yourself in their mindset. And that means asking “How Can I Use This For My Benefit?”

Well, I have been doing a lot of that lately. And one of the genius ideas that I had was full site takeovers. Then I toned it down. Then I thought of smartening it up. And then I thought of scaling it. The result? Using a simple flaw in WordPress plugins, and some clever strategy, I could sit back and game hundreds of thousands of relevant links to my sites, and control them all from one place.



Do This and Face Some Jailtime (source http://www.flickr.com/photos/morganmorgan)

NOTE: This idea is theoretical, and I HAVE NOT actually developed it any further than having a conversation with a few people, including Joost de Valk, regarding the possibility. All my research indicates, and Yoast confirms, that it is indeed possible to create something this devilish. There are other versions of this idea floating around, which I will cover as well.

Side Note: This is HIGHLY illegal. Don’t freaking do it. In fact, I am doing blackhat a disservice; this is downright exploitation and hacking.

A while back, Yoast warned about a really dodgy SEO plugin being pimped out called the Blogpress SEO Plugin. He found that, amongst other devious things:

Next to that, the plugin is kind enough to add a link back to itself on your blog’s homepage, in a hidden div of course, because that’s how smart people roll, right? Luckily, that makes it even easier for Google to find all the sites running the plugin and ban them all in one big sweep.

That is the level of control that you could unleash when you install a third party plug-in to your site.

There are a few things to remember:

  • Apart from basic checks, such as looking for dodgy code like Base64 obfuscation, not much else is done in the way of security review.
  • WordPress plugins that are not hosted in the official WordPress plugin repository aren’t put through ANY paces; this includes “Free” and “Paid” plugins.
  • There isn’t a verification program for plugin legitimacy and safety; anything you install is at your own risk.
  • Just because you have used a plugin for years doesn’t mean that a new update is automatically safe to install.
  • Once a plugin is installed, it can do ANYTHING to your site. Remember the Blogpress SEO example above.


Link To Me! (source http://www.flickr.com/photos/hoyvinmayvin/4116728906/)


So I questioned myself on what I wanted to do. Here I am talking to myself:

  • What we want: Links.
  • What type of links: Thematic, fresh blog links.
  • Are they detectable? Probably, and I’m guessing easily.
  • Boo. So what else do we want? Loads of thematic 301 redirects.
  • Are they detectable? Probably. Not always easy to spot, but still.
  • Crap, so what else can I use that isn’t detectable? rel=canonical. A simple directive that looks fine to the user but means a lot to search engines. Users hardly check their code once it’s live, as long as the page behaves normally.

Note: I have intentionally added a massive flaw to this methodology to trip up people who do ass hat SEO. Learn all search engine directives, what they were meant to do, and what they are capable of. I refuse to elaborate any further, except to say that the method I am highlighting is very limited in its effect if you haven’t got a clue what I am talking about in this note.

If I can create a WordPress Plugin that is really popular, and gets installed on thousands of sites, then by simply inserting a backend piece of code, I can control anything on their site.

Easy. Run PPC Ads. Get it ranking in SEO. Pay people to install it, pay bloggers to review it. Make it a really cool plug-in.

Run a search of the WordPress Plugin Directory. Find old plugins no longer updated.  Select those that have high installs. Offer to buy from original Author. Boom.



WordPress Security Flaw Plugin Dashboard

So above is my theoretical tool management centre.  What does it do?

  • Lists all sites that have the plugin installed
  • Lets me pick which site I want to build links for
  • Finds all tag pages that aren’t blocked by meta robots or robots.txt, per site
  • Themes the site by looking at overall keyword density
  • Lets you pick all the tag pages you want by selection
  • Highlights when the last post under that tag was published
  • Lets you pick either a rel=canonical or a 301 destination for that page
  • One button push :)
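To make the “find unblocked tag pages” step of that list concrete, here is a minimal, purely illustrative sketch (in Python; a real plugin would of course be PHP) of filtering a site’s tag pages against its robots.txt. The domain and paths are made up, and checking meta robots would need an extra HTML fetch per page:

```python
from urllib.robotparser import RobotFileParser

def unblocked_tag_pages(robots_txt, tag_urls, site="http://example.com"):
    """Return the tag URLs a crawler may fetch according to robots.txt."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in tag_urls if parser.can_fetch("*", site + u)]

robots = """User-agent: *
Disallow: /tag/private/
"""
pages = ["/tag/seo/", "/tag/private/", "/tag/wordpress/"]
print(unblocked_tag_pages(robots, pages))  # ['/tag/seo/', '/tag/wordpress/']
```

That is the whole trick: anything robots.txt leaves open is a candidate page to quietly repoint.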

The result could be devastation for all the poor unsuspecting blog owners, or it may not even register on their radar! All you are doing is manipulating their tag pages… If you rel=canonical them, most blog owners won’t even know anything is wrong, except that they may lose traffic to their tag pages from the SERPs.

Unlike ordinary category and actual post pages, I have found that these are the least monitored WP pages. Also, for people who love tagging, you can end up with hundreds of variations, hence more pages per site. And apart from those people who have either decent SEO knowledge, or a decent SEO Plugin, tag pages are often left open to search engines.

However, this exploit could be used on any part of the site, including actual post pages as well as the site home!

Security Flaw - Everyone Has One.


Is this a flaw? Yes. Can it happen to you? Yes.

WordPress, or at least people passionate about it, need to find a way to verify plugins, and maybe create a “Trustworthy” verification for plugin releases. Till then, I will rely on my network of people such as Yoast, freelance WP plugin devs, or my Birmingham based development team, who I have worked with for years, to test and check anything I install on my sites. Other things a rogue plugin could be used for include:

  • Injecting Malware.
  • Cookie Stuffing.
  • Webmaster Tools Takeover.
  • Sale hijacks.
  • Password and Login Hijacks.



WordPress Condom – Source: http://www.flickr.com/photos/nbachiyski/1463351154/

Don’t install any old crap you find on the internet. Check it. If you suspect a dodgy plugin, contact Yoast, who will run a WordPress Plugin Review.

There are many exploits for sites, see this article for instance (by Kristine Schachinger).

Dynamic Meta Descriptions For SEO

I often say that you need to try new things, test new theories, and play with the SERPs as often as you can. After all, if you blindly follow what others do and don’t try your own experiments, you won’t be a competitive SEO. At the same time, read others’ experiments and learn from them, but then try to replicate them yourself.

For example, Shark SEO posted about multiple meta descriptions. Others in SEO, including old school superstars such as Greg Boser and Dave Naylor, have been playing with meta descriptions for years. Greg has even advised up to 5 meta descriptions on a page, if I recall our Twitter conversation correctly (see this tweet from Joost), while Dave has been playing with snippet optimisation for as long as I can remember.

Simple: Show a different, more relevant meta description for a page for different queries. AKA Dynamic Meta Descriptions!

Why? To improve your click through rates of course!

In a nutshell, I don’t take anything as gospel – I try and test as much as I can. In this instance, I did, and it works!

The below is a search for my Brand: Explicitly Me.

Explicitly Me Meta Description


See that meta description?

Explicitly Me is an experiment on exposing weaknesses in Google, Bing & Yahoo. Visit the site at Explicitly.me to get awesome Blackhat Tips.

Now let’s run a search for my name:  Rishi Lakhani

Rishi Lakhani Meta Description


Rishi Lakhani (rishil) is a specialist SEM consultant, working on Paid, Organic, Affiliate and Social Media. To find out more, feel free to get in …

How cool is that? It’s picking this up from my meta description again, and serving the right one for the query.

Here is a screenshot of my meta description, but if you don’t believe me, check it yourself:

meta description snippet


Now that I have validated that you can in fact have dynamic meta descriptions, I feel justified in deploying them for a number of sites and clients. Some example situations in which I would deploy these:

1. Brands Home Page – brand home pages tend to rank for all sorts of stuff, from the brand name to top level generics. How cool would it be to have the home page meta showing a brand message when a brand query is entered, and showing generic offer-led copy when a generic keyword is searched for? You then please both the Brand Police, and your SEO CTR requirements.

2. Top ranking generic pages – although I need to test this a bit more, I also think you can optimise the copy to show smaller variations in the description tag. For example, when a car insurance page ranks for both Cheap Car Insurance and Car Insurance Comparison, traditionally you would optimise ONE meta description for that page to include both keywords. But how cool would it be to show TWO separate ones which are query dependent? E.g.:


Car insurance comparison desc


Cheap Car Insurance Desc
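Mechanically, the multiple-description approach is nothing more than emitting more than one description tag per page. A hypothetical template helper (Python, all names and copy invented; whether an engine honours the extra tags is exactly what you would be testing):

```python
def build_meta_descriptions(descriptions):
    """Emit one <meta name="description"> tag per copy variant."""
    return "\n".join(
        f'<meta name="description" content="{d}" />' for d in descriptions
    )

head = build_meta_descriptions([
    "Compare car insurance quotes from over 100 UK insurers.",
    "Cheap car insurance from just a few pounds a month.",
])
print(head)
```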

Read your peers, keep an eye on what they are trying and testing. See their results, then try and replicate as many as possible, as long as the results they get are something you can use.

PPC As An SEO Research Tool

Paid search gives you many options to test and play with the SERPs and keywords in a way that SEO can’t. I have used PPC on several occasions to make decisions around SEO, and quite successfully. It is often hard for businesses to understand the “value” of SEO, and I know for a fact that most big brands use paid search more extensively than organic search. The budgets for SEO are often a fraction of those allocated to PPC. Yet businesses have yet to understand that paid search budgets have a “one hit” shelf life, while SEO ROI lasts for much longer periods.

In financially motivated businesses, SEO is often undervalued when SEO teams fail to show the profit / ROI centric nature of organic search, and I often end up preaching the benefits of looking at SEO through the lens of a paid search marketer. SEO should not focus purely on the traffic potential of keywords, but on their profit potential. In the past, several studies have shown that PPC gets more budget than SEO.

Google Sprite

Source: http://www.flickr.com/photos/pandaray/2576981899/sizes/m/in/photostream/


Some of the areas that you can use Paid Search for include:

Your meta description and title tags are important from a user click-through point of view. Using PPC copy tests, you can determine which messages get better click-throughs from the SERPs – especially when testing variant permanent title appends. For example, I found that a brand that used “official” in their PPC copy gained a CTR upwards of 5% better than before.

So we switched the PPC messages for the brand keywords to more sales oriented copy, while switching the home page title tag to include the words “official site”. Organic traffic from branded keywords grew while PPC traffic dropped, and the combined traffic improved.

Results? More free traffic for their branded keywords.
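The arithmetic behind that decision is just a CTR comparison between the two copy variants. The click and impression counts below are invented, purely to illustrate the “upwards of 5%” sort of uplift:

```python
def ctr(clicks, impressions):
    """Click-through rate as a fraction."""
    return clicks / impressions

def ctr_uplift(old, new):
    """Percentage-point CTR gain of the new copy over the old.

    Each argument is a (clicks, impressions) pair.
    """
    return (ctr(*new) - ctr(*old)) * 100

# Hypothetical test: original brand copy vs copy containing "official".
uplift = ctr_uplift((480, 10_000), (1_020, 10_000))
print(f"{uplift:.1f} percentage points")  # 5.4 percentage points
```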

Photo Credit: http://www.flickr.com/photos/zrahen/4392138128/


It’s not always easy to determine which long tail or three word / four word phrases to target. Ideally you want them all, but sometimes time and budget limitations won’t let you target all the variants. In my experience, any keyword can have between 10 and 50 variations, especially if you start adding location modifiers to make up phrases.

How do you determine the rate at which to attack certain sub sets of keywords in terms of optimization priority? I use PPC to help me.

Create lists and ads for your target KWs and run them for a decent time frame – for high volume phrases (between 1,000 and 10,000 searches a day) I am usually happy with a two week timeframe. Let those keywords float in ideal positions (1 and 2) on PPC. You will often find that the site converts better on some keywords than on others. There could be many reasons for this, but at least you will find the most profitable key phrases in any subset and can determine your rate of optimization.
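As a sketch of what “finding the most profitable key phrases in any subset” might look like once the two-week test data is exported (all keywords and figures below are invented):

```python
def rank_by_profitability(ppc_data):
    """Order keywords by PPC profit, so SEO effort goes to winners first.

    ppc_data maps keyword -> {"revenue": ..., "cost": ...} from the test.
    """
    profits = {kw: d["revenue"] - d["cost"] for kw, d in ppc_data.items()}
    return sorted(profits.items(), key=lambda kv: kv[1], reverse=True)

test_run = {
    "cheap car insurance":      {"revenue": 2250, "cost": 1400},
    "car insurance comparison": {"revenue": 500,  "cost": 1100},
}
for kw, profit in rank_by_profitability(test_run):
    print(kw, profit)
# cheap car insurance 850
# car insurance comparison -600
```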

PRO TIP: You can use this technique for brand new keywords that the business never considered, to see if they work – you often find what I call “Eureka” keywords that the site never thought of, but which become very profitable.

Photo Credit: http://www.flickr.com/photos/omegaforest/4927169808/


This exercise is similar to the one above, but in this instance you can use PPC to determine ROI and budgets. Now, I am not unilaterally saying that PPC and SEO ROIs are interchangeable; they aren’t. But you can often work out some sort of standard variant where ROI of PPC = (x) × ROI of SEO (at SERP position y). This sort of formulaic working from historical data could help you determine what to expect when certain keyword sets move up in the organic results.
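A toy version of that formulaic working, with invented multipliers (in practice x comes out of your own historical PPC-vs-organic data and varies by position and keyword set):

```python
# Hypothetical multipliers: how organic ROI at a SERP position has
# historically compared to PPC ROI for the same keyword set.
POSITION_MULTIPLIER = {1: 2.5, 2: 2.0, 3: 1.5}

def projected_seo_roi(ppc_roi, serp_position):
    """Estimate organic ROI at a SERP position from observed PPC ROI."""
    return ppc_roi * POSITION_MULTIPLIER[serp_position]

print(projected_seo_roi(1.5, 2))  # 3.0
```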

PPC has a better conversion rate. It’s true that PPC converts slightly higher than SEO in a keyword-to-keyword comparison. But 88% of the traffic comes from SEO and 12% from PPC. Rand Fishkin in his presentation PPC vs SEO

Cross-referencing the spend on PPC against those keywords will enable you to build stronger arguments for the potential uplift in sales from targeting certain SERP positions for these keywords. This in turn allows you to work out the profitability, and hence desirability, of those keyword sets, and to allocate more budget towards acquiring those positions.

In an ROI-centric business, this sort of methodology is essential for working out SEO budgets and actually making the value of SEO visible to businesses.

Almost half of UK companies (49%) are now spending at least £50,000 a year on paid search marketing, up from 45% last year and 39% in 2008, while there has been a significant decrease in the proportion of responding companies who spend less than £5,000 a year on paid search, from 25% last year to 14% this year. Econsultancy Search Marketing Benchmark Report

I have covered some of this over at SEOmoz on SEO Budgeting.

Of course none of this looks at the long term Branding value of generic searches and determining longer term ROI time frames for generic Keywords:

Only 30% of purchases driven by non-branded Internet searches occur within the same online session when consumers conduct an initial search, according to a study by research firm Compete Inc. (source)

Source: http://www.flickr.com/photos/jadendave/5210820423/


As I have mentioned above, SEO should focus on profitability, and one of the key aspects of profitability is where you land your site visitors. For keyword “X”, should you land them on the home page, a category page, or a custom built to purpose landing page?

In paid search, you can test all three variants. In SEO, you may only have the potential to rank one over the others. By the time you realize you have targeted the wrong landing page, it may be too late, or it may take too long to switch. This can be a costly mistake, and short of conversion optimization on the target SEO page, you can’t do much in the short term.

I would suggest using PPC testing to determine the best converting pages per keyword set; you will often be amazed by the results. In one instance I found that the home page was the best converting page for a particular generic keyword, regardless of what we did. In another situation, where we targeted a landing page built for “KW” and “KW + Local Modifier”, we found that across the total set of keywords it converted worse than the page that WASN’T optimized with page elements for the local modifier. (i.e. No mention of the “local” aspect on the page actually helped conversion, but made SEO for the full range of long tail much more difficult.) In the end we had to split it into two pages: one targeting “KW” and another targeting “KW + Local Modifier”.

  • I am not a great believer in PPC+SEO=More (Synergy), but there have been cases of this being reported, and reported often enough to test.
  • Better Coverage of the PPC & SEO Square off in the Battle of the SEM Heavyweights – SES Chicago 2010
  • PPC Blogs – check out David Szetela’s BIG LIST of decent PPC blogs on his blog roll (you should follow him on Twitter too).

When I asked the Twitterverse the question “Q: Do SEOs use PPC as a research tool?”, I was encouraged by the answers, which proved that I wasn’t the only one using the techniques above:

  • staceycav Sometimes…. or at least conversion data from PPC can be useful in assessing potential keywords for a SEO campaign.
  • Chris_Dugdale Yes. Always (where possible).
  • mmhemani Why not, usually people use more then one tool for keyword research and i think using PPC is also a good idea.
  • TomNashUK A: they should do
  • justinparks yes. It helps quickly (and sometimes expensively) answer questions.
  • firstconversion Id like to more, its not very popular with clients tho :P they dont like paying for exploratory things
  • seoforumsorg yes – its the single most valuable source of information to SEO’s.
  • inkodeR I’d hope so – most effective method for KW research available. Can’t trust data from KW tools.
  • Searchmetrics Absolutely – knowing PPC data is important for understanding the competitive nature of a keyword for organic ranking
  • _robh_ if they don’t they should :)
  • marccclevy when I can afford it! Definitely v.useful.
  • LordManley A: Yes. Perfect for honing meta descriptions and titles to maximise CTR, for example.
  • gfiorelli1 I do, especially to understand the real value of some middle to long tail kwds that has “officially” no data statistics.
  • SamuelCrocker A: When data is available/when I can make it so :)
  • tatiana_london i think it is still used, to find keywords with good ratio of conversion/volume, with less competitive SEO then high volume
  • rishi3211us i think most would do.. there’s so much of valuable & proven keyword data in there
  • TonyVerre A: YES.
  • yrewol yes, and if they don’t, they’re missing out on a lot of the “optimisation” bits of SEO
  • PrachiDeshpande I use it as a research tool. Specially, to target best performing PPC keywords for SEO campaign.
  • NeilTompkins Yes and no. I normally use PPC to get an idea of how competitive the phrases are.
  • olivier_amar For sure. It’s a great tool to see which keywords convert and bounce.
  • David_Tapp Yes, as long as we can get the information off of the PPC agency.
  • netmeg I do. (I really respect NetMeg by the way!)
  • souvikmukherjee would have loved to but not always possible, specially when you are working with small businesses. Small business clients usually have stingent marketing budgets so you may not get the luxury of experimenting at your own cost.
  • Adrac_Ltd Yes. So that we can target keyterms appropriatly on the website and in the link building campaign

What We Learnt From A Pills Link Hacker

This post is a first for me. First time there is a guest post (well semi-guest) on this site. It also is my first collaboration with one of my favourite Research SEOs Neyne.  Neyne (Real name Branko Rihtman) doesn’t blog very often, but when he does it is always worth a read. This is a two part post, the first by Neyne, with the second part by yours truly.

My last post was about using WordPress plugin flaws to build links, “aka soft hacking”. However, what we are about to demonstrate is that another open source CMS, Joomla, has just as big a flaw as WP. We didn’t investigate the backdoor, or how it was done; however, we do demonstrate the extent to which it works.

Worse Than Blackhat, Meet The Hacker SEO

Just like the “SEO is Dead” debate, which has raised its lame head at seemingly regular intervals over the past few years, its not-so-distant cousin, the “Whitehat vs. Blackhat” debate, keeps coming back. One has been raging on the popular blogs in the last week or so and, just like its useless relative, this round did not bring any new arguments, nor has it convinced anyone on either side. However, not often does one get to encounter a true blackhat campaign, one that leaves you with no doubt as to whether it is useful or not, nor whether it is illegal or not. Thanks to a tip from one of my SEO buddies, I have taken a glimpse into the eyes of the beast, and it ain’t pretty.

Just before we dive in, I want to make something clear. I don’t usually out websites or SEO techniques. I think that outing is a cowardly practice, done by people who are not capable of outperforming others. Or in the immortal words on one of Aaron’s t-shirts: “I have a very high tolerance for spammers, but a very low one for weasels”. That said, the techniques outlined in this article are most probably illegal (I’m not a lawyer, so I don’t want to be definite on that one). They include hacking into other people’s sites, flagging them as pill-related, squandering their link equity and eventually getting them flagged as compromised in Google SERPs, thus seriously decreasing their CTRs. Asshatery like that should be eliminated, and I feel no remorse for outing it.

It all started with an enquiry from the aforementioned friend about one of his client’s sites. The site seemed to be OK, nothing irregular about it; however, when looking at the Google cached version of the site, a footer appeared:

Pills Footer


This footer does not appear when the site is visited with a Googlebot useragent, so my guess is that this is a case of IP cloaking. The more interesting thing is that none of the sites linked in the footer seem to be V1@6r@ related. They are regular sites on a wide range of topics. So my first thought was that this was a hatchet job: a slimy SEO company trying to get their competitors banned by creating thousands of artificial, spammy links on hacked sites. However, looking at the source code of the Google cache of each of the linked sites, a different picture started to emerge. Check out the differences between the <head> element as it appears on the live site vs. how it appears in Google Cache:


Google Cache Header of Hacked Site

So my next question was whether these sites rank for any of the linked phrases. Almost all of them did. Check out this SERP for [V1@6r@ price] (6,600 global exact match monthly searches):

Ranking for V Price

So here came the head scratching part. It seems like someone is hacking into Joomla based sites, planting links in their footers to other hacked Joomla sites, whose headers are cloaked to show V1@6r@-related keywords. But what is the point? Why would someone send V1@6r@-relevant traffic to totally unrelated websites? Then I clicked through to the site from the above SERP. This is the site I got:

Now you See It


If you go to the site directly, by typing the URL into the address bar, this is what you get:

Now You dont


So not only are they doing IP cloaking, they are also doing referrer cloaking, showing the pill content only to visitors referred from Google SERPs. Here is a partial list of sites, with their original titles, hacked titles, the keywords targeted with footer link anchors, and their rankings on Google.com for those keywords:

Partial List of Hacked Sites


There is one thing that is common to all the websites in question: they have all been created in Joomla. Furthermore, it is easy to target them, as there is a clear indication in their header that they are Joomla based:

<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />
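That fingerprint is trivially matched by script, which is exactly why hardening guides tell you to strip the generator tag. An illustrative check (Python; the sample page is made up):

```python
import re

def joomla_generator(html):
    """Return the Joomla generator string advertised in a page, if any."""
    m = re.search(r'content="(Joomla![^"]*)"', html)
    return m.group(1) if m else None

page = '<meta name="generator" content="Joomla! 1.5 - Open Source Content Management" />'
print(joomla_generator(page))  # Joomla! 1.5 - Open Source Content Management
```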

***********Investigation Ends*************

Search Volumes for v1@6r@


So Neyne has shown you the what, how and why. Hacking this many sites for those rankings isn’t an easy job, unless you pre-build hacker doorways as I demonstrated in the WP Plugin Security fail. The only other way to do this is to run a number of brute force scripts against known weak spots of various servers and CMSs. I want to show you what I learnt from investigating those links with Neyne. Like I said with the JC Penney scenario, when you get a chance to learn, do it.

10 Things I Learnt About The V1@6r@ Link Hackers

1. Old spam tactics still work

A while ago, I wrote about Spam Tactics, Then and Now, where I identified a number of tactics that still work. This discovery reinforces what I learnt back then: old spam tactics don’t die, they just resurface. And Google isn’t really as sophisticated an algorithm as people believe it to be. Some of the points below take this into more detail…

2. Content is not king

None of the sites that we investigated were serving up content that was V1@6r@ related. Quite a few had cloaking, which meant that some content was being shown, but after investigating a number of these sites, not all had redirection or cloaking set up yet; some just had doctored links. So why did they rank for these keywords?

Just links. Links, links and more links. What about great content? Nope. Links.

Using Majestic, let’s look at what the links look like:

Look at all those links! (click to view Majestic data)


3. Anchor text overrules all

Wordle for Links


Relevancy, thematic links, semantic analysis etc. can all go to pot if you are working with large scale access to a link text manipulation system. It doesn’t matter where the links are placed, and it doesn’t matter where they came from.

An advanced analysis of the anchors for some of the sites we looked at produced the Wordle above – you can see how heavy the manipulation is. In raw terms:

Anchor links Count


4. Footer links work

For a while SEOs have been devaluing the relevance of links in footers and common elements – ummm, they seem to work.

5. Sitewide links work

Again, we get arguments that the value of sitewide links has been dampened greatly. Not when you are working in volume, as we discovered when we investigated these sites.

6. Referrer cloaking still works

I think Neyne demonstrated this pretty well above.

That referrer cloaking works is evident from the fact that the hacked sites are ranking even though they serve different content to users coming from Google SERPs.

Another spam tactic from the past, still alive and well.


Scripting, it’s an Art – this one isn’t. (This is a tracking script on one of the sites.)

7. I need to set up alerts

What really shocked me is that these site owners still haven’t realized that they rank for these keywords. If you suddenly rank for, or get traffic from, dodgy keyphrases, it’s time to check WTF is going on. Now, in the case of user agent redirection, analytics will sometimes not record those visits. But they will most certainly show up as high volume impressions if you are signed up with Google Webmaster Tools. AND it has a malware detection feature which is worth looking at once in a while.
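The alert itself could be as dumb as a keyword filter over your Webmaster Tools query export. The terms and queries below are placeholders, kept in the same censored style as the rest of this post:

```python
# Hypothetical watch list of spam-market terms (censored, as above).
SUSPECT_TERMS = {"v1@6r@", "casino", "payday"}

def dodgy_queries(queries):
    """Flag search queries containing spam-market terms."""
    return [q for q in queries
            if any(t in q.lower() for t in SUSPECT_TERMS)]

print(dodgy_queries(["buy v1@6r@ cheap", "wordpress seo tips"]))
# ['buy v1@6r@ cheap']
```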

8. I need to monitor catch-all accounts

Google does try to email the sites that it has flagged:

Site Compromised


Site Compromised on All Accounts


But you need to monitor, and even set up, catch-all email accounts: You can find out if your site has been identified as a site that may host or distribute malicious software (one type of “badware”) by checking the Dashboard in Webmaster Tools. (Note: you need to verify site ownership to see this information.) We also send notices to webmasters of affected sites at the following email addresses for the site:

  • abuse@
  • admin@
  • administrator@
  • contact@
  • info@
  • postmaster@
  • support@
  • webmaster@

9. Edu sites need some serious help

As part of the investigation, I had to scan a large number of SERPs for V1@6r@ related keywords. The most common resulting domain extension? That would be “.edu”. Google, and/or someone else, needs to teach these guys how to secure their sites… It’s not hard to spot the volume of hacking – see this simple query.

Or look at this gem:

edu Ranks for Buy that stuff Cheap

edu Ranks for Buy that stuff Cheap

10. .gov sites are FUBAR

US Gov Search – Uncle Sam

Another common domain extension that shows up in these SERPs is .gov. By the way, did you know Google has an old search page that only looks at government sites? Look what I found through it: http://bit.ly/dOlzKR

SERP Sniffing – A Long Tail Keyword Strategy

The Art Of SERP Sniffing

SERP Sniffing is a technique that has been used by a number of thin affiliates, blackhats and spammers to identify profitable long tail keywords to optimise for. Typically this technique charts thousands of easy pickings across the SERPs to bring in long term, scalable traffic. I would like to explore this technique and demonstrate how it works, especially since I have tried it myself to prove that it can and does work.

However, it is necessary to define the long tail and issues associated with it before going on to the technique itself.

What’s the most difficult part of SEO for the long tail? In my opinion there are two parts that make long tail strategies difficult:

1. Identifying Long Tail Keywords

By definition, long tail keywords make up the multitude of variations in any keyword target campaign. This means that there may be any number of strings attached to the original set of keywords / keyphrases, making up a number of subsets, which could further branch off into sub-subsets. Most of these are independently low volume, but combined they make up a huge share of related traffic that sites ought to target. However, because of their low search volume, most of these keywords do not show up in most keyword tools.

Keyword Longtail Explained

How do you identify these? Or do you blindly pursue keyword variations by stuffing your target pages with whatever variations you can think of?

2. Identifying Ranking Opportunities Amongst Long Tail Keywords

So let’s say you have somehow compiled a list of X,000,000 Longtail Keyword variations. Well done. Awesome. Your keyword skills rock. But which ones should you try and target first? How easy is it to rank for these? As a rule, long tails tend to be easier to rank for, but assuredly, you won’t be the only person trying to get those rankings either.

How do you work out how quickly you can rank for KW X over KW Y without carrying out detailed SERP scraping exercises to work out some sort of value model in scale?

The Problem In Decision Making

So ideally you want to work on the easy ranking keywords first and then worry about the rest. Or you may want to work first on the pot of long tails with the highest combined traffic value. Either way you need data that shows:

  • Ease of Ranking
  • Potential Traffic per LTK (Long Tail Keyword)

Neither of these is easy to define, nor is there a ready guide from which you can grab those numbers. So how do you go about defining a detailed long tail strategy that is based on “real numbers” with regards to traffic and rankings?

The way to decide would obviously require real figures and real potential in order to define priority. After all, isn’t it about profitability? Time is money and all that? Can you really waste time chasing rankings that don’t actually have traffic potential?
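If you did have those two numbers, the prioritisation itself would be the easy bit. A sketch with invented figures (both “traffic” and “difficulty” stand in for the hard-to-get data discussed above):

```python
def prioritise(keywords):
    """Order long tail keywords by traffic potential per unit of
    ranking difficulty (higher ratio = attack first)."""
    ranked = sorted(keywords,
                    key=lambda k: k["traffic"] / max(k["difficulty"], 1),
                    reverse=True)
    return [k["kw"] for k in ranked]

candidates = [
    {"kw": "blue widgets review", "traffic": 300, "difficulty": 20},
    {"kw": "cheap blue widgets",  "traffic": 900, "difficulty": 90},
    {"kw": "blue widgets london", "traffic": 150, "difficulty": 5},
]
print(prioritise(candidates))
# ['blue widgets london', 'blue widgets review', 'cheap blue widgets']
```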

Show Me the Money

The Spammers Guide To SERP Sniffing

Warning: This is a HIGH risk strategy that may get you banned, and I don’t actually advise it. The following technique is for educational purposes only, and I do not condone Search Engine Spamming.

As I have discussed in my previous posts on Black Hat SEO and SEO Automation, there are industrial-scale methods to drive thousands of rankings fairly easily. This is easier when targeting the long and very long tail traffic, but the strategies aren’t sustainable, as they are prone to creating “Burn Sites” which may gain short term rankings but no long term staying power. Google’s algorithm recognises such sites and penalises them, or “deprioritises” them in the SERPs.

However, short term rankings and traffic are great too. Not for sustainable businesses, but as research for sustainable businesses. Imagine if you raked in all the relevant data that these “burn” sites gave you, then used it on legitimate sites? That’s what SERP Sniffing is.

Utilising Gray / Blackhat techniques to research SERP weaknesses so as to exploit them for Whitehat Purposes.

So How Does It Work?

To start off with, take your sets of two word phrases and categorise them logically, as you would in the absence of data, into their long tail targets.

In essence you could have:

  • Blue Widgets
  • Red Widgets
  • Pink Widgets

Then cross each of those phrases with modifier patterns:
  • Phrase + Location
  • Phrase + Review
  • Phrase + Buy
  • Phrase + For Sale
  • Cheap + Phrase
  • Free + Phrase

Now you set up an automated site, where you pull in content in some real volume for the categories and sub categories related to your keyword sets as identified above. Make sure you scale the operation so that posts per category come in fast and hard once the site is indexed.

Use a “burn” link network to scale up back links (these only work in the short term and are also easily penalized).

Using your analytics, you should be able to identify keyword combinations that start driving traffic – in my experience such sites die out within 2 to 14 days – so you need to run daily exports of keywords and run ranking reports against them.
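A sketch of that daily export workflow. It assumes the analytics export is a CSV with a header row and the keyword in the first column – the format and file names are assumptions, not any specific tool’s output:

```python
import csv

def load_keywords(path):
    """Read the keyword column from a daily analytics CSV export.
    Assumed format: one header row, keyword in the first column."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        return {row[0].strip().lower() for row in reader if row}

def new_keywords(today_path, yesterday_path):
    """Keywords that appeared today but not yesterday: these go
    straight into the next ranking report run."""
    return load_keywords(today_path) - load_keywords(yesterday_path)
```

Run the diff daily while the site still ranks; once it burns, the accumulated keyword list is the whole payoff.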

Once the site has been burnt, you now have data:

  • The Keywords that drove traffic
  • Identified SERP positions for such keywords.
  • Ease with which positions were garnered.

Real Example

I ran this experiment on a site that could in no way be linked to my main sites, keeping the domain registration, domain ownership, hosting etc. all different from anything that could be linked back to me, either via the algorithm or a manual human review. The site is now defunct, and the niche I ran it in is one I don’t work in. This was purely an experiment; I didn’t try to monetise the site.

I picked a keyword set of 4 two word combinations, further broken down into 6 subsets, which brought my total categories to 24.

The site ran for a total of 15 days from indexation and started bringing in traffic. See the traffic spike:

SERP Sniffing Traffic Spike

The strategy identified over 2,400 keywords that drove traffic to the site in the days it ranked. Think about this. I had an original target of 24 keywords (categories!). Automating these with random yet related content multiplied the keywords into a data set of 2,400. That’s nearly 100 variations per phrase!

Keyword Traffic Rank Breakdown

Cross referencing these keywords against the SERP rankings showed that over 90% of them ranked on page one. So I have a pot of 2,400 keywords that drove traffic in the space of 14 days, with 60% in positions 1 to 4.
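Computing those proportions from a keyword-to-best-position export takes only a few lines. A sketch with stand-in data (the sample dictionary is illustrative, not my real export):

```python
def rank_distribution(positions):
    """positions: dict of keyword -> best SERP position seen.
    Returns (share on page one, share in positions 1-4)."""
    total = len(positions)
    page_one = sum(1 for p in positions.values() if p <= 10)
    top_four = sum(1 for p in positions.values() if p <= 4)
    return page_one / total, top_four / total

sample = {"kw a": 2, "kw b": 4, "kw c": 8, "kw d": 3, "kw e": 15}
print(rank_distribution(sample))  # (0.8, 0.6) for this toy data
```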

Now don’t assume that the traffic this method brings in is junk either – check out the page view stats:

Traffic Page Views

Assuming this is the normal traffic trend for this category, if I maintained those rankings on a legitimate site, with good quality content and pages dedicated to these LTKs, then I stand to gain [(7100/14)*365] 185,107 page views annually. (I am not disclosing traffic data – sorry!)
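For anyone checking the arithmetic, the projection is just the daily average over the 14 ranking days extrapolated across a year:

```python
page_views_14_days = 7100  # page views over the 14 days the site ranked

# Daily average, extrapolated to a full year
annual_projection = round(page_views_14_days / 14 * 365)
print(annual_projection)  # 185107
```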


As I demonstrated above, the technique does work. I don’t advise it, for obvious reasons. However, compare YOUR long tail keyword research against this method – are you surprised that spammers, blackhats and feed affiliates can still profit from the SERPs? Your data is intelligent guesswork. Their data is based on real opportunities. Guess who prioritises limited resources better?

Negative SEO – 4 Killer Strategies To Look For

Gutted I couldn’t make it to both of the 10:30 sessions – I had to miss Tom Critchlow’s talk, Advanced Analytics for Affiliates. For those of you who know Tom, you know he really gets analytics; in fact he recently posted for the Google Analytics team, no mean feat! I hope to catch up with him later if he is willing to share any tips.

Continue reading “Negative SEO – 4 Killer Strategies To Look For”

Content Farms – The Who, What, Where and Why

The name “Content Farm” kind of describes it perfectly. What a strange concept, isn’t it? Or maybe not. Spammers and BlackHat SEOs have been auto generating low quality content for long tail search engine rankings for a while now. The content farm technique arguably takes this a few steps further by creating better quality (note – still questionable quality), user friendly content for the exact same reason.

Continue reading “Content Farms – The Who, What, Where and Why”

Manipulating Google Suggest Results – An Alternative Theory

Google Suggest is a Reputation Management Nightmare at times. A number of companies have been hit and hurt by results that show up with “Company Name + Scam” for example. The problem with those results is that when users see the suggestion, they are immediately tempted to click on them, as opposed to their original query.

Continue reading “Manipulating Google Suggest Results – An Alternative Theory”