SERP Sniffing – A Long Tail Keyword Strategy

by rishil on February 8, 2011

The Art of SERP Sniffing

SERP Sniffing is a technique used by a number of thin affiliates, blackhats and spammers to identify profitable long tail keywords to optimise for. Typically it charts thousands of easy pickings across the SERPs to bring in long term, scalable traffic. I would like to explore this technique and demonstrate how it works, especially since I tried it myself to prove that it can and does work.

However, it is necessary to define the long tail and issues associated with it before going on to the technique itself.

What’s the most difficult part of SEO for the long tail? In my opinion there are two parts that make long tail strategies difficult:

1. Identifying Long Tail Keywords

By definition, long tail keywords make up the multitude of variations in any keyword target campaign. This means that any number of strings may be attached to the original set of keywords / keyphrases to make up a number of subsets, which could further branch off into sub-subsets. Most of these are independently low volume, but combined they make up a huge share of related traffic that sites ought to target. However, because of their low search volume, most of these keywords do not show up in most keyword tools.

[Figure: Keyword Longtail Explained]

How do you identify these? Or do you blindly pursue keyword variations by stuffing your target pages with every variation you can think of?

2. Identifying Ranking opportunities amongst Long Tail Keywords

So let’s say you have somehow compiled a list of X,000,000 Longtail Keyword variations. Well done. Awesome. Your keyword skills rock. But which ones should you try and target first? How easy is it to rank for these? As a rule, long tails tend to be easier to rank for, but assuredly, you won’t be the only person trying to get those rankings either.

How do you work out how quickly you can rank for KW X over KW Y without carrying out detailed SERP scraping exercises to build some sort of value model at scale?

The Problem in Decision Making

So ideally you want to work on the easy ranking keywords first and then worry about the rest. Or you may want to work first on the pot of long tails with higher combined traffic value. Either way you need the data that shows:

  • Ease of Ranking
  • Potential Traffic per LTK (Long Tail Keyword)

Neither of these is easy to define, nor is there a ready guide where you can grab those numbers. So how do you go about defining a detailed long tail strategy that is based on “real numbers” with regards to traffic and rankings?

The way to decide would obviously require real figures, real potential, in order to define priority. After all, isn’t it about profitability? Time is money and all that? Can you really waste time chasing after rankings that don’t actually have traffic potential?

[Figure: Show Me the Money]

The Spammer’s Guide to SERP Sniffing

Warning: This is a HIGH risk strategy that may get you banned, and I don’t actually advise it. The following technique is for educational purposes only, and I do not condone Search Engine Spamming.

As I have discussed in my previous posts on Black Hat SEO and SEO Automation, there are some industrial level methods to drive thousands of rankings fairly easily. This is easier when targeting Long and Very Long Tail traffic; however, the strategies aren’t sustainable, as they are prone to creating “Burn Sites” which may gain short term rankings but no long term sustainability. This is because the Google algo does recognise such sites and penalises them, or “deprioritises” them, in the SERPs.

However, short term rankings and traffic are great too. Not for sustainable businesses, but as research for sustainable businesses. Imagine if you raked in all the relevant data that these “burn” sites gave you, then used it on legitimate sites? That’s what SERP Sniffing is.

Utilising Gray / Blackhat techniques to research SERP weaknesses so as to exploit them for Whitehat Purposes.

So How Does It Work?

Well, to start off with, take your Sets of Two Word Phrases. Categorise them logically, as you would in the absence of data, into their long tail targets.

In essence you could have the following (a quick generator sketch follows these lists):
Phrases:

  • Blue Widgets
  • Red Widgets
  • Pink Widgets

Categories:

  • Phrase + Location
  • Phrase + Review
  • Phrase + Buy
  • Phrase + For Sale
  • Cheap + Phrase
  • Free + Phrase
  • ETC ETC
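
To make this concrete, here is a minimal sketch of the combination step in Python. The phrases and modifier templates are the illustrative ones from the lists above; in a real run you would feed in your own sets, and the Phrase + Location template would expand against a whole list of locations:

```python
# Cross every phrase with every modifier template to build category targets.
phrases = ["blue widgets", "red widgets", "pink widgets"]

# "{p}" marks where the phrase slots in. "london" is a stand-in: in practice
# the location template would itself expand against a list of locations.
templates = [
    "{p} london",    # Phrase + Location
    "{p} review",    # Phrase + Review
    "buy {p}",       # Phrase + Buy
    "{p} for sale",  # Phrase + For Sale
    "cheap {p}",     # Cheap + Phrase
    "free {p}",      # Free + Phrase
]

categories = [t.format(p=p) for p in phrases for t in templates]

print(len(categories))  # 3 phrases x 6 templates = 18 category targets
for keyword in categories:
    print(keyword)
```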

Now you set up an automated site, where you pull in content in some real volume on the Categories and Sub Categories related to your Keyword Sets as identified above. Make sure you scale the operation such that the posts per category are coming in fast and hard once the site is indexed.
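
How you actually source that content is down to your own pipeline; as a rough illustration only (not a record of what my test site did), here is how you might pull raw entries per category from an RSS search feed with the feedparser library. The feed URL is a placeholder – swap in whatever feed source you use (Google Base, news and blog-search feeds were the usual suspects):

```python
import feedparser  # pip install feedparser

def fetch_posts_for_category(category, max_posts=20):
    # Placeholder endpoint: substitute a real RSS search feed URL.
    url = "http://feeds.example.com/search?q={}&format=rss".format(
        category.replace(" ", "+"))
    feed = feedparser.parse(url)
    posts = []
    for entry in feed.entries[:max_posts]:
        posts.append({
            "category": category,
            "title": entry.get("title", category),
            "body": entry.get("summary", ""),
        })
    return posts
```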

Use a “burn” link network to scale up backlinks (these only work in the short term and are also easily penalised).

Using your analytics, you should be able to identify keyword combinations that start driving traffic – in my experience such sites die out within 2 to 14 days – so you need to run daily exports of keywords and run ranking reports against them.
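
A minimal sketch of that daily loop, assuming your analytics package can export the day’s referring keywords to a CSV with a keyword column, and that get_rank() is wired up to whatever rank checker you use (both the export format and the helper are assumptions, not any specific tool’s API):

```python
import csv
from datetime import date

def get_rank(keyword, domain):
    """Placeholder: hook this up to your rank tracker / SERP checker of choice."""
    raise NotImplementedError

def daily_rank_report(analytics_export, domain, out_path):
    # Collect today's referring keywords from the analytics export.
    with open(analytics_export, newline="") as f:
        keywords = sorted({row["keyword"] for row in csv.DictReader(f)})

    # Append today's position for each keyword, so the data survives
    # even after the burn site gets penalised and the traffic stops.
    with open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        for kw in keywords:
            writer.writerow([date.today().isoformat(), kw, get_rank(kw, domain)])

# Run once a day (e.g. from cron) while the burn site still ranks:
# daily_rank_report("referrers_today.csv", "example-burn-site.com", "rank_history.csv")
```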

Once the site has been burnt, you now have data:

  • The Keywords that drove traffic
  • Identified SERP positions for such keywords.
  • Ease with which positions were garnered.

Real Example:

I ran this experiment on a site that could in no way be linked to my main sites, keeping the domain registration, domain ownership, hosting etc. all different from anything that could be linked back to me, either via the algo or manual human review. The site is now defunct, and the niche I ran it in is one I don’t work in. This was purely an exercise in experimenting; I didn’t try to monetise the site.

I picked a Keyword Set of 4 two word combinations, further broken down into 6 subsets each, which made 24 categories in total.

The site ran for a total of 15 days from indexation and started bringing in traffic. See the traffic spike:

[Figure: SERP Sniffing Traffic Spike]

The strategy identified over 2,400 keywords that drove traffic to the site in the days it ranked. Think about this: I had an original target of 24 keywords (categories!). Automating these with random yet related content multiplied the keywords into a data set of 2,400. That’s roughly 100 variations per category!

[Figure: Keyword Traffic Rank Breakdown]

Cross referencing these keywords against the SERP rankings showed that over 90% of them ranked on page one. So I have a pot of 2,400 keywords that drove traffic to my site in the space of 14 days, with 60% in positions 1 to 4.
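
As a sketch of that cross-referencing step, here is how you might bucket the rank history collected by the earlier script (assuming its date, keyword, position CSV layout):

```python
import csv

def rank_buckets(path):
    # Keep the best position each keyword reached while the site lived.
    best = {}
    with open(path, newline="") as f:
        for day, keyword, position in csv.reader(f):
            position = int(position)
            best[keyword] = min(position, best.get(keyword, position))

    total = len(best)
    if total == 0:
        print("no rank data yet")
        return
    page_one = sum(1 for p in best.values() if p <= 10)
    top_four = sum(1 for p in best.values() if p <= 4)
    print(f"{total} keywords: {page_one / total:.0%} on page one, "
          f"{top_four / total:.0%} in positions 1-4")

# rank_buckets("rank_history.csv")
```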

Now don’t assume that the traffic that comes in via this method is crap either – check out the page view stats:

[Figure: Traffic Page Views]

Assuming that this is the normal trend of traffic for this category, if I maintained those rankings on a legitimate site, with good quality content for those target keywords and pages dedicated to these LTKs, then I stand to gain [(7,100 / 14) × 365] ≈ 185,107 page views annually. (I am not disclosing traffic data – sorry!)
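
The projection is simple arithmetic – the average daily page views during the test, extrapolated over a year:

```python
test_page_views = 7100                # page views over the 14 days of traffic
daily_average = test_page_views / 14  # ~507 page views per day
annual_estimate = daily_average * 365
print(round(annual_estimate))         # -> 185107
```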

Summary

As I demonstrated above, the technique does work. I don’t advise it, for obvious reasons. However, compare YOUR Long Tail Keyword research with this method – are you surprised that spammers, blackhats and feed affiliates can still profit from the SERPs? Your data is intelligent guesswork. Their data is based on real opportunities. Guess who prioritises limited resources better?


Rishi Lakhani is an independent Online Marketing Consultant specialising in SEO, PPC, Affiliate Marketing and Social Media. Explicitly.Me is his Blog. Google Profile


{ 36 comments }

Alan Bleiweiss February 8, 2011 at 5:56 pm

Dude!

How long did it take to set this up? I can’t imagine it would be something most people could pull off, or that the investment would be worth it in most cases. There are so many ways sites can be refined that result in longer term value out there for the vast majority of sites.

Nevertheless, it’s great to see as an example of thinking outside the box!

rishil February 8, 2011 at 5:59 pm

It took me an hour and a half, and cost £19 all in all…

At scale, if I wanted, I could set up 300 in under 48 hours for about £450.

Alan Bleiweiss February 8, 2011 at 6:03 pm

ok I hate you for that. No actually, I hate programming. And I <3 you for your disclaimers :-)

Thos003 February 8, 2011 at 6:44 pm

So on the US $ exchange rate… how much would you charge me? … Or how much to train my programmer?

Ryan Jones February 8, 2011 at 6:13 pm

Wait, where’s the SEO Panty Sniffing that Alan alluded to in his tweet? I’m let down. I’ve never heard of keyword sniffing though. I’ve always used paid search to accomplish that – but this sounds much much cheaper.

Seriously though, panty sniffing or not, that’s some killer stuff. Perhaps next post you’ll share your black hat link building techniques – like what you did to easily acquire a burn link network? Also, panties couldn’t hurt either.

Joanna Butler February 8, 2011 at 6:51 pm

Setting up test sites – whether white or black hat – is always going to be a plus. White hat takes longer but – unless you’re willing to use your black hat techniques on your main site – could give a better indication of the competition levels etc. relative to your usual SEO strategies.
There are also a gazillion environmental factors that make the test site(s) differ from the main site. History being one. Resources for targeting these keywords are another. E.g. it may be incredibly difficult to get content (via a big corp’s editors or via datasets) for all but just 25% of those long tail keywords you identified. Conversely, one tweak to a site and BAM – you’ve got a whole sub category created to target many more long tail keywords with little effort, thanks to a relatively small site architecture tweak.
What I WOULD love to know Rishi – is how the long tail keywords you successfully ranked for and got traffic for compared with your PREDICTED “easy ranking” keywords given to you by keyword tools/your “guesswork” research and also how they compared to your most valuable/highest ROI keywords (page views != value). Now that would be some interesting stuff…
In essence, you say this is good for research for sustainable businesses, but you use “burn sites” that use page views as a signal of success but then die a death after 15 days :)

rishil February 8, 2011 at 7:05 pm

Lol. The burn sites are for data only. Once burnt, you take the data and port it to the main site…

As you rightly point out, the strength of sites differs, but IF a strong site takes the data from a weak site and replicates it in a white hat way, then the opportunity to rank is in fact higher.

Re the other data, will dig it out and add to the post, will take a while to sift…

Joanna Butler February 8, 2011 at 7:26 pm

Data = #win of course. I’m just saying 2 potentially very different sites (I’m guessing here), 2 different SEO strategies, 2 different sets of KPIs… not much data there I could trust enough to invest my time and money in really. Or tell a client to invest in. As interesting, cheap and fun as it looked to do… :)
P.S. does this mean you’re coding now?! Hurrah! :)

Marty Martin February 8, 2011 at 10:03 pm

That’s awesome Rishi. You work in this industry and you’re continually amazed with the creative stuff folks come up with. Right on!

Cole Whitelaw February 8, 2011 at 7:50 pm

I ran a similar test with some really low rent methods, again spent about an hour in total to see where the decent kw niches were.

Generated about 2,500 variations from a couple of stem phrases modified with cheap, buy, for men etc. etc., then just used a quick Python concatenation to generate a couple of RSS feed URLs (Google Base review search results, Technorati SERPs) for each one. Then generated a couple of sh!tty sites from that and fed in the RSS entries as content over a couple of weeks – appspot was incredibly helpful for indexing all the new pages.

It did about 3,000 visits in 3-4 weeks then died a death.

Again, to echo Rishi this absolutely won’t work to rank long term but dumping out the referral kws and analysing the language patterns is gold.

Terry Van Horne February 8, 2011 at 8:38 pm

I commend you for making sure the risks are clearly explained. Agreed that this is much simpler than it seems… given… you know the tools. The scaling is where the men are separated from the boys. What I do like about the strategy is that in 14 days you have conversion data and, as you said, real data to assess the value if you were, say, going to do a more long term strategy. Kinda’ brilliant in a diabolical way. Gotta think that also helps prioritise the short tail as well. ;-)

Gordon Campbell February 9, 2011 at 9:24 am

Good post. With a few of my clients, the words that convert into sales are ones we didn’t even try to optimise for, mainly long tail.

James February 9, 2011 at 11:47 am

Very interesting post, thanks for sharing it. I think test sites are an absolute MUST for any SEO who wants to push the limits or try something a little bit naughty once in a while. But while you might try this style of test once, there are people out there doing this at scale, by the 1000s a day (whilst I do not know any, I am sure there are lol). It is really crazy, but if money is to be made via CPM or affiliates, it is going to happen.

Jill Whalen February 9, 2011 at 2:55 pm

Very interesting!

Rishi, I’m unclear as to what content you’re putting up, or where you’re getting it, for your burn sites. Is it just some sort of scraped content from a bunch of other sites?

rishil February 14, 2011 at 12:54 pm

Hey Jill! The content was from a cross combination of feeds, news, products, images, etc. Pretty much like blogspam, run through auto spinners…

Paul Argyle February 14, 2011 at 12:30 pm

How do you overcome the Google Sandbox? Putting up a new domain never gets traffic, even if you link to it with powerful authority links?

I think if it’s that easy, you’d be rich already: not only do you have a great set of keyword research, you’d also have a ton of free traffic which you could monetise. You got several thousand unique visitors based on hosting… where they pay $100 per CPA.

I doubt the honesty of this blog post; it seems to be more attention grabbing nonsense in order to promote your own blog.

Nice linkbait sir.

rishil February 14, 2011 at 12:46 pm

Hey Paul, are you serious? You can’t get a “new” domain indexing quickly?

I know affiliates who use exactly this strategy and make thousands daily.

Sir, I suggest you read up on indexing domains quickly. Tip: learn about speed of indexing using blogging platforms, and about the related QDF algo that favours such spamdexing.

MOGmartin February 14, 2011 at 12:48 pm

@Paul Argyle

“How do you overcome the google Sandbox”

Seriously dude, I’ve not seen a site get sandboxed in about 4 years. I regularly (i.e. weekly) buy domains around spiking search terms, get them indexed and ranking (usually takes 3.5 to 4 days).

It’s possible even for high competition terms, using a combination of social signals (manufactured of course) and high PR links (read: trusted PR7/8 links, bought of course).

You rarely see those domains last longer than 2-3 weeks in the SERPs though – but that’s enough for a lot of spikey applications.

Joe February 21, 2011 at 3:35 pm

Good article, wish I had the skills and know how to do this, and all for 19 quid!

@MOGmartin “You rarely see those domains last longer than 2-3 weeks in the serps though” what happens to them after that? Do they get sandboxed or blacklisted by Google?

rishil February 21, 2011 at 3:44 pm

most get blacklisted – some come back though…

Keydaq February 21, 2011 at 9:40 pm

I’m guessing that’s down to the way that Google Caffeine estimates the value first then demotes it later once it’s sorted itself out.

MrSmith February 15, 2011 at 7:54 pm

I’m lost… I get the basic idea though, and it’s a neat concept… whether it’s a white or black hatted approach. Definitely gives people food for thought… Thanks for sharing an interesting experiment, author.

Amit February 16, 2011 at 11:00 am

Hey Rishi,

Some good stuff out there. I am definitely going to “bookmark” the blog for such interesting reads. Not sure how open you would be, but I wanted to know how you got those “quick links”, the content and the content spinner.

Will give the older posts a read too. Good work!

Cheers…

Gareth James February 20, 2011 at 10:30 am

Yep this definitely works, it’s surprising how much traffic you can get from even pushing out reproduced content via RSS. Google can still show your reproduced articles above the original with few external links. e.g. I just did a search for ‘Google brand signals’ to see a scraper site ranking #1 above the original SEOmoz post. Madness. I’ve been playing with a similar method to find traffic volumes by blasting pages (on burn sites) with dodgy links to get as close to #1 as possible before tripping filters, you can then find good terms worth targeting properly. This is often needed where the Google keyword tool gives zero data e.g. adult niches :)

Michael Charalambous March 2, 2011 at 3:31 pm

Nice work there, well…with regards to your analysis of the situation. Obviously the work was spam like. It’s certainly interesting how being a moral citizen with regards to the SEO world & web does not offer benefit over those who are happy to abuse it! Sad really…

Red July 22, 2011 at 12:23 am

@Rishil & @MOGmartin You guys must burn $1000s on test sites a year – it kinda reminds me of that night time scene in Cliffhanger before the bunny gets shot. I guess if you can get your money back with affiliates and PPC referrals within 2-3 weeks then it’s a win win situation. You must have automated log setups keeping track of it every 5 mins – which begs the question: would it be possible to dynamically turn PPC ads on and off depending on your logs?! [IF drop in rank / de-indexed, STOP, PUMP IT UP]
That’s it! You’ve inspired me now! Bring it!

BTW @MOGmartin – Slick seminar at linkbuilding London! When’s the next UK tour?

Sandy September 19, 2011 at 12:42 pm

Are you talking about microsites? I do not understand what to do once I have a list of long tail phrases. Do I need to start link building, or create a small site that contains the phrase in its domain or subdomain name?

andy October 4, 2011 at 11:48 am

I love both you and your tutorials and, if I wasn’t a man, I’d marry you and have SEO-babies with you!!

Nice! I was digging around the idea of burn sites and this is one of the most concise descriptions/how-to’s that I’ve found.

Cheers.
