How Even Established Sites Use Keyword Difficulty aka Efficiency Data

Author: Gab Goldenberg

[Edit: I try to post original stuff but I realize here that I'm adapting "second page poaching." It only occurred to me after writing my post, but when you're done reading this, check out Virante's posts on second page poaching, relevant data collection and an API. ]

While working on a huge site with thousands of pages and keywords, I realized I needed to prioritize which keywords to build links for.

The situation was different from a brand new site's. For a new site, all keywords are equally hard to rank for (except perhaps those in the domain name), since you're not in the top 100 for anything. In that case, you typically prioritize keywords by absolute search volume, then by keyword difficulty.
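That new-site prioritization can be sketched in a few lines. This is just an illustration with made-up keywords, volumes, and difficulty scores (the real numbers would come from your keyword research tools):

```python
# Hypothetical new-site keyword list: (keyword, monthly searches, difficulty 0-100).
# All values below are invented for illustration.
candidates = [
    ("life insurance calculator", 8100, 68),
    ("term life insurance rates", 8100, 54),
    ("whole life vs term", 3600, 40),
]

# Prioritize by absolute search volume (descending), then keyword
# difficulty (ascending) to break ties among equal-volume terms.
ranked = sorted(candidates, key=lambda k: (-k[1], k[2]))

for keyword, volume, difficulty in ranked:
    print(f"{keyword}: {volume} searches/mo, difficulty {difficulty}")
```

The tuple sort key does both passes at once: bigger volumes first, and among equal volumes, the easier term wins.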

Unlike a brand new site, this site already had pages in the top 100 – and especially the top 20 – for dozens of terms. I saw this when setting up rank tracking (for reporting purposes) in Raven Tools. (Normally I prefer Authority Labs for rank tracking, thanks to their great historical graphs. These let you see which links boosted you, assuming the links were published on sites crawled regularly.)

Check page 2 rankings with equivalent keyword search volume for their relative keyword difficulty, using SEOmoz's keyword difficulty tool. (Accessible to SEOmoz Pro members only as of this writing.)

Anyway, Stephan Spencer and the boys at Virante shared a tip: look for pages ranking in positions 11-20, i.e. on the second page, since those are the easiest to push onto page 1.

The site already has loads of authority links from across the web, having been around since the '90s. So what's needed at this point is deep links with a mix of anchor text. In that case, you're only a few links away from page 1 and its traffic.

And as the saying goes, Rome wasn’t built in a day. A site doing 1 million visits/month didn’t get there from trying to rank for “insurance” straight away, but from getting 50 visits a month here and 230 visits a month there. It adds up and snowballs, because as more people get to know you and find you in search results, more people link to you.

(Mike Grehan described this phenomenon years ago as the [SEO] Rich Get Richer, in a piece called Filthy Linking Rich.)

What do you do when you have a few dozen pages on page 2? With similar keyword search volumes? Look at the keyword difficulty. 

So unless there's a tool that gives me keyword difficulty scores in bulk, I'm going to go through the terms 5 at a time with SEOmoz's keyword difficulty tool and see what ranks in positions 11-20 and is worth building links for to top up traffic.
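The page-2 poaching workflow above boils down to a filter and a sort. Here's a minimal sketch with invented keywords, ranks, volumes, and difficulty scores (in practice the ranks would come from your rank tracker and the difficulty scores from SEOmoz's tool, 5 terms at a time):

```python
# Hypothetical keyword data: (keyword, current rank, monthly searches, difficulty 0-100).
# All values below are invented for illustration.
keywords = [
    ("cheap car insurance quotes", 14, 2400, 72),
    ("insurance deductible definition", 12, 1900, 41),
    ("compare home insurance", 18, 2100, 55),
    ("insurance", 85, 550000, 97),
]

# Page-2 poaching: keep only terms already ranking in positions 11-20,
# then order by lowest difficulty first, breaking ties on search volume.
page_two = [k for k in keywords if 11 <= k[1] <= 20]
priorities = sorted(page_two, key=lambda k: (k[3], -k[2]))

for keyword, rank, volume, difficulty in priorities:
    print(f"{keyword}: rank {rank}, {volume} searches/mo, difficulty {difficulty}")
```

Note how the head term "insurance" drops out entirely: despite its huge volume, it isn't on page 2, so it isn't a few-links-away win.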

Like this post? Get my latest posts by email or RSS!
