The Bizarre Case of Recurring Ranking Drops To Page 2

A client’s site ranks for some keywords that aren’t core to any of its themes, and it keeps getting bounced back to page 2. This is my hypothesis of what’s going on.

The page ranks by fluke, really. It’s propped up by site authority and by the links its unique content has attracted. The problem is that the content only tangentially touches on the keywords it ranks for, and doesn’t quite satisfy the user intent behind the query. As a result, the bounce rate is astronomical.

When the content was first published, it ranked around the middle of page 1. Over time, it dropped to page 2, and has since bounced around page 2, mostly on a downward trend. Until a recent link came through, it hovered around position 18.

That link bounced rankings back up to the middle of page 1. Temporarily. It has since reverted to page 2.

Why? I’m betting it’s the bounce rate. The page is better optimized, on-page and off-page, than the competing pages. But it doesn’t quite provide what visitors were hoping for when they typed the search. To be clear, the page’s title isn’t misleading, and should qualify the clicks, but the bounce rate is still very high. My best guess is that the bounce metric is what’s causing the repeated slumps.

In fact, anecdotal evidence from my logs, along with periodic rank checks back when I cared more about ranking for ‘seo’, made me think that the continually changing content on my homepage affected my rank. Now I have an indication that the effect may have been self-reinforcing: the homepage posts that weren’t about seo increased bounce rates. To give you an idea of the extent of this, I had been #3, behind 2 Wikipedia results, and I’m now generally around the top of page 2.
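For anyone wanting to pull the same kind of anecdotal evidence out of their own logs: bounce rate in the usual sense is single-pageview sessions divided by total sessions landing on a page. Here’s a minimal sketch of computing that per landing page from raw hits, assuming you’ve already sessionized your log into (session_id, page) pairs in chronological order per session (the log format and data here are hypothetical):

```python
from collections import defaultdict

def bounce_rate(hits):
    """Per-landing-page bounce rate from (session_id, page) hits.

    Hits must be in chronological order within each session.
    A session 'bounces' if it contains exactly one pageview.
    """
    sessions = defaultdict(list)
    for session_id, page in hits:
        sessions[session_id].append(page)

    # landing page -> [sessions that landed there, bounced sessions]
    landings = defaultdict(lambda: [0, 0])
    for pages in sessions.values():
        landing = pages[0]
        landings[landing][0] += 1
        if len(pages) == 1:
            landings[landing][1] += 1

    return {page: bounces / total
            for page, (total, bounces) in landings.items()}

hits = [
    ("a", "/post"), ("b", "/post"), ("b", "/about"),
    ("c", "/post"), ("d", "/"), ("d", "/post"),
]
print(bounce_rate(hits))  # /post: 2 of 3 landing sessions bounced
```

Note this is only what a site owner can see; whatever Google measures (toolbar data, return-to-SERP clicks) would be computed on their side, not from your logs.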

My feeling is that Google has worked some pretty intelligent user-metrics code into its algorithm. Yes, people have been saying for a while that bounce rates count, but no one really documented what it looked like in practice, nor shared any real proof that bounce rate was a factor in the algorithm. It was logical speculation, at least as far as I’ve read to date.



  1. Interesting stuff - Google has probably been rolling this into their algorithm for some time. It hasn't necessarily been reliant simply on their access to user behavior data from Google Analytics, the Google toolbar or other direct sources either - the simple use of a cookie to track whether the user comes directly back to Google to search again or try another result is telling in itself. The most recent evidence of this I read about was posted over at

    Comment by MikeTek - February 20, 2009 @ 1:49pm
  2. hmmm. How about you try to decrease the bounce rate? Embed a video, try to add images with sentences that include the keywords that you are ranking for (so you don't add them to the text)... According to the information you are giving here it looks like it is just a coincidence. There are too many factors that could influence the changes in rankings (addition of links, the fact that you don't have the keywords on the page). You need to somehow connect the change in rankings with the change in bounce rates and show that you can increase and decrease the rankings by only changing the bounce rate. It seems like you have an experimental system almost set up for that; it would be a pity not to. That is of course if you are interested in testing it out at all and don't have better things to do, like studying, client work, etc. :)

    Comment by neyne - February 22, 2009 @ 3:51pm
  3. I had to go back and read the post again - and you're absolutely right, Gab. He doesn't explain this at all. The bounce rate drops and so does his traffic - but the fact that the graphs correlate really doesn't tell us anything. It could be that Google improved the relevance of the results - so they stopped ranking this site for keywords that were not appropriate. That would probably mean less but more targeted traffic. It's an interesting topic that I'd love to learn more about. The trouble is that most of what we will have access to is anecdotal evidence and nothing empirical...

    Comment by MikeTek - February 22, 2009 @ 5:15pm
  4. If you have automated traffic you can test it out :). Or perhaps you could try buying SU traffic, then redirecting to a SERP where you rank after the campaign is approved, and see if the bouncy traffic drops you lower. Thing is it would probably bounce off everyone, so that's not so good...

    Comment by Gabriel Goldenberg - February 22, 2009 @ 7:57pm
  5. I think the bounce rate has a significant role in SERPs position. You can visit your own site for a large period of time. In this way the average time on site increases and the bounce rate decreases. ;-))

    Comment by optimizare site - May 13, 2009 @ 3:07am
