The new link disavowal tools seem to create a unique, and certainly unintended, opportunity for SEOs to get short-term AND long-term results via “link laundering.” It looks like a new algorithmic loophole: a spam tactic that resembles the money laundering schemes of the mafia (and of the Russian and Iranian governments). I’d love to hear your comments on this.
The theoretical three-step process to link laundering – as yet untested – is as follows:
1. Use blackhat SEO to rank fast.
2. Build whitehat links as you go.
3. Once whitehat links can sustain your rankings without the blackhat links, “disavow” the blackhat links (wink wink nudge nudge).
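For step 3, Google’s disavow tool takes a plain-text file listing the links to ignore, one entry per line, with `#` marking comments. An illustrative file (the domains here are hypothetical) might look like:

```
# Blackhat links to disavow (hypothetical examples)
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Or disavow a single linking URL:
http://paid-links.example/page-with-my-link.html
```

You upload the file through Google Webmaster Tools; Bing’s tool works from within its own dashboard rather than via a file.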
So link laundering applies the same idea to links that money laundering applies to money. It’s like first making money from drug dealing, then laundering it through Swiss and Caribbean banks and reinvesting the proceeds in legitimate businesses to hide their origin. (Except that the moral issues involved in drug dealing don’t exist with blackhat SEO, as long as you’re not spamming others – buying links, for example.)
The net result is ranking faster than you could with whitehat SEO alone, and for longer than you could with blackhat SEO alone, with much less risk than the normal blackhat approach. The remaining risks are:
a. Disavow too much and drop. But Bing says “you should not expect a dramatic change in your rankings as a result of using the tool,” so the risk here seems limited.
b. Draw attention by disavowing too much, leading to a manual review.
c. Get banned before you can disavow the links… in which case the ban itself might just precipitate the disavowal.
Isn’t this rehashing others’ ideas?
How do you know you’ve got enough whitehat links to disavow the blackhat ones?
I’m not sure. You could:
- Annotate your rank tracking charts as you build whitehat links, to keep track of which really gave you a boost. This works best if you’re building links to deep pages.
- Compare your whitehat link profile to competitors’ profiles. Focus on median and average unique referring domain counts, as well as median and average inbound links. Advanced folks might scrape this data and put it into spreadsheets.
- Run a test campaign for this first to test the effect of disavowing, and then use the test campaign’s data on your followup campaign.
- Split test by running two sites in parallel.
The last two options obviously require double the work, which is a bit of a deterrent…
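The competitor-comparison option above boils down to simple summary statistics. Here’s a minimal sketch, assuming you’ve already scraped link-profile numbers for a few competitors (the domains and counts below are made up for illustration):

```python
from statistics import mean, median

# Hypothetical scraped data:
# competitor -> (unique referring domains, total inbound links)
competitors = {
    "competitor-a.example": (120, 3400),
    "competitor-b.example": (95, 2100),
    "competitor-c.example": (210, 8800),
}

domains = [d for d, _ in competitors.values()]
links = [l for _, l in competitors.values()]

# These medians/means give a rough whitehat baseline to aim for
# before you risk disavowing the blackhat links.
print(f"Referring domains: median={median(domains)}, mean={mean(domains):.0f}")
print(f"Inbound links:     median={median(links)}, mean={mean(links):.0f}")
```

If your whitehat profile is at or above the competitors’ medians, that’s one (very rough) signal you could survive the disavowal.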
What do you think? Is my logic sound? Am I missing something? Please comment below!