Social media is one of the most difficult things to justify in terms of ROI, because current analytics aren’t well suited to measuring its data. Here’s my proposal for social media analytics and tracking. It’s an approach to use as a foundation for building social media analytics tools, not a tool itself.

27/05/2011 – Update on Social Media Analytics Tools:

Since this post was written in 2008, a lot has happened in social media analytics. For a social media measurement tool, your best bet is BuzzStream. It integrates Twitter with a Customer Relationship Management (CRM) tool so that you automatically track your relationships with Twitter contacts. This ties into a PR and social media tool (or a link building CRM). Read on to find out why this is the best kind of social media analytics…

(Image: Soon to be wed, by Simoty77/SLloydBottom)

Social media is a suite of tools for … socializing.

Therefore, the best measure of success in social media is how many relationships you have and how strong your relationships are.

Personally, I really enjoy linking to other people and/or submitting their stuff to Sphinn because I know it strengthens my relationships with them.

To reprise Sean Covey’s metaphor from his highly successful 7 Habits of Highly Effective Teens (reprised from his dad’s 7 Habits of Highly Effective People?), you’re putting a deposit into your relationship bank account. The more deposits, the richer you are.

The reason regular analytics aren’t suited to measuring this is that they collect clickstream data (word kudos to Avinash Kaushik). Whether we’re talking about logs, JavaScript tags or packet sniffers, the data is what occurs in your visitors’ browsers. Social media analytics need to track what goes on in your friends’ and acquaintances’ minds. (Like the thought police, only different ;) )

That’s only a slight exaggeration. What I’m getting at is that you want to measure your network of friends and contacts. Since I’ve been getting more active and having more success on StumbleUpon, I know that my network is growing in size and, as I submit more content and thumb more up, in strength as well. Is your network a sturdy jungle gym or a flimsy spiderweb?

(Jungle Gym by bazzmc)

If you’re going to measure your social media results, for now, the best you can do is sit down with a spreadsheet and write up your friends’ names and what you’ve done for them recently. Have you:

  • Connected them with useful contacts?
  • Sent business their way?
  • Linked to them?
  • Interviewed them for something?
  • Answered some questions they had?
  • Gotten them ahead in some way, shape or form?
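That spreadsheet idea can be sketched in a few lines of code. Here’s a minimal sketch, assuming a simple CSV log of favors (all the names, dates and the 60-day threshold are my own illustrations, not anything prescribed above):

```python
import csv
from datetime import date, datetime
from io import StringIO

# Hypothetical favor log: who you helped, what you did, and when.
LOG = """friend,favor,date
Sylvain,linked to his post,2008-03-01
Ann,sent a client referral,2008-05-20
Maki,answered interview questions,2008-01-15
"""

def stale_relationships(log_csv, today, max_days=60):
    """Return friends you haven't done a favor for in max_days."""
    last_favor = {}
    for row in csv.DictReader(StringIO(log_csv)):
        d = datetime.strptime(row["date"], "%Y-%m-%d").date()
        if row["friend"] not in last_favor or d > last_favor[row["friend"]]:
            last_favor[row["friend"]] = d
    return sorted(f for f, d in last_favor.items()
                  if (today - d).days > max_days)

print(stale_relationships(LOG, date(2008, 6, 1)))  # → ['Maki', 'Sylvain']
```

The point of the `max_days` cutoff is exactly the recency idea: the list surfaces the relationships you’ve been neglecting, which is far more actionable than a visitor count.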

I emphasize the recency bit. To quote the Japanese proverb, “A kind word can warm three cold winter months.” (Which means you’ll see a higher ROI in Canada.) More seriously, this will help you assess where you stand, because people forget favors quickly and remember injuries extensively. I still remember which kids stole my hockey cards in grade school, and that was over a decade ago! Ask me who gave me a compliment, and I’d be hard-pressed to tell. (Though that may be because I was a bit of a loner back then.)

Another advantage of this approach to tracking is that it’s more actionable than clickstream data. Compare “I haven’t been in touch with Sylvain in a while…” vs “We got 348 visitors today.”

On that same note, I’d like to highlight that I’m far from being the only one discussing this issue. Get some further reading from the following folks, who address social media, ROI, and measurement, though none of them quite in these words, afaik.

Note: While Ann and Maki are pretty close to my thinking, I believe this post adds to the discussion (especially in terms of specifics, and pointing out the obvious which for some reason hasn’t been addressed yet) rather than just rehashing it.

  1. Maki – social-media-networking-and-roi/
  2. MindValley Labs – whats-a-friend-worth-to-you/394/
  3. Ignite (ft. Brian) – social-media-metrics-coming-to-an-algorithm-near-you-part-1/
  4. Huomah – social-media-marketing-is-it-for-you.html
  5. Huomah – the-Value-of-Social-Media-Marketing-Part-II.html
  6. SEOmoz in part – whiteboard-friday-tracking-nontraditional-conversions
  7. Annie at SEOmoz – creative-rss-button-could-it-work
  8. Annie @ home –
  9. Ann’s touched on it elsewhere (or was that just PMs?) but I can’t find it. You get the point though, the gal’s smart.
  10. Yours truly – seo-roi-tops-200-subscribers-case-study-on-feed-analytics-and-poll/ (update: tops 300 😀 )

To conclude, consider this illustration of the ROI you can get from social media.

Guy Kawasaki, when explaining how he was able to launch a site that got 200,000 pageviews its first day, said, “I spent 24 years schmoozing and paying it forward.”

And that, ladies and gents, was my proposal for social media analytics and tracking. Like this person said: Measure relationships, their strength and their growth.

On a related note, you may care to read this post on the measures and value of attention equity: How do you measure attention and what is it worth to you? If you think you’re likely to visit again why not just add my RSS feed to your RSS-feedreader?

Existing social media analytics tools are as follows. Note that they’re mostly quantitative, rather than qualitative.

To analyze Digg, Mashable lists 5 tools.

Marty has a great item on building a reputation monitoring dashboard. He also has buzz pocket mining tools for those interested in mining social media for keywords.

Some other miscellaneous ones include Keotag, for tracking tag use; Boardtracker, for forum conversations in particular; Google Blog Search, which lets you track blogs that mention your desired keywords (e.g. brands, products, etc.); and Radian 6, which Ben has reviewed.

The New Face of Reciprocal Links: Widgetbait

I recently got this email:

Dear SEO ROI Services author,

Our editors recently reviewed your blog and have given it an 8.2 score out of 10 in the Technology category of

This is quite an achievement! [If you say so, then it must be!]

We evaluated your blog based on the following criteria: Frequency of Updates, Relevance of Content, Site Design, and Writing Style. [So quality wasn’t a criterion? I guess I’m finally getting through to people how little quality content matters!]

After carefully reviewing each of these criteria, your site was given its 8.2 score.

We’ve also created score badges with your score prominently displayed. Simply visit your website’s summary page on

[Badge picture was here.]

Click on the “Show this rating on your blog!” link underneath the score and follow the instructions provided.

Please accept my congratulations on a blog well-done!!


Amy Liu


Matt Inman’s widgetbait for his new dating site got pre-emptively wrecked recently, in what most SEOs (see the comments on that post) thought was unfair punishment for someone starting out fresh in a legitimate, whitehat way. If we’re going to talk about widgetbait as spam [because it’s unsolicited and obviously worthless here], this here is the prototype example, not Matt’s fun quizzes and such.
Liked this? Subscribe by RSS.

SEO Research: Indexed … With No Links or Submission

I’ve just seen this in Google on an experiment an acquaintance of mine is running (she doesn’t blog on SEO, hence it being here; she also OKayed me writing this up). A recently registered domain, without having any links pointing to it, is now indexed.

The domain has private registration, though because she didn’t immediately “activate” it with Namecheap, it was possible to see the registrant info prior to activation. The person who registered it has some other trusted, well-ranking sites, so it’s possible that by correlating the whois info, Google expects this will be a quality site and decided to index it.

Being a Namecheap client myself (thanks Smaxor for the free SSL tip), I decided to speak to support. They don’t autosubmit domain names to Google. So submission is not the reason the domain is indexed.

What’s also interesting is that when this domain – a nonsense keyword domain – was registered, there were 0 results for the made-up keyword. Now there are a few besides my contact’s domain; she and I have searched for it a couple of times. Perhaps Google [supplemental?] indexes pages that it wouldn’t otherwise index when it realizes they have unique keywords on them that are being searched for. Of course, that could easily lend itself to abuse, which is something Google’s engineers are wary of when deciding whether a criterion is worth using.

A final possibility I considered is that there’s a trojan on my friend’s computer or mine. Keystroke logging might then have let spammers find these keywords and autogenerate the junk constituting the rest of the SERPs [which, again, weren’t there when she decided on that nonsense keyword]. I think this is the least likely of the possibilities though.

Update: My friend Andrew Shotland has a post featuring over a dozen techniques to get indexed without links or submission!

5 Sure Fire Ways To Win Your SEO Clients’ Hearts

This is a guest post by Arnold Zafra for, a hosting company offering shared web hosting plans as well as dedicated servers. The English is a little awkward at points, but Arnold still shares some valuable tips.

So, you just snagged a deal with a client. Everything seems to flow smoothly and according to plan. You’re able to deliver the goods, but somewhere along the way, your client becomes restless and starts playing hardball.

And you wonder: what could you possibly be doing wrong? Why isn’t the deal running smoothly, when in fact you’re doing everything that you promised to do? How can you re-establish rapport with your client, especially when your SEO deal is still ongoing?

Here are five easy ways of improving your relationship with SEO clients.

1. Check if you’re bringing the goods you promised

If you set a deadline to achieve first page SERPs ranking for your client’s site, check whether you have achieved it. If not and you’re already nearing the deadline, explain to your client your progress. And assure him that you’re going to achieve the agreed target.

If you think that you won’t be able to achieve the set target, explain to your clients why you’re not going to meet the deadline and, if possible, ask for a reconsideration or an extension. Then work doubly hard to make sure that you achieve the target the next time, as early as you can.

2. Explain intangible SEO results/achievements

Use all the available analytics tools to explain to your client what your SEO work has achieved so far for the client’s website. Present site traffic trends, traffic spikes and downturns, and site-related statistics in the simplest terms your client can understand. Your client may not fully understand these results and may even think you are not doing anything, so it is of utmost importance that the client understands the nature of SEO work.

3. Listen to the client’s follow-up requirements

If your client demands some extra output from you, gladly listen and find out whether what is being asked for is achievable and within reason. Avoid replying in a negative way, unless you want to lose your client’s trust. If what the client is asking for is not achievable, explain the prerequisites to achieve it and negotiate whether the client is amenable to amending your SEO contract to reflect the changes in the client’s requirements.

4. Explain what you are doing in the simplest layman’s terms possible

We have to admit that the SEO field is full of technical jargon that only we SEO workers can understand. Hence, to improve your relations with your clients, it is a must that the two of you are on the same level of understanding in terms of general knowledge about SEO terminology.

5. Establish a personal relationship

Many people will tend to disagree with this point, and there’s nothing wrong with that, since SEO is still a business endeavor and client relationships must remain at the business level. But then, the SEO field is still an informal industry, characterized by workers who go against norms that aren’t technically related. So there’s nothing really wrong with establishing good relations with the client, and to achieve those good relations, the relationship may or can extend to a more personal level.

Being friendly towards others still works, no matter how old a business trick it is.

It’s no secret that this auberge de jeunesse in Montreal, the Auberge de Paris (I realize the name is unusual), is a client of mine. For a while now they’ve had issues with their reviews being merged with their sister downtown Montreal hotel‘s reviews.

The problem is that since the youth hostel’s reviews tend to score lower than the hotel’s reviews, on average, the hotel gets lower review scores. The result is that the hotel unfairly gets a bad rap.

(The hostel doesn’t get an offsetting revenue benefit because (i) hostel prices are lower and (ii) the Auberge de Paris doesn’t get shown in the Google universal maps results (in fact, only one youth hostel in Montreal does, oddly enough). )

So what causes these merged review listings?

There are a variety of possibilities, but the one that seems most logical to me is confusing external links. I was recently doing some competitor analysis and found an old link of the hotel’s on a page where its competitors also got links. The link featured the keyword “auberge,” which in French means inn. (An “auberge de jeunesse” is therefore a youth “inn” – a youth hostel.)

(Image: “auberge” anchor text for the hotel’s .com domain)

From having been with the hotel for a couple of years now, I know that there are other links of a similar nature elsewhere, featuring youth hostel anchor text. So I’d say it’s fairly likely that this is the causa causans or at least one of several causes.

The implication, if this hypothesis is correct, as far as SEO goes, is that competitors could manipulate your search marketing results. They could build or buy links with messy anchor text to

  • merge your reviews with their own (for extra publicity on your branded search terms) or
  • merge them with a business in a different industry and thus reduce your sales as customers get annoyed due to the wrong numbers etc.

Note: I’m not claiming credit for the hypothesis – I believe that Mike Blumenthal said it somewhere only I can’t find where (it might also have been Mike Belasco or Andrew Shotland ?). If you know who’s behind the idea, do let me know in the comments. You’ll get a link like Antonio from Marketing de Busca earned.

Another possibility is co-citation with other hostels. As I mentioned, that link is far from unique. So it’s possible that Google kept seeing links in a youth hostel context and the co-citation led it to think that the auberge and the hotel are one and the same. Aaron Wall used to see his name offered up in SERPs as a related search for Traffic Power when he got sued by them; his hypothesis is that it was due to many common citations with TP.

A related but slightly different possibility is that the links to the hotel featuring the auberge keywords are often accompanied by the hotel’s pricing. Ergo, it’s really one and the same product just operating under two different names. The cause, in this case, could be described as a more advanced form of co-citation that is similar to how crawlers look for address info on a website for geolocation purposes. This could be perceived, as Mike Blumenthal suggested, as an attempt to avoid mapspam. “It is possibly a spam control strategy to prevent multiple optimized listings from the same business.”

A fourth possibility is that the two businesses share the same front office. People staying at either the hotel or the hostel check in through the same front office. In fact, both websites cite the same address. Publishing your address on your site is a common local SEO tactic, and this could easily be the source of the trouble.

Except that it’s not up to the Hotel and Auberge de Paris to rectify this problem – it’s up to Google. It would be totally unreasonable to have this small hospitality business add another front office and associated staff just for the sake of changing the address listed on the site for Google. Not to mention that Google is always encouraging site owners to act like it doesn’t exist. But then, maybe I’m being shortsighted; does anyone have a workaround to suggest/share?

What do you guys think the problem is? If you’re a Googler and you read this, some help/rectification/clarification/separation of the listings would be great. And please don’t tell me to try this in Google groups – that’s proven completely useless. Besides that, if you guys liked this post, you might care to subscribe to my RSS feed (link goes to XML, not Feedburner).

Update: I forgot to mention that Mike Blumenthal covered this earlier:

Unfortunately, no word from Google on the real cause of the problem there nor have I seen it since. Hello? Webmaster communication effort? You guys there?

Internal Link Building Update

Author: Gab Goldenberg

A few of you have been kind enough to report bugs and issues with the internal link building plugin. I’m currently aware of the following issues. They’re on the to-do list to be fixed and/or improved. I wasn’t sure whether to hide these bugs or be upfront about them, but I figure it’s better to fess up and be transparent than whatever might happen in the alternative.

Keep in mind, however, that some problems will be specific to you personally. Try de-activating other plugins before you come to report a problem. If that doesn’t work, then please, I really, really want to hear from you!

Also, if you’re on version 1 of the plugin rather than version 2, get version 2 now, as it fixes a common bug (see 2 below) and may address yours. You don’t need to worry about losing saved keywords; they’ll be there when you install the new version. This is per my testing on a 2.5x WP install on this site.

1) Apostrophes are not being accepted as part of a keyword. Problematic for anyone who wants to use a possessive (e.g. mark’s internet marketing) and for many Irish folks.

2) Words that are set as global keywords get linked when they appear within URLs. E.g. “” appearing as the text will have ‘cats’ link to the page set for ‘cats’ and ‘dogs’ link to the page set for ‘dogs’. This bug also affects trying to set a keyword like ‘idea’ to link to certain URLs. THIS BUG IS FIXED IN THE LATEST VERSION, courtesy of Chris Balicki of Web Systems SEO / reklama internetowa. Download and install it if you have any problems.

3) Linking from the destination page to itself. I.e. my SEO services page links to itself. While nothing is technically broken, it’s kinda dorky and shabby usability for less-savvy users who won’t look at a link’s destination before clicking.

4) When words appear within blockquotes or lists they’re not getting linked. I checked this myself and there’s no consistency in the plugin; sometimes the links appear, but mostly they don’t.

5) It looks like the randomize functionality duplicates links. Per Chris Cemper: “If we specify 6 links and place 3 per post, we certainly want 3 unique links, not 3 times the same ones… that would look spammy/unnatural.” I personally could not replicate this, so I don’t know what to do about it.

6) On a vanilla WP 2.6, Chris got this problem:

Warning: array_merge() [function.array-merge]: Argument 0000001 is not an array in C:\!work\!testbed2\wordpress26\wp-content\plugins\internal-link-building\Internal_Link_Building.php

Err, crap. Anyone else get that/know why that’s happening?

7) There’s no facility to do this in bulk. I’m going to speak to my dev about having some CSV import/export functionality so that multiple keywords/links can be done in one go.
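As an aside on bug 2 above, the usual way to avoid linking keywords that sit inside URLs or tag attributes is to only rewrite the visible text between tags. The plugin itself is PHP; this is just a minimal Python sketch of the idea (the example HTML and URLs are made up for illustration), not the plugin’s actual code:

```python
import re

def link_keyword(html, keyword, url):
    """Link the first visible occurrence of keyword, skipping anything
    inside HTML tags (so 'cats' in href="http://example.com/cats" is
    left alone)."""
    # Splitting on a captured tag pattern leaves plain text at even
    # indices and the tags themselves at odd indices.
    parts = re.split(r"(<[^>]+>)", html)
    pattern = re.compile(r"\b%s\b" % re.escape(keyword))
    for i in range(0, len(parts), 2):  # even indices: text between tags
        new, n = pattern.subn(r'<a href="%s">\g<0></a>' % url,
                              parts[i], count=1)
        if n:
            parts[i] = new
            break
    return "".join(parts)

html = '<p>I love cats. See <a href="http://example.com/cats">this</a>.</p>'
print(link_keyword(html, "cats", "/cats"))
```

A fuller fix would also track whether the current text sits inside an existing anchor, to avoid nesting links; the tag-splitting trick above only handles the URL-in-attribute case.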

Other requests I’ve had:

1) Simple Tags/another plugin compatibility. If I wanted to make my plugin compatible with every other plugin out there, it would be a money sink. Sorry, but this isn’t happening.

2) Nofollow functionality (e.g. make a link use nofollow). I’m not sure why you’d use a link building plugin and want it to nofollow links too. If enough people chime in in the comments, I may have this added. I’m a lot more likely to do this if you offer monetary support through PayPal. But I think you guys could stop citing Wikipedia so often (you know who you are) and use the Wikipedia-nofollow plugin if that’s the concern.

3) Ability to set multiple keywords to link to a particular page. I’m thinking this will be integrated with the bulk feature above. At first, I was reluctant but I just set up about 60 phrases using the tool and it can get tedious if one of your pages targets multiple keywords…

Finally, if you guys would like to help accelerate the development of the plugin and the fixes named above, please leave a comment and what you’re willing to contribute. I’ll email you a money request through Paypal. Supporters will be acknowledged.

How Do I Make The Logo The “Second Link”?

Author: Gab Goldenberg

Reading the post Common SEO Mistakes: CSS Image Replacement, I found myself criticized for making the logo here an H1 tag, as well as for making it the first link.

First, I recognize my mistake on the H1 point.

That was done years ago when I didn’t appreciate fully the distinction between sites and pages and wanted to rank for SEO on the homepage. The H1 is acceptable for that purpose, but it’s a sitewide element – other pages are also getting “SEO ROI” as an H1.

Oops! Thanks Boston SEO !

Second, the first link question needs some context for newer SEOs. Various studies have been done (eg by my friend Branko, the SEO Scientist) on whether all links on a page pass value or just the first.

Some say only the first does, others say all do, and maybe the first passes more value. So it’s best to make the logo the second link, after a more anchor-text-optimized link.

Well, first, I don’t recall the studies in depth, but I think the context was external – not internal – links. Second, Branko’s testing says all links pass value. Third, I can spend my time on more productive things.

If I couldn’t spend my time better and did believe that internal links, specifically the first one, pass more value, then the solution is simple: Use CSS to position the logo in its place, but have the code for the logo appear below the first properly anchor-optimized link on the page.

If you liked this tactical discussion on SEO ideas, add my RSS feed to your reader for more! And check out my upcoming advanced SEO book – you can download some free chapters!

How Navigation Peekaboo Converts SEO Traffic Better

Author: Gab Goldenberg

Standard conversion advice says remove navigation from the landing page (at least for lead generation landers). Standard SEO says use links. The next best thing would be to put the navigation out of sight, in the footer, and have your calls to action above that so visitors won’t use your navigation to leave your page.

But if you put the nav in the footer, it might get less search engine trust, precisely because folks don’t use footer navigation. What’s a conversion-minded SEO to do? Here are 4 options for playing navigation peekaboo with search engines and humans to convert your SEO traffic better. Blackhats use some forms of this, but I think I’ve thought of some original twists too.

Cloaking Update – Since this wasn’t 100% clear, cloaking is disapproved in most cases by search engines. Use this at your own risk.

Step 1. Cloak the page to search engines and show them the navigation bright and early.

Step 2. Show humans the navigation waaaaay down the page, which they mostly won’t get to because your awesome calls-to-action are getting them to click before reaching the footer.


This avoids the risks associated with cloaking, but will likely hurt your conversion rate.

Option 1 Location Rotation: Change the navigation’s position on the fly depending on the referring source. If it’s a search engine, the links are where they should be. For everyone else, the navigation is in the footer.

Option 2 Camouflage: Have the navigation blend in with the background (e.g. white-on-white text) or else have the navigation “collapse” (e.g. what happens when you click minimize – the minus/dash/underscore – on a regular browser window) automatically when humans visit. I’d guess that white-on-white can be easily detected by search engines though, so I’d be a little wary with that one.

Option 3 Peekaboo: Another possibility I just thought of is to use DHTML or complicated JavaScript (i.e. illegible to search engines) to immediately place a background-blending graphic over where the navigation would normally appear. Kind of like playing peekaboo – you put your hand in front of someone’s eyes, and then other people can only see your hands.

Author: Gab Goldenberg

Are domain names the internet’s real estate? Can keyword research be considered intellectual property? I put these and other questions to Eric Goldman and Mark J Rosenberg, both of whom are speaking on SES San Jose’s legal panel. (Clarification: This is an interview, not coverage of an SES session.)

(Image: Jason Burrows)

Note: This post only offers information. It does not offer legal advice, counsel or opinion, and should not be relied upon for such. Speak to a lawyer who is a member of your local bar if you need counsel.

1) Are domains really an equivalent to real estate? I asked my real estate prof that and he highlighted a variety of reasons why they aren’t. Ex.: No rights-of-way, usufruct, and other such items.

Eric Goldman (EG): No, domain names are not virtual real estate, and I always chuckle when I see the domainers bloviate about this analogy. Personally, in almost all cases, I’d rather own the top spot on Google for a keyword than own the I expand on this point more in this lengthy article: [Requires some kind of subscription; password confirmation email not hitting my inbox so I’m not linking to them. Talk about SE friendliness… Abstract looks interesting though!]

Mark J. Rosenberg: He [your prof] is right. Domains are like any other personal property. No different than a baseball bat. They can be bought and sold without any special terms and conditions. If used incorrectly, they can cause damage.

2) Can SEOs be considered fiduciaries/trustees for their clients, at least with respect to the site? What criteria would militate in favor and what criteria would oppose such a characterization? Is implementing non-best practices a breach of trust that can constitute a cause of action?

EG: No, I think the analogy to fiduciaries or trustees is way too strong in most cases. In some cases, SEOs can be “agents” of their clients and have the power to bind their clients to legal commitments, but this is probably not the case with the standard SEO-client relationship.

EG: However, like any professional service provider, SEOs could face malpractice or related claims for botching their jobs. I’ve never seen an SEO malpractice claim but I suspect they will come in the future. Even then, the standard for malpractice will be loose (i.e., general incompetence will probably not be deemed malpractice; it will take a higher degree of incompetence to be actionable).

EG: In all cases, an SEO who does a lousy job will develop a poor reputation and suffer some economic consequence accordingly. Also, this issue becomes less important to the extent clients are paying performance bonuses or other compensation tied to results.

MJR: [Whether the SEO is a trustee or not] is most likely a contract specific issue. Unless specified in the contract and/or SEO given special responsibilities, the answer should be no.

MJR: Some of the criteria that would militate in favor are handling money, extensive control and/or operation of the site including content. Those that would oppose such a characterization include an SEO acting as a consultant merely providing advice, and having limited or no control over site or its operation.

MJR: [As to breaching a trust by forgoing to implement best practices,] This is case-specific. For example, in some situations, there may be a legitimate reason for not implementing a best practice. Otherwise, I would say that depending on the extent of the violation, there may be a claim for breach of contract.

3) Keyword research – can it be considered as intellectual property? If an affiliate manager gives yours away, or a search engine does… are they violating your rights?

EG: It’s possible for the insights derived from researching keywords to be protected under IP. Most likely it would qualify as a trade secret (if it qualifies as IP at all). In order to remain a trade secret, the SEO would have to require everyone who has access to those insights to agree in writing to protect their confidentiality. This can be tricky–especially with the search engines, who aren’t promising anything!

EG: Note, however, that depending on an SEO’s contract with its client, the contract may transfer ownership of the SEO’s work (including their keyword research) to the client. Clients are often as concerned about confidentiality as the SEO, so this can set up some conflict during the contract negotiations. SEOs really interested in protecting their work product, including their keyword research, need to retain an attorney to help them build the proper legal
protections into their contracts and operations.

MJR: It can be considered a trade secret if it is treated as such. If given away without permission, there is a definite violation, provided that it was a trade secret in the first place.

4) What is the standard of care SEOs need to employ in carrying out work for their clients?

EG: As I mentioned, I don’t think the standard of care has been defined in the courts, and I expect it will take some time before it is. In all cases, the legally defined standard of care will be less rigorous than the marketplace expectations of SEOs’ ability to deliver results. Add value to your clients, and you will be richly rewarded. Chunk it, and your phone will stop ringing.

MJR: There is no special standard. It is to abide by contract terms to the best of their ability while governed by the usual implied covenants and standards under the UCC [Uniform Commercial Code, a US inter-state legal convention aiming to harmonize signatory states’ commercial law].

If you liked this post on search marketing and internet law, get SEO ROI Services’ RSS feed.

16 Content Network Guides & Tips

I’ve been reading up on the Content Network before launching my first campaign there in years. The following is a list of posts I’ve found useful and the key takeaways. They may repeat and overlap in some ways; that repetition is summarized below. The links are accompanied by notes on what’s unique in each guide.

(Google AdSense unit image courtesy of Frank O’Dwyer.)

All these guides broadly suggest:

1) Don’t just copy search network campaigns to the content network

2) Your content network ad groups should be formed with modifiers added to your base keyword/phrase. E.g. Air Jordan basketball shoes | Nike basketball shoes | Nike high-top basketball shoes

3) The ads have to shout, not speak politely. This is because people are in the info-gathering mode, not search mode. You need to INTERRUPT!

4) Content quality score = CTR

5) Individual keywords don’t trigger ads; it’s the ad group theme that determines where ads are syndicated.

6) The corollary to #5 is that tracking happens at the ad group level. If your campaign is small in keyword #s, you can have 1 keyword per group to see what keywords perform best.

7) Run placement reports regularly to see where ads showed up. Cut the low CTR and low conversion rate sites.

8) Use negatives liberally, but be careful not to apply them in the wrong context. Should the negative be at the ad group or campaign level?
– Several unique keyword research tips there, as well as bidding advice. “Don’t be a wimp!” [regarding bids].
– His last point on pruning and reviewing sites from the placement report is juicy.
– Clever point on bid management for content towards the end, regarding placement targeting (aka site match).
– Section on multimedia ads and what niches they work for makes sense.
– Couple of tool links, including a free one, w0000t!
– Case studies I found around
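As an aside, point 2 above is easy to script: combine a base phrase with each modifier to get a tightly themed keyword list for one ad group. A minimal sketch, with illustrative phrases:

```python
# Build a themed content-network ad group from a base phrase plus
# modifiers, per point 2 above. All phrases are just examples.
def build_ad_group(base: str, modifiers: list[str]) -> list[str]:
    """One keyword variant per modifier, prepended to the base phrase."""
    return [f"{modifier} {base}" for modifier in modifiers]

keywords = build_ad_group(
    "basketball shoes",
    ["Air Jordan", "Nike", "Nike high-top"],
)
print(keywords)
# ['Air Jordan basketball shoes', 'Nike basketball shoes',
#  'Nike high-top basketball shoes']
```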

This wouldn’t be a valuable bit on content network ads without linking to Dave Szetela’s articles, free book, etc.

I also found this SEW article useful for having some nice tool links at the end, though I haven’t tested or compared vs the free one listed by Tommie.

Lastly, I’m currently trying out PPC Blog’s private community and getting some great, fast responses.

If you liked this post on PPC, add my rss feed to your reader for more!

3 New Uses For the Wayback Machine

Besides cloning expired sites (being sure to buy the rights to stay legal, of course; hat tip Stephan Spencer), the Internet Archive’s Wayback Machine has plenty of uses. Here are some I’ve found.

1) Has the ownership of the site changed? Whois is not the be-all-end-all. If Google can destroy sites’ SEO based on radical changes in content, well, perhaps you’d be best advised to know whether that’s happened before based on large differences in content.

Why do you care? Because past history will affect the site’s trust in the SE algos. Slapped before, it’s probably more likely to get nailed again in the future. Otherwise, your leash is a little longer.

On a related note, my subscription to the premium drops tool (I was testing it out, hat tip to Todd Malicoat, if memory serves) just renewed automatically. I hate it when that happens. So: If you want to research any dropping domains/make some purchases and you’re not ready to buy the service yet, let me know what data you want and I’ll get it for you, free – don’t spend your money. If you’re a serious player, you might want to consider my website buying services instead though.

2) What was the page’s old URL before we launched the new site? My online jewelry client recently relaunched with a new backend and while we work out the kinks, analytics still needs to be used to direct strategy. Knowing what the former URL for the checkout, category or product detail page was can come in handy!

3) Was this test already run before? If you keep the same landing page URLs, for example, you can look at the old ones before you signed a new client to avoid duplicating work that’s already done. Some organizations keep good records, others have people relying on the memory of an inhouse expert or agency. When one or the other goes … there’s no one to answer the question. Note: This works better if the page is indexed. Of course, you can do this for all tests, not just landing pages.

* Lesson to the wise: Don’t be reliant on one or two people to track what you’ve learned over time from testing. You paid for that knowledge and should have access to good records explaining:

  • What was done: e.g. 2 headlines and 3 hero pics for 6 total variations,
  • Why that test was run: e.g. to see whether emotional or logical appeals mattered most and whether beauty or strength or product features mattered most, visually
  • What the results were: combo x converted best
  • Hypotheses to explain the results: people buy pens based on price; people buy pens based on comfort grips, multi-color changers and fancy packaging (features)
  • Next test to verify those hypotheses.
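For what it’s worth, those five record fields could be captured in a simple structure, so the knowledge survives staff turnover. A hypothetical sketch (the field names and sample values are mine, not any particular tool’s):

```python
# A minimal record structure for archiving test knowledge, mirroring
# the five bullets above. Field names and sample values are made up.
from dataclasses import dataclass

@dataclass
class TestRecord:
    what_was_done: str     # e.g. "2 headlines x 3 hero pics = 6 variations"
    why_it_was_run: str    # the question the test was meant to answer
    results: str           # e.g. "combo X converted best"
    hypotheses: list       # candidate explanations for the results
    next_test: str         # follow-up test to verify the hypotheses

record = TestRecord(
    what_was_done="2 headlines and 3 hero pics, 6 total variations",
    why_it_was_run="emotional vs. logical appeal; beauty vs. features",
    results="combo X converted best",
    hypotheses=["buyers respond to price", "buyers respond to features"],
    next_test="isolate price messaging against feature messaging",
)
print(record.results)  # combo X converted best
```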

Note: My friend Claude Malaison of Emergence Web suggests that internal blogs can serve this purpose as an “internal memoir/archive” for an organization, in his chapter of “Why Businesses Should Blog” (“Pourquoi bloguer dans un contexte d’affaires”), a 10-chapter book with as many authors.

Also, the Internet Archive created Archive-It, an archiving service that can perform this function for you. Use their contact form to get a quote/get started. Sadly, their usability is not the greatest and there’s no clear call-to-action or obvious pricing page or anything you’d expect/want from a site trying to sell you something. Ironic, I know.

Update: From John Andrews, competitive webmaster extraordinaire, a tip that might be useful to those looking to avoid the competition telling how they’ve evolved or perhaps hide any debatable tactics they employed in the past (or even avoid ending up cloned in the future 😉 …):

“In robots.txt every single time:

User-agent: ia_archiver
Disallow: / ”

Like this ‘big ideas‘ post? Get my RSS feed already, sheesh!

Buying Text Links – Pre-Published vs. Post-Published

Author: Gab Goldenberg

This is a guest post by Brandon Hopkins, a freelance Fresno website designer who blogs at

The most common way to buy links is to find a site that shows up in Google’s index, then contact the owner asking them to add your link in exchange for monetary compensation. What is often not considered is that published pages don’t change very often.

If you have a page that was published 6 months ago and the only change that page has ever seen was adding an <a href…> tag, isn’t that a red flag? I know it would be to me! Since this is content that is already online, we’ll call this a post-published link purchase.

A better, but often overlooked, way to buy links is in pre-published content. Buying links on pre-published pages takes a little extra effort, because you contact the site owner/webmaster/blogger based on their past articles but ask for a link in an upcoming article. Since this content hasn’t yet been seen by Google, it won’t raise a [why-is-this-otherwise-stable-content-suddenly-changing-] red flag when the post includes a link to your website.

A few concerns with this method:

1. Posts with a single outgoing link are dangerous. Experienced link buyers know that you can quickly get in trouble if your link is the only outgoing link on a page.

Make sure the pre-published page has a few outgoing links in addition to yours.

2. Paid link outers. Just as with any link purchase, you’ll encounter a few people who like to use Google’s feedback form to report your site.

You can easily get around this by not telling the site owner which site you own; just explain that you want a few links. Give them 3 links to include in the post: your link, and two links to trusted authority sites that are not direct competition.

[Ed: You might also consider buying links “for” the competition and ask to have yours included as well to keep things under the radar.]

3. Time and labor. This is a labor-intense method. You have to suggest topics for a post that your link will fit into. For many bloggers/webmasters, they don’t know what they’re going to post about so you might have to coach them on appropriate topics that would match the anchor text for the links you’re buying.

The only way around this is to outsource the labor (which adds a hard cost) or to bypass the individual blogger and go somewhere that accepts most links (like a regional web directory or niche directories).

4. Poor post quality. Knowing that they’re writing the post for the purpose of linking to you could lead the blogger to write a lazy post. A poorly written post won’t earn many links so you’ll have to rely on the strength of the main domain to provide some juice to your post. This completely depends on who’s writing the post.

A good way around this is to write the post yourself and offer it as free content. You might be able to get around spending money this way. [Editor’s note: Brandon’s implying in this context that you pay the site owner and provide the post. Obviously it’s not a paid link if you just offer a guest article, like Brandon’s doing here.]

Here’s to staying under the radar and on top of the SERPs!

If you liked this post on buying text links, add my rss feed to your reader!

Bring Your Pages This Way, to the Supplemental Index Egress

Author: Gab Goldenberg

Exit From Novi Sad Fortress Tunnel

I’m often asked about Google’s Supplemental Index (SI) and how to get pages out of it. Consider this post your directions to the exit, the light at the end of Google’s dark tunnel, the “this way to the egress,” if you will. (Ok, so I like my hype, but I promise it’ll be worth it.)

If you’re impatient, click here to get the solution to getting your page/pages out of the Supplemental Index and skip the background info. I think you’re better off with a full understanding of the SI, but I’ll leave it up to you.

As Aaron Wall puts it in his excellent SEO Book, search engines are frequently developing new ways to save computing power. When you do things on a large scale and have billions of websites and web pages to work with, as Google does, getting your computers to run more efficiently will save you money. Lots of it. The Supplemental Index is a way for Google to save money by being more efficient with its computing.

To understand how to get your pages out of the Supplemental Index, you need to understand why they’re there to begin with. In other words, you need to know how the SI works. So let me explain.

When somebody performs a search with Google, what happens behind the scenes is something like this.

  1. Google looks at the keywords.
  2. It compares them to the pages in its index (think of a library’s index card system).
  3. Google determines which pages are most relevant and ranks them.
  4. Finally, the results are displayed to the searcher.

It’s incredible, but Google really does do all that in fractions of a second.

(This is a simplified explanation of Google’s processes, based on what Dan Thies wrote at SEO Fast Start. If anyone has the precise page’s link, I’d appreciate a comment pointing it out, as I’ve lost it. Wouldn’t want that page to end up in the Supplemental Index, now would I?)

The second part of the process gets exponentially more expensive for Google every day, as more sites and pages go online. Their computers need to run through more and more content. Google hit upon the idea of using a Supplemental Index as a shortcut in this process. By putting pages of lesser importance in the SI, Google has fewer pages to assess and rank at step two. (Then, only if Google feels its main index has too few pages to allow it to do a good job, will Google resort to the Supplemental Index. It supplements their main index when this one is thin for a particular topic.)
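The shortcut described above can be sketched as a two-tier lookup: query the main index first, and dip into the supplemental index only when the main index is thin for that term. This is a toy model under my own assumptions, not Google’s actual implementation:

```python
# Toy two-tier index: consult the supplemental index only when the
# main index is thin for a query term. Purely illustrative data.
MAIN_INDEX = {
    "seo": ["example.com/seo-guide", "example.org/seo-tips"],
}
SUPPLEMENTAL_INDEX = {
    "seo": ["example.net/old-seo-page"],
    "obscure topic": ["example.net/obscure-page"],
}

def search(term: str, min_results: int = 2) -> list[str]:
    results = MAIN_INDEX.get(term, [])
    if len(results) < min_results:
        # Main index is thin: supplement it with second-tier pages.
        results = results + SUPPLEMENTAL_INDEX.get(term, [])
    return results

print(search("seo"))            # main index alone suffices
print(search("obscure topic"))  # falls back to the supplemental index
```

The point of the second tier is that the expensive ranking step only ever sees the supplemental pages when the main index can’t fill the results on its own.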

Amazon's Supplemental Index Results

Amazon also had Supplemental Index issues at one point, as Tamar Weinberg pointed out in this picture. It’s nothing to feel bad about. (Note that Google no longer labels pages in the supplemental index as such.)

So how does Google determine if a web site or web page is of lesser importance? The behemoth of search considers how likely a web surfer is to arrive at that page if they randomly click links in a never-ending browsing session. The more likely someone is to visit your page through links on other websites and on your own, the more important your page. This idea is the foundation of Google’s ranking, better known as PageRank. See Danny Sullivan’s article on PageRank (he seems very enthusiastic about ranking for the keyword “PageRank,” based on his linking practices at Search Engine Land) for more info.
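The random-surfer idea can be made concrete with a few lines of power iteration – a toy sketch of the classic PageRank computation over a made-up three-page link graph, not Google’s production algorithm:

```python
# Toy PageRank via power iteration over a tiny, made-up link graph.
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             iterations: int = 50) -> dict[str, float]:
    """Probability that the 'random surfer' lands on each page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # Each page splits its vote evenly among its outlinks.
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A is linked from both B and C, so it should rank highest.
ranks = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
print(max(ranks, key=ranks.get))  # A
```

The damping factor models the surfer occasionally abandoning the click-trail and jumping to a random page, which is what keeps the “never-ending browsing session” from getting stuck.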

The solution to getting your web page (or web pages, as the case may be) out of the Supplemental Index is to get more links to them. These can be links from within your own website, but it’s usually better – particularly with newer, less established sites – to get links from other websites.

If your site is non-commercial, it generally shouldn’t be difficult to get other sites to link to your inner pages, increase your PageRank and help you out of the Supplemental Index. But if your site is commercial, people are generally less willing to link to your pages. For example, the average webmaster (who isn’t an affiliate) has no incentive to link to a product page.

(As an aside, it would make sense for Google to count affiliate links towards determining whether a page belongs in the SI or the main index – this despite the fact that they might otherwise be considered paid, PageRank-manipulating links, which Google despises.)

So what do you do if you’re a merchant? (I owe this upcoming generous tip to Lorisa, who runs a Montessori school materials site, featuring CD-Roms, teacher tools and more.)

Ironically, you create more pages!

Just not on your own website.

The pages you’ll create will be what’s commonly referred to as “User-Generated Content.” For example, the photo-sharing site Flickr lets people upload pictures that then get an individual page (the picture at the top of this post comes from this Flickr user, as a matter of fact). Flickr’s users generate their content. And they link back to their own sites, or other pages they like, from their Flickr profiles, photo pages and so on.

A particular type of user-generated content site – which I detailed in my article The New Directory – is likely more useful for getting these links than other types of site. This is because the quality of the user-generated content is controlled by member ratings and reviews. While I again encourage you to read the article and learn more, I’m happy to provide you with the easy solution: a list of sites accepting user-generated content where you can get links to your site’s pages.

  • Squidoo is a perfect example of The New Directory.
  • Gooruze is similar but focused on the internet marketing niche.
  • SEOmoz is like a second home to me. I’d be crazy not to point out their great user-generated blog Youmoz.
  • Still in marketing, the search marketing social media site Sphinn gives you a nice profile page – do see my profile and add me as a friend.
  • eBay gives users About Me pages.
  • Myspace is a classic and should need no explanation.
  • Ezine Articles, for what it’s worth, lets you have a profile page. I’m not sure Google cares much about Ezine Articles, though, because contrary to the New Directory, their quality control is nil.
  • Dofollow blogs allow you to leave comments where the author’s name links to a page of his/her choosing. These links are counted by Google, as opposed to most blogs, which use nofollow (I’ll implement dofollow once I choose the right plugin). Courtney Tuttle has a good list of Dofollow blogs.
  • WordPress lets users create blogs of their own on their site. (I also recommend using the WordPress software if you want your own full-featured blog with plugins such as dofollow. This blog runs on it.)
  • PR offers free press release distribution. I’d link to PR Web too, but they don’t offer a free service, so no link love. (Told you being commercial was tough!)
  • Search for “your product” and “social networking” on Google. Try related searches like “your industry” + “social networking,” “your industry” + “social media,” “your industry” + “forum(s),” “your product” + “file-sharing,” “your product” + “article distribution.” The sites you find will let you create a profile and links.
  • At the end of the day, networking is still crucial, as I told Marketing Sherpa (PDF) (see tip #27). If you can get bloggers and journalists writing about you, that’s going to mean better links, traffic and copycat links from bloggers who repeat what they read in the daily paper.
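The search-pattern tip a few bullets up can be generated exhaustively rather than typed by hand. A small sketch with placeholder terms (“widgets” and friends stand in for your own product and industry):

```python
# Generate the suggested Google searches from your terms and the
# community-site patterns above. "widgets" etc. are placeholders.
from itertools import product

def build_queries(your_terms: list[str], patterns: list[str]) -> list[str]:
    """One quoted query per (term, pattern) combination."""
    return [f'"{t}" "{p}"' for t, p in product(your_terms, patterns)]

queries = build_queries(
    ["widgets", "widget industry"],
    ["social networking", "social media", "forum"],
)
for q in queries:
    print(q)
```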

In the world of business, lots of figures are thrown around. There’s a whole school of thought in the world of investing that only looks at a company’s financial statements and decides whether or not to buy its stock based on those numbers. The most important factor of all, imho, is growth.

Flower in Growth
Growth photo by Artful Life

1. Two brothers lived in a city once. The first married rich and got a dowry of $1M. The second worked, and quickly saw his annual income grow by $100,000 each year. The former gentleman, not having much idea of the difficulty of building up an income, began spending the fortune without any income of his own.

Two other brothers lived in another city. The older inherited the family business, which took in $100,000 annually. His brother, envious, built up his own competing business and scaled so that his enterprise grew at a rate of $100,000 annually.

Two more brothers lived in a third city. Each built up a business. The older brother’s grew at a rate of $1M annually, and the younger’s grew at a rate of $2M annually.

Which brother was better off, in either situation?

In the first situation, the older brother ends up being in debt by $100K more each year, starting in year 11.

In the second situation, the younger brother’s business becomes increasingly successful while his brother stagnates. Eventually, the younger brother’s business begins chipping away at the older brother’s market share, due to having a bigger brand, broader selection, cheaper prices … you name it.

In the third situation, despite the older brother’s business showing positive growth, he was not keeping pace with the younger brother’s efforts. In the Fortune Bigger Brothers rankings, the older brother was slipping slowly back in the rankings behind his brother. Eventually, he might even end up in situation 2.
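The first situation’s numbers check out if you assume the heir spends $100K a year (the story doesn’t spell out his spending rate, so that part is my assumption):

```python
# Net worth of the dowry heir, assuming (my assumption) he spends
# $100K/year, versus the worker whose annual income grows $100K/year.
def heir_net_worth(year: int, fortune: int = 1_000_000,
                   spend_per_year: int = 100_000) -> int:
    return fortune - spend_per_year * year

def worker_income(year: int, growth_per_year: int = 100_000) -> int:
    return growth_per_year * year

print(heir_net_worth(10))  # 0: the $1M dowry runs out after year 10
print(heir_net_worth(11))  # -100000: debt grows $100K more each year after
print(worker_income(11))   # 1100000: the worker's income keeps climbing
```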

There are a few takeaways.

  • First, if you are just sitting on your money, you’re actually showing negative growth. In the bigger economic picture, this is known as inflation. In the world of independent webmasters, it’s neglecting a site in the name of a so-called ‘passive income’ that is really just a short-sighted dwindling away of assets – like buying a site and then never flipping it.
  • Second, it’s merely a matter of time before others overtake your brand, top you in the SERPs, and replicate or even steal your link sources. SEO books existed before Aaron Wall wrote his. Frederick Marckini, CEO of iProspect, wrote one of the earliest ones. How many of you are aware of that? How many of you have bought that book, or would buy it today from a used bookseller?
  • Finally, in the general economy, if your revenue isn’t growing the fastest, you face diminishing purchasing power. In turn, this can lead to others overtaking you, as in 2, above. This carries through to your industry.

New Coniferous Growth
Coniferous Growth photo by Fallsguyd.
Growth speed (or velocity for Scrabble players) is a key economic issue. And it’s a key point for an independent webmaster too.

Recently, Aaron and Giovanna Wall launched PPC Blog, featuring such quality content as Why Google AdWords Site Targeting Usually Fails. Their focus has been on the growing distribution (see also Lee’s excellent piece on developing your network), and here’s how you can tell.

1) They paid a premium for the domain name, which allows them to brand the site as PPC Blog, and thus get lots of links featuring that phrase. Between that, and the fact (per Matt Cutts) that Google favors exact match domains, they should easily rank for PPC blog. That phrase will convert visitors into RSS subscribers.

2) The site is unmonetized, meaning that it’s going to attract many more links than if it were. It’s a short term loss (e.g. they could put on AdSense, affiliate ads, or SEO Book banners) of revenue traded off for a boost in links and traffic that will turn into additional distribution. They’re growing readership first and foremost.

3) Obvious, prominent links from SEO Book, aimed at sending readers over.

If you liked this post on the economics of growth and how to launch a site for growth, get my RSS feed.

Froogle 2.0: Google Declares War On Amazon

Author: Gab Goldenberg

by One Good Bumblebee
From the Seattle Times today (read in print version) comes a story headlined: “Google Says It Will Challenge Amazon On Electronic Books.” Loyal readers of mine would have known this was coming 7 months ago. Here are a few choice excerpts from my old post:

“Why Google is likely to come into conflict with the mega-retailers like Amazon and eBay[…]” 

“In turn, the margins for internet retailers will become thinner as Google progressively steals marketshare and people just use Google as their default product search. Call it Froogle 2.0.”

Actually, it appears from the Seattle Times story that I was wrong on the margins point because Google will allow publishers to set their own retail price, while apparently Amazon is already selling e-books at a slight loss. Presumably, Google will take a commission on each sale. But Google is trying to steal market share…

This fits in with what I was saying about Google’s affiliate network and how they are going to become the biggest affiliate around in the next little while.

“The organic search results for obviously commercial searches will be dominated by Google Base products. […] Once this happens, Google becomes the largest affiliate worldwide.”

Replace Google Base with Google Books – whose rapid rise to prominence I noted at SEO Book – and you get today’s announcement [Ed: It actually was in Monday’s paper, but I didn’t have time to edit this post].

This isn’t the first time I’ve been ahead of the mainstream SEO community.

Longtime readers will recall how I picked up on Google’s new submarine crawling 4 months before the announcement that they were querying site search forms.

Others will remember my presentation of how to use trusts to buy sites at SMX Advanced last year – which also came months after I shared the technique on this blog…

And my SMX Advanced presentation was based partly on a year-old post! Yet people loved it…

SEO Steve loves the idea of using a crawler on a sitemap.

So if you want to stay ahead of the pack and have a competitive advantage, you should add my RSS feed to your reader.
SEO Steve loves the idea of using a crawler on a sitemap.So if you want to stay ahead of the pack and have a competitive advantage, you should add my RSS feed to your reader.