Formal writing is really frustrating: it requires you to dress up simple ideas in complete sentences, edit your work for grammar, and spend an unholy amount of time writing what you could express verbally in a few minutes. When you come up with new ideas or discover new stuff as often as I do, that gets old fast.
So I’m hereby inaugurating what I hope will be a regular column here: Scratchpad (scratchpad picture courtesy of one eye fish). I’ll share my latest ideas in a raw, scratchpad-type format, and pay even more attention than usual to your feedback. (The “Post #88” reference was this post’s pre-naming title, which I found quite appropriate for an informal column.)
For this first issue, I’ve got:
- New uses for Google’s Keyword Tool External,
- Mined ideas from Google’s Press Days 06 and 07,
- Revelations of what the PPC arbitrageurs are doing to fool Google’s Quality Score algo,
- How to mess with your competitors’ analytics,
- Proof that Google Analytics is integrated into Google’s search algos (and how it’s buggy), and
- Discussion of the latest in Google Maps.
Google’s “Keyword Tool External”
A list of related keywords is useful: it gives you
- places to earn links from,
- places to buy links from,
- topics to target with your content, so you become an authority beyond your narrow niche,
- terms to mix into your site to appear natural, and
- vocabulary for writing a landing page that will get a better Quality Score.
Where does this data come from? Co-occurrence data in Google’s index, probably. It’s also possible that some of it is derived from personalization, i.e. tracking individuals’ searches. That latter possibility would explain how travel-oriented keywords I was checking into for my dad returned “additional keywords you might like to consider” that were about entirely different countries!
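To make the co-occurrence theory concrete, here’s a toy sketch of how a related-keywords suggester could work off nothing but co-occurrence counts. The function, corpus, and seed term are all my own invention for illustration; Google’s actual pipeline is obviously far more involved.

```python
from collections import Counter

def related_keywords(documents, seed, top_n=3):
    """Suggest related keywords for `seed` by counting which other
    terms appear in the same documents as it (simple co-occurrence)."""
    cooc = Counter()
    for doc in documents:
        terms = set(doc.lower().split())
        if seed in terms:
            # Every other term in this document co-occurs with the seed once.
            cooc.update(terms - {seed})
    return [term for term, _ in cooc.most_common(top_n)]

docs = [
    "cheap montreal hotels downtown",
    "montreal hotels near old port",
    "toronto flights and hotels",
]
print(related_keywords(docs, "montreal"))  # "hotels" ranks first: it co-occurs twice
```

Notice that nothing here requires the suggestions to be grammatical variants of the seed; they just have to show up in the same documents, which matches the kind of output the Keyword Tool gives.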
Google Press Days 2006 and 2007
Nugget from Matt Cutts: Personalization is oriented towards queries where the intent is in doubt.
“Q: How does Google use personal information to make search better while respecting privacy?
A: Terms like jaguar or bass are overloaded (fishing vs. guitars). Starting from scratch on every query would be a disadvantage. Personalized search (which is opt-in) is important and there’s still room to improve there.
“Sukhinder: Mobile in China is 350M users while the internet is more like 100M users, so it’s an interesting, different market.” ED: So Google’s Chinese mobile engine is going to be the focus?
“Ooh, now he’s talking about eval and how it’s difficult to impossible to make a change that is 100% better on 100% of queries. And that it’s hard to do eval really well. He mentions that we take quality wins in our algorithms when we see the chance. He also touches on how hard international queries are to do well (segmentation in Chinese, for example).
“Marissa is talking about the fundamentals of search: comprehensiveness, relevance, speed, and user experience.
Fast-forward to Google Press Day 2007
“Up to today we have relied on automation, but I believe the future will be a blend of both, combining the scale of automation and human intelligence.” Marissa Mayer – “She was one of the first 20 employees at Google and its first female engineer; she’s now vice president of search products and user experience.”
Ties into what I was saying about Google being broken. Now it’s algorithms and humans, eh? Sounds suspiciously like Yahoo… (*cough Rand: if Google’s copying Yahoo, what does that tell you? cough*)
“She also said Google is already experimenting with exploring data patterns in searches, so it can try and detect what a user meant to search for, even if they didn’t type it. The beginnings of what can be seen in suggested searches, when you misspell a search request, for example.” Probably why the additional keywords to consider returns things that are not grammatically related (i.e. conjugations, plurals, variations, misspellings).
So my question is: If a human-edited search engine manages to cover the short-tail, and spammers clog up the long-tail, where will that leave Google? “Oh, we’re the four-keyword-query search engine?” I can’t see that happening…
PPC Arbitrage: The Latest and Greatest in Content Scraping for AdWords Quality Score
PPC arbitrageurs are using content scraped from public domain books to improve their Quality Score and drop their CPCs, which lets them avoid auto-generated content. However, there are duplication issues. See TravellersDream.org, for instance: search for the name of any large Canadian city and they’re bidding on it with an ad generally relating to hotels. The content on the landing page is scraped from some Canadian travel guide. And if they can scrape it, so can…
- travel.yahoo.com/p-travelguide-191501843-montreal_vacations-i – LOOKEE! LOOKEE! (Ok, there’s an unlinked attribution at the bottom of the “more” page (i.e. the one you get to by clicking read more) … big deal.)
- www.vcu.edu/sitar/sitarnewsletter2-01C.PDF – I thought educational institutions were against plagiarism?
- www.alamo.ca/getMainContent.do?pageKey=car-rental-in-montreal – Well aren’t we in good company, huh?
What’s a lazy arbitrageur to do? Well, writing your own content or buying some original stuff might be a good first step. Having it integrate relevant keywords, as per nugget 1 above, would be a good second step. Or – and this is extremely drastic, so only consider it as a last resort after every other possibility has been exhausted – provide real value. Scary, I know.
Mess With Your Competitors’ Analytics
One thing I’ve been doing to play with my competitors’ analytics is hiding my long-tail searches from them. When one of their pages pops up in a SERP, I copy-paste the URL from Google and slap it into the address bar. This disguises my visit as direct navigation, as from a bookmark, when in fact it’s the result of a search.
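The mechanics behind this trick: analytics packages bucket a visit by its HTTP Referer header, and typing or pasting a URL into the address bar sends no referrer at all. Here’s a toy classifier of my own (a simplification, not any particular analytics package’s actual logic) showing why the pasted visit lands in the “direct” bucket:

```python
def classify_visit(referrer):
    """Simplified traffic-source bucketing, loosely modeled on how
    analytics tools read the HTTP Referer header."""
    if not referrer:
        return "direct"  # typed, pasted, or bookmarked: no referrer is sent
    if "google." in referrer and "q=" in referrer:
        return "organic search"  # referrer carries the search query
    return "referral"

# Clicking a result in a Google SERP sends a referrer with the query...
print(classify_visit("http://www.google.com/search?q=long+tail+keyword"))
# ...but pasting the same destination URL into the address bar sends none.
print(classify_visit(""))
```

So the competitor not only loses the “search” classification, they also never see which query brought you in.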
Another use for this relates to PPC. If a competitor is advertising on broad match keywords, and that has them advertising on your brand name, don’t tip them off by clicking the ad! If you do, they’ll target your brand name specifically (see Future Now’s ad on SEO ROI). Instead, find their landing page by searching for the text of their ad. Or try copy-pasting the destination URL, though that’s less likely to work because most advertisers just use the URL part of the ad to add an extra keyword (keyword.domain.com or domain.com/keyword) for greater “relevance”.
On a related note, if you want to make it look to competitors like an organic keyword offers a poor ROI, you can do the opposite: send bots to your competitor through a particular search query. When the bots don’t convert, the competitor will figure that the keyword isn’t worth competing for. Just make sure you don’t do this for a long-tail query they weren’t actively targeting to begin with, or you might draw their attention to it, especially if they figure out what’s going on by checking the originating IPs of the “traffic”.
My Proof That Google Analytics Data Is Used in Google’s Algorithms!
A little while ago, I read that you could use Google Analytics to track the site searches your visitors perform, and I implemented the tip. The result: my site search results pages are being indexed by Google and returned in the SERPs! I noticed this recently while searching Google for something I wrote about Facebook. I got a page from my site (which wasn’t what I wanted) and a page that looked like seoroi.com/index.php?s=mart
Authority spammers with thousands of indexed posts and articles everywhere are rejoicing as they code site search bots that will look through their sites for lucrative keywords. Try site:seoroi.com to see what I’m talking about. Currently, I’m seeing results like “Supposed | SEO ROI Services” and “Pagerank | SEO ROI Services” being returned, with URLs seoroi.com/index.php?s=supposed and seoroi.com/index.php?s=pagerank.
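If, unlike the spammers, you’d rather keep your own site search pages out of the index, the minimal defensive fix is blocking the search-URL pattern in robots.txt. This sketch assumes the default WordPress ?s= parameter, as in the seoroi.com URLs above; adjust the pattern to whatever your site search actually uses:

```
User-agent: *
Disallow: /index.php?s=
Disallow: /?s=
```

That won’t un-index pages already in the SERPs overnight, but it stops crawlers from following fresh site-search URLs.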
Discussion on Google Maps
I recently broke the news that Google Maps’ SERPs were being updated. Google has been integrating Earth Booker (a hotel booking engine that uses Google Earth) into a huge number of its Google Maps SERPs, as well as Google’s own Panoramio picture site and showing reviews directly in the SERPs, as opposed to requiring an extra click to see them. The post also shared a list of places where Google Maps gets its reviews from.
Now, I’ve found some new integration of Google Maps into Google universal SERPs, with up to 10 maps results shown (as opposed to just three, which had been the case earlier). Please share your thoughts in the Sphinn discussion on Google’s latest Maps experiments – I’ll cite the best and most intelligent comments when I speak at SMX.
Come back tomorrow for “Be Nice to the Loner Kid Lest He Destroy Your Reputation” – a true, personal story with ties to reputation management, social media, and most importantly: motivation and influence. I guarantee that you’re going to love it!