Google’s Matt Cutts has put out a new Webmaster Help video. This one is particularly interesting and nearly 8 minutes long – much longer than the norm. It goes fairly in depth about how Google crawls content and attempts to rank it based on relevancy. PageRank, you’ll find, is still the key ingredient.
He starts off by talking about how far Google has come in terms of crawling. When Cutts started at Google, they were only crawling every three or four months.
“We basically take page rank as the primary determinant,” says Cutts. “And the more page rank you have– that is, the more people who link to you and the more reputable those people are– the more likely it is we’re going to discover your page relatively early in the crawl. In fact, you could imagine crawling in strict page rank order, and you’d get the CNNs of the world and The New York Times of the world and really very high page rank sites. And if you think about how things used to be, we used to crawl for 30 days. So we’d crawl for several weeks. And then we would index for about a week. And then we would push that data out. And that would take about a week.”
He continues on with the history lesson, talking about the Google Dance, Update Fritz and things, and eventually gets to the present.
“So at this point, we can get very, very fresh,” he says. “Any time we see updates, we can usually find them very quickly. And in the old days, you would have not just a main or a base index, but you could have what were called supplemental results, or the supplemental index. And that was something that we wouldn’t crawl and refresh quite as often. But it was a lot more documents. And so you could almost imagine having really fresh content, a layer of our main index, and then more documents that are not refreshed quite as often, but there’s a lot more of them.”
Google continues to emphasize freshness, as we’ve seen in the company’s monthly lists of algorithm changes the last several months.
“What you do then is you pass things around,” Cutts continues. “And you basically say, OK, I have crawled a large fraction of the web. And within that web you have, for example, one document. And indexing is basically taking things in word order. Well, let’s just work through an example. Suppose you say Katy Perry. In a document, Katy Perry appears right next to each other. But what you want in an index is which documents does the word Katy appear in, and which documents does the word Perry appear in? So you might say Katy appears in documents 1, and 2, and 89, and 555, and 789. And Perry might appear in documents number 2, and 8, and 73, and 555, and 1,000. And so the whole process of doing the index is reversing, so that instead of having the documents in word order, you have the words, and they have it in document order. So it’s, OK, these are all the documents that a word appears in.”
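The “reversing” Cutts describes is the classic inverted index: instead of storing each document’s words, store each word’s documents. A minimal sketch in Python (purely illustrative; Google’s actual indexing pipeline is vastly more involved):

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each word to the sorted list of document IDs that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return {word: sorted(ids) for word, ids in index.items()}

# Toy corpus echoing the document numbers from the quote
docs = {
    1: "katy",
    2: "katy perry",
    8: "perry",
    555: "katy perry",
}
index = build_inverted_index(docs)
print(index["katy"])   # [1, 2, 555]
print(index["perry"])  # [2, 8, 555]
```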
“Now when someone comes to Google and they type in Katy Perry, you want to say, OK, what documents might match Katy Perry?” he continues. “Well, document one has Katy, but it doesn’t have Perry. So it’s out. Document number two has both Katy and Perry, so that’s a possibility. Document eight has Perry but not Katy. 89 and 73 are out because they don’t have the right combination of words. 555 has both Katy and Perry. And then these two are also out. And so when someone comes to Google and they type in Chicken Little, Britney Spears, Matt Cutts, Katy Perry, whatever it is, we find the documents that we believe have those words, either on the page or maybe in back links, in anchor text pointing to that document.”
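The elimination process Cutts walks through is document selection: intersect the posting lists of every query term, and only documents appearing in all of them survive. A hedged sketch, reusing the document numbers from his walkthrough:

```python
def select_documents(index, query):
    """Return the doc IDs that contain every query term (may be empty)."""
    postings = [set(index.get(word, ())) for word in query.lower().split()]
    if not postings:
        return []
    return sorted(set.intersection(*postings))

# Posting lists exactly as given in the quote
index = {
    "katy":  [1, 2, 89, 555, 789],
    "perry": [2, 8, 73, 555, 1000],
}
print(select_documents(index, "Katy Perry"))  # [2, 555] -- matches the walkthrough
```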
“Once you’ve done what’s called document selection, you try to figure out, how should you rank those?” he explains. “And that’s really tricky. We use page rank as well as over 200 other factors in our rankings to try to say, OK, maybe this document is really authoritative. It has a lot of reputation because it has a lot of page rank. But it only has the word Perry once. And it just happens to have the word Katy somewhere else on the page. Whereas here is a document that has the word Katy and Perry right next to each other, so there’s proximity. And it’s got a lot of reputation. It’s got a lot of links pointing to it.”
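Cutts is describing a blend of authority (PageRank) and topicality signals like term proximity. A deliberately simplified scoring sketch follows; the weights and the two-signal formula here are invented for illustration, and Google’s real function combines 200+ signals in ways it has never disclosed:

```python
def score(doc, query_terms):
    """Toy ranking: authority score plus a term-proximity bonus. Illustrative only."""
    words = doc["text"].lower().split()
    positions = {t: [i for i, w in enumerate(words) if w == t] for t in query_terms}
    if any(not p for p in positions.values()):
        return 0.0  # document selection already failed: a term is missing
    # Proximity bonus: smaller gaps between query-term occurrences score higher
    all_pos = sorted(i for ps in positions.values() for i in ps)
    min_gap = min(b - a for a, b in zip(all_pos, all_pos[1:])) if len(all_pos) > 1 else 1
    return doc["pagerank"] + 1.0 / min_gap

docs = [
    {"text": "perry mentioned once and katy somewhere else entirely on page", "pagerank": 0.9},
    {"text": "katy perry tour dates and tickets", "pagerank": 0.6},
]
terms = ["katy", "perry"]
ranked = sorted(docs, key=lambda d: score(d, terms), reverse=True)
# The adjacent "katy perry" page outranks the higher-authority page where
# the terms are scattered -- the trade-off Cutts describes.
```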
He doesn’t really talk about Search Plus Your World, which clearly has a great deal of influence on how users see content these days. And while he does talk about freshness, he doesn’t really talk about how that seems to drive rankings either. Freshness is great as far as Google’s ability to crawl quickly, but sometimes it feels like freshness is getting a little too much weight in Google. Sometimes the more relevant content is older, and I’ve seen plenty of SERPs that lean toward freshness, making it particularly hard to find specific things I’m looking for. What do you think?
“You want to find reputable documents that are also about what the user typed in,” continues Cutts in the video. “And that’s kind of the secret sauce, trying to figure out a way to combine those 200 different ranking signals in order to find the most relevant document. So at any given time, hundreds of millions of times a day, someone comes to Google. We try to find the closest data center to them.”
“They type in something like Katy Perry,” he says. “We send that query out to hundreds of different machines all at once, which look through their little tiny fraction of the web that we’ve indexed. And we find, OK, these are the documents that we think best match. All those machines return their matches. And we say, OK, what’s the creme de la creme? What’s the needle in the haystack? What’s the best page that matches this query across our entire index? And then we take that page and we try to show it with a useful snippet. So you show the key words in the context of the document. And you get it all back in under half a second.”
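The “hundreds of machines, each holding a tiny fraction of the index” pattern Cutts describes is scatter-gather over index shards: fan the query out, let each shard return its local best matches, then merge. A hedged sketch of just the control flow (real systems add timeouts, replication, caching, and far richer scoring):

```python
from concurrent.futures import ThreadPoolExecutor

def search_shard(shard, query, k=3):
    """Each shard returns its own top-k (doc_id, score) matches."""
    terms = query.lower().split()
    hits = [(doc_id, score) for doc_id, (text, score) in shard.items()
            if all(t in text.lower().split() for t in terms)]
    return sorted(hits, key=lambda h: h[1], reverse=True)[:k]

def search(shards, query, k=3):
    """Scatter the query to every shard in parallel, then gather and re-merge."""
    with ThreadPoolExecutor() as pool:
        partials = pool.map(lambda s: search_shard(s, query, k), shards)
    merged = [hit for part in partials for hit in part]
    return sorted(merged, key=lambda h: h[1], reverse=True)[:k]

# Two toy shards, each covering a different slice of the corpus
shards = [
    {101: ("katy perry concert", 0.9), 102: ("perry the platypus", 0.5)},
    {201: ("katy perry album", 0.8), 202: ("gardening tips", 0.4)},
]
print(search(shards, "Katy Perry"))  # [(101, 0.9), (201, 0.8)]
```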
As Cutts notes in the intro to the video, he could talk for hours about all of this stuff. I’m sure you didn’t expect him to reveal Google’s 200 signals in the video, but it does provide some interesting commentary from the inside on how Google is approaching ranking, even if it omits those signals as a whole.
Read more: http://tinyurl.com/cheyt3e
Google calculates Quality Score every time a search is performed for one of your keywords. Google says the score can affect your ad auction eligibility, your keyword’s cost-per-click, your keyword’s first page bid estimate, your keyword’s top of page bid estimate and your ad position.
Google’s Tanmay Arora posted a big explanation of Google’s “Quality Score sauce” in the AdWords Community forum, offering a bit more perspective (hat tip to Barry Schwartz).
“First, the relevance of a keyword is not entirely determined by its presence on the landing page or the number of times it’s been mentioned on the landing page,” says Arora. “It’s not about how appropriate we find the keyword to the product/landing page but how appropriate the users find it. In other words, the number of users clicking on your ad when they search for that keyword.”
“Second, when we add fresh keywords, initially, they’re awarded a historical Quality Score based on their previous performance on Google.com,” says Arora. “And only once the keyword starts accruing statistics, the system then evaluates its Quality Score based on its recent performance. This doesn’t happen dynamically but is a gradual process.”
Arora talks about one more key ingredient: “We take into account the exact match CTR of the keyword, as it’s a better indicator of the effectiveness of the keyword. (The exact match CTR refers to the number of times the keyword has triggered an ad when the search term exactly matched the keyword.) For example, if our keyword ‘red shoes’ is in broad match, it triggers our ad even for search terms like ‘red shoe’, ‘formal shoes’, ‘horse shoe,’ etc. However, the exact match statistics point out exactly when the keyword ‘red shoes’ triggered our ad and was clicked on by the user when he searched for the exact search term ‘red shoes’.”
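Arora’s exact-match CTR amounts to filtering the query log down to searches where the raw search term equals the keyword, then computing clicks over impressions on just that subset. A hypothetical sketch (the log structure and field names here are invented for illustration):

```python
def exact_match_ctr(keyword, log):
    """CTR counting only impressions where the search term exactly matched the keyword."""
    exact = [e for e in log if e["search_term"] == keyword]
    impressions = len(exact)
    clicks = sum(e["clicked"] for e in exact)
    return clicks / impressions if impressions else 0.0

# Toy query log for a broad-match keyword "red shoes"
log = [
    {"search_term": "red shoes",  "clicked": 1},  # exact match, clicked
    {"search_term": "red shoe",   "clicked": 0},  # broad match only -- excluded
    {"search_term": "red shoes",  "clicked": 0},  # exact match, not clicked
    {"search_term": "horse shoe", "clicked": 1},  # broad match only -- excluded
]
print(exact_match_ctr("red shoes", log))  # 0.5
```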
There’s plenty more to be said…
“Quality Score is an estimate of how relevant your ads, keywords, and landing page are to a person seeing your ad,” Google explains in its AdWords help center. “Having a high Quality Score means that our systems think your ad, keyword, and landing page are all relevant and useful to someone looking at your ad. Having a low Quality Score, on the other hand, means that your ads, keywords, and landing page probably aren’t as relevant and useful to someone looking at your ad.”
“Suppose Sam is looking for a pair of striped socks,” Google says. “And let’s say you own a website that specializes in socks. Wouldn’t it be great if Sam types ‘striped socks’ into Google search, sees your ad about striped socks, clicks your ad, and then lands on your web page where he buys some spiffy new striped socks? In this example, Sam searches and finds exactly what he’s looking for. That’s what we consider a great user experience, and that’s what can earn you a high Quality Score.”
Google says it calculates quality score by looking at your keyword’s past clickthrough rate, your display URL’s past clickthrough rate, your account history (the overall CTR of all ads and keywords in your account), the quality of your landing page, your keyword/ad relevance, geographic performance and your ad’s performance on a site.
Google Chief Economist Hal Varian gives a good explanation of quality score in this video from 2 years ago:
In another help center article, Google discusses how to improve your ad quality by creating “very specific” ad groups, choosing your keywords carefully, including keywords in your ad text, creating simple, “enticing” ads, using strong calls-to-action, testing multiple ads, and regularly reviewing campaign performance.
Article published by http://tinyurl.com/7uy44mm
Last week, millions of Americans flocked to buy a Mega Millions lottery ticket or ten, each hoping for their own personal miracle. This strategy worked for all of three people.
Yesterday, in their annual homage to the April 1st prankster’s holiday, Google announced the miracle of a new ad technology, the AdWords Click-to-Teleport Ad Extensions which enables searchers to transport themselves across time and space and “shortens the online-offline conversion funnel.” They pretended this was an April Fool’s joke, but I know what Google is capable of, and so I’ve put a call into my rep to get in on the beta.
Today, tomorrow, and every working day that follows (a.k.a. every day), PPC managers struggle to make miracles happen inside their PPC campaigns: to find the perfect keyword, to write the perfect ad, or to implement a new ad targeting feature that beats all previous campaign performance records.
In reality, of course, the ‘miracles’ we wring out of mature campaigns day by day are much more mundane. The gains we make are usually more of the “three yards and a cloud of dust” variety, born of hard work, cleverness and meticulous attention to details.
So, in the true spirit of praying for miracles while we rely on more probable and practical solutions, we’ll take a look at a few tips that can help almost any campaign gain yardage this week.
Yes, Use Ad Sitelinks
Implementing Sitelinks ad extensions is as close as you will find to a sure thing, a minor miracle of paid search campaign optimization. They are easy to implement and will almost always improve your click-through rates.
Take a look at the very clever way that Scott’s uses Sitelinks. First, notice how they use the links in ads for their brand term, Miracle Gro®. These ads add much more depth and breadth to the basic ad text, which makes the ad more interesting to more people.
Notice, too, how much real estate this one ad covers at the top of the SERP. How could their click-through rate not be better on these ads?
Sitelinks work very well for non-brand terms, too. Take a look at how Scott’s uses different types of sitelinks for the non-branded search query “best grass seed”:
In this case, Scott’s includes clear and unambiguous links to landing pages with Tips, Videos & Articles, which is what information hungry consumers are probably looking for.
In the regular part of their text ad, Scott’s effectively promotes their brand authority and brand recall by using the word Scott’s four times in the ad.
Sitelinks can perform minor miracles on your CTRs, but they can still suffer from a surprising case of under-engineering which we hope some upcoming AdWords releases will address (speaking of minor miracles).
While Sitelinks can be a powerful CTR stimulant, there’s no way to know exactly what is working and why, since Sitelinks are disconnected from most of the keyword and ad-level tracking and reporting you’ve come to rely on. Another glaring Sitelinks need is availability at the ad group level, rather than the campaign level where they are now controlled.
Add Punctuation, Fix Capitalization
Here’s a simple and proven miracle cure for your existing ads on any network. Take a good look at how your ads actually look online and fix them! You will be amazed at how many opportunities you’ve got sitting right in front of you.
For example, since last February, Google has been presenting longer ad headlines for ads in the top positions by either inserting your display URL or your first line of ad copy. Not as many advertisers as you would expect are taking advantage of this feature, even though this can dramatically alter the CTR performance of the ads.
One simple change to your ad text, adding a period at the end of ad description line 1, gives you control over how your ad actually displays in the top spots.
When Google detects a period or exclamation mark at the end of your first description line, they will insert that first description line into the headline of your ad. In some cases, Google will insert the first description line even without punctuation, but the only way to ensure you have control is to include the proper punctuation yourself.
To see what your ads look like in top positions, look at the Edit Ads screen within the AdWords online interface. This feature is currently only available in the online AdWords system, not in AdWords Editor.
Here’s how it works. Without proper punctuation, your ad will probably have a standard headline when it appears in the top positions, as shown below:
However, if you simply add a period (or exclamation mark, though that won’t show in the ad) at the end of the first copy line, your ad will display a longer headline, which will stand out more and probably grab more clicks.
Once you’ve fixed the punctuation issue, you may see some additional optimization tests to run.
For example, you may consider swapping description line 2 with description line 1 to give a free shipping offer more prominence.
This is an additional ad copy test that you wouldn’t have even imagined unless you first took notice of the differences between top and side position ad displays. Best of all, you don’t have to write any new copy – just make the copy you have work more effectively with a simple bit of punctuation.
Update Your Microsoft adCenter Campaigns
Many advertisers who ported their campaigns over to adCenter a few years ago directly from AdWords had to deal with trade-offs in handling ad copy, negative keywords and keyword match type differences when they first launched their adCenter campaigns.
Since that time, however, Microsoft has made many significant changes to the way that ads and keywords fundamentally operate. If you have not been keeping up with these changes, then taking time to revisit your adCenter campaigns now is likely to have a very positive, and possibly miraculous, impact on your adCenter campaign performance.
For example, adCenter now allows 71 characters in text ads, which makes it possible to import AdWords ads that use all 70 characters without revision.
Microsoft has also modified the way it handles negative keywords and other match types, which makes it possible to use more advanced matching logic structures inside your campaigns. They have also announced that they are working on a broad match modifier implementation that will allow your broad match modifier keywords to be ported in directly from AdWords.
Microsoft has been rolling out adCenter enhancements at an impressive and accelerating pace, and now is the time to take a fresh look at your adCenter ‘classic’ campaigns. We’ll cover a more comprehensive analysis of what’s changed in adCenter in an upcoming column.
Article posted by http://tinyurl.com/7v7wlw8
Google Research has released a new study looking at how often ad impressions are accompanied by associated organic results and how the incrementality of ad clicks vary with the rank of those results.
Google posts the following highlights on its Inside AdWords blog:
- 81% of ad impressions and 66% of ad clicks occur in the absence of an associated organic result on the first page of search results. All ad clicks in these situations are incremental.
- On average, for advertisers who appear in the top rank organic slot, 50% of ad clicks are incremental. This means that half of all ad clicks are not replaced by organic clicks when search ads are paused.
- For advertisers whose organic search results are in the 2nd to 4th position, 81% of ad clicks are incremental. For advertisers appearing in organic position of 5 or lower, 96% of ad clicks are incremental.
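Read the other way around, the study’s incremental-click percentages estimate how many clicks would simply vanish, rather than shift to organic, if ads were paused. A back-of-the-envelope sketch that just restates the figures above:

```python
def clicks_lost_if_paused(ad_clicks, pct_incremental):
    """Estimate clicks that organic listings would NOT recover if ads were paused."""
    return ad_clicks * pct_incremental // 100

# Per the study's averages, out of every 1,000 ad clicks:
print(clicks_lost_if_paused(1000, 50))  # 500 lost with a #1 organic ranking
print(clicks_lost_if_paused(1000, 81))  # 810 lost with organic rank 2-4
print(clicks_lost_if_paused(1000, 96))  # 960 lost with organic rank 5 or lower
```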
Google is careful to note in the study’s concluding remarks that results will vary for individual advertisers, and that advertisers with similar IAC (Incremental Ad Clicks) estimates may have very different organic rank for the terms in their ad campaign. Advertisers with similar organic rank may have very different IAC estimates.
Last week, I had the privilege of speaking at the inaugural World Information Architecture Day (WIAD) in Ann Arbor, Michigan on the topic of information architecture and search engine optimization (SEO).
Normally, I teach SEO professionals about information architecture: what it is and is not, how to determine the best IA for websites, and so forth. At this event, it was the other way around. I was educating, or perhaps re-educating, information architects about SEO.
Teaching SEO can be frustrating because one must deal with negative stereotypes (“snake-oil charlatans”) and erroneous, preconceived notions about SEO.
How many times are we faced with a prospect who thinks SEO is about sprinkling magic fairy dust on a website so that it ranks #1 in Google all of the time for every targeted keyword phrase?
Oh, apparently we have the magical ability to make this happen…last week.
To be perfectly honest, I often prefer to work with people who are completely ignorant about SEO so I don’t have to deal with the stereotypes, preconceived notions, and Google gullibility.
Nevertheless, I have to acknowledge that the stereotypes, SEO myths, and gullibility exist. Acknowledging and challenging the negative stereotype is par for the course.
I have said it before. And I will keep repeating it until the world grasps this fundamental SEO concept: SEO is optimizing a website for people who use search engines.
Like the term “website usability,” the term “search engine optimization” is easily misunderstood. People honestly make statements such as, “I am the user,” and “Optimize for the average searcher,” and “People use my website all of the time; therefore, it is user friendly.”
Usability is about task completion and involves the following items:
- Error prevention
- User satisfaction
It is easy for people to believe that search engine optimization is optimizing a website for search engines only. Too easy, I think.
In reality, SEO has always been about searchers and search engines. Ignoring one at the expense of the other is a mistake…a big mistake.
So how do we make people aware of what the SEO process really is? I posed this question to one of my clients. Here are his two cents:
“Even though staff learned about SEO responsibilities that were not directly a part of their jobs, at least they have an awareness about how their contributions can positively or negatively affect SEO. That awareness is invaluable.”
I believe his comments show great insight. Don’t expect everyone to know how to do SEO after a short presentation. Don’t expect everyone to instantly become an SEO expert after a few hours in a certification course. Expertise comes from knowledge and experience.
Nevertheless, I think it is reasonable to expect a fundamental awareness of SEO, knowing that SEO involves meeting the needs of both searchers and search engines. And also knowing that SEO is not the process of sprinkling magical pixie dust on a website.
I expect that fundamental awareness from anyone working on a website: designers, developers, usability professionals, user experience designers, writers, advertisers, information architects, and so forth.
That awareness is invaluable.
SEO Knowledge & Aptitude
Here is a proverbial tough pill to swallow: not everyone has the aptitude for SEO or different aspects of SEO.
Search engine optimization has a human element as well as a technical element. Some SEO professionals are gifted technical SEOs. This is the group to turn to for assistance in managing duplicate content.
Some SEO professionals are expert copywriters. Some SEO experts are skilled at usability testing and might be the group to turn to if a site has search engine traffic and low conversions. Some SEOs are knowledgeable about how people search. And some SEOs are knowledgeable about why people search.
I wouldn’t ask a search engine optimizer who specializes in copywriting to program redirects. Nor would I expect a developer/programmer to be skilled at information architecture and usability testing.
I expect SEO professionals to have more than awareness. I expect them to have aptitude and knowledge.
If an SEO professional does not have a specific SEO skill needed for a project, I expect that person to reach out to an SEO who does…without feeling threatened. SEO should be a group effort. Everyone is on the same team.
I know. I know…easier said than done. Stereotypes, myths, and misconceptions can be difficult to debunk. So what did I share with the audience of information architects?
Part of an SEO’s job is:
- Labeling website content so that it is easy to find (unique aboutness)
- Organizing website content so that it is easy to find
- Ensuring search engines have access to desired content
- Ensuring search engines don’t have access to undesirable content (or at least limiting access)
- Accommodating searchers’ navigational, informational, and transactional goals
Information architecture decisions can positively and negatively impact SEO on web search engines as well as site search engines. Information architects have a role in SEO. Have the awareness.
Even better? Have the knowledge to hire an SEO professional when one is needed. Have the knowledge and humility to recognize that you might not have the aptitude and talent for optimizing. Understand that SEO knowledge does not necessarily mean SEO aptitude. Understand your role in the optimization process. Be knowledgeable enough to recognize a “snake-oil charlatan.”
Information architecture guru Peter Morville wrote the following in the foreword of When Search Meets Web Usability:
“Shari Thurow is among the few specialists brave enough to jump the gap between search engine optimization and web usability. As a result, she has learned how and where to place stepping stones and build bridges. She can speak the language of link analysis and relevance ranking algorithms, while also understanding user psychology and information seeking behavior.”
Yep, I build bridges. But I cannot make anyone cross a bridge. Awareness is the first step. Take that first step, information architects. You won’t regret it.
Article author Shari Thurow
I’m a fan of SEO. Search engine optimization. Ah, yes. Just writing it makes me happy. It’s what I do. It’s my calling. I can’t think of anything in the past decade that has kept me more enthralled and engaged… besides my wife of course.
In the past few years, my interest in SEO has accelerated even more, as the SEO industry has expanded to include everything from social media and conversion optimization to reputation management and a myriad of specialized tools that seemingly measure every click ever made on the Internet.
To make things even more fun, we have gone from traditional SEO, which specifically looks at organic traffic, to inbound marketing, which involves driving free traffic from any and all possible sources. People can’t help but get mesmerized by ALL THE FREE TRAFFIC!
While I still wake up every day and get super-duper excited about optimizing my clients’ sites/media/businesses, I often find myself reminiscing about the old days of SEO.
For me, the past 10 years will be remembered as the Golden Age of SEO. It really was an amazing era. Basically, everyone was following a simple, straightforward list of SEO best practices.
For most of my e-commerce clients, the 7 Keys to SEO success were:
- Write optimized title tags
- Write optimized meta description tags
- Write optimized H1 tags (page headers)
- Optimize internal navigation (main nav, breadcrumbs, sidebar nav, footer links, etc.)
- Optimize and utilize any content unique to the site (blogs, videos, PDFs, forums, etc.)
- Grow/buy some links (and be smart about it!)
- Over-communicate with your client(s)
Don’t get me wrong: there were a lot more than 7 steps in an SEO campaign.
But from 2000-2009ish, if you followed those 7 steps, you were probably very successful at whitehat SEO for e-commerce websites. It worked like a charm.
I reversed negative year-over-year organic search sales trends for several massive brands. I had numerous clients that experienced 100%+ year-over-year growth in organic search traffic during the holidays. I could pretty much take any e-commerce website to higher levels of organic traffic and revenue. In fact, it was not atypical to see ROI in the area of 50:1.
But, alas, these SEO strategies were not unique to me. These methods worked for a lot of SEOs. Back in those days, everyone could win at SEO. It truly was a golden age. And in my opinion, Google was begging for better SEO for all websites.
After all, better SEO makes things easier for Google, and it helps them find and return more relevant results. Even Google was enjoying the growth of marketing directors pumping more and more money into SEO.
During the Golden Age of SEO, you could go to any number of search conferences and listen to some of the world’s best SEOs publicly brag about whitehat and grayhat methods that worked. Bloggers were also posting about whitehat and grayhat SEO methods that worked.
At the same time as everyone was handing out new methods for free, the team at Google Analytics was really pushing hard to make their product competitive with the most popular third-party tracking software, such as Omniture and Coremetrics (both of whom were also taking tracking and analytics to new heights of customization and granularity). It was a time when successful methods were being shared freely, and it was getting progressively easier to provide extremely detailed reporting to clients.
It was also a great time to be an SEO account manager at a search agency. I could write an SEO audit, work with the client’s in-house development team or third-party design agency, schedule all my recommendations into their IT calendar, and communicate with clients via highly-detailed reports. Clients knew that SEO took time to deliver results, so it was perfectly fine to make small gains each month and keep my clients satisfied and happy.
Over the course of the campaign, all of those small wins would add up to big results! And as I worked with the various teams, I effectively taught the basics of SEO to everyone I worked with. Sometimes people get bored with their jobs, so learning new stuff was fun for them. And I like to teach, so it worked out really well for both parties.
Sure there were some difficult aspects of being an SEO during this time. For example, it was a pain having to fight for resources from the development teams to get my recommendations implemented. Another tough issue was working with clients who did not have accurate organic search data. But all in all, basic SEO strategies worked, new SEO methods were freely shared, and search data was getting more comprehensive and easier to obtain (and it was getting more granular!).
And Then… The SEO Game Changed
In 2009 (or so), I began to notice SEO going underground. Remember those SEO conference bragging sessions I mentioned earlier, where the experts would share their secret strategies? Well, those started turning into Q&A sessions and site reviews. And remember all those bloggers who were writing about whitehat methods that worked for SEO? Well, those wells dried up…and some of them just started going full-on blackhat.
Also, the market was getting more and more saturated with SEOs. Almost all notable e-commerce websites were now doing SEO. At the same time, Google was changing universal search to include local results, news results, shopping results, video results, image results – all of these results were now taking up prime spots in the SERPs.
Then, social media really hit, and the next thing we knew, we were seeing real-time results in the SERPs. It was complete and total convergence of all things Internet.
Today, we are surfing an Internet where nearly everyone is doing SEO. When I look at the Top 30 results for pretty much any keyword, I see 30 websites that are doing SEO with some level of expertise. I see shopping results, news results, social results, image results, and video results that have all been optimized for that keyword. It’s very competitive out there.
Furthermore, companies that design platforms and content management systems have also embraced SEO. Most modern CMS and platform backends have areas dedicated to SEO settings and features, meaning that most new websites come with standard SEO best practices out of the box. This means that a lot of companies who are not actively “doing SEO” now have websites with basic SEO strategies in place.
Also, e-commerce VPs and marketing directors are more knowledgeable about SEO, as many of them have experience with SEO agencies. It’s not out-of-the-ordinary for new clients to bring up rel-canonical tags, rel-author tags, the open graph, and 301 redirect strategies – during our campaign kickoff calls. The times really have changed!
And how about today’s data? On one hand, we have lost some data. Yahoo shut down Site Explorer. Google Analytics is now withholding organic search keyword data from logged-in visitors (see 2011: The Year Google & Bing Took Away From SEOs & Publishers).
But on the other hand, there are so many great resources for SEO-related data: SEOmoz, SEMrush, MajesticSEO, RavenTools, AHREFs, AuthorityLabs…the list goes on and on. Sure, these are paid tools and services, but so what?
Data-based decision making makes the SEO world go ’round. So we’ve got to have data. Those services are very helpful for competitive analysis, link building, checking rankings in the SERPs, site monitoring, etc. – I would have loved to have all of these tools 10 years ago!
On the social media optimization side of things, there are social media analytics and monitoring services popping up left and right.
So here we are in 2012: Everyone is doing SEO. Every market is competitive. Google and Yahoo are taking data away from us while new SEO tools and software are launching every day. And for most SEOs, the role of an SEO has expanded to that of an inbound marketer. It’s not enough to only know about strategies that drive traffic from search engines.
Nowadays, if you want to compete as an SEO expert, you need to stay up-to-date with local SEO, mobile SEO, social SEO, and anything new on the horizon.
Further, the economy of the past 4-5 years has changed the demands of clients. Showing small wins every month is not enough. Cash-strapped clients are cutting marketing budgets, and many of them are looking at SEO as a way to get the best bang for the buck. And they want awesome results in the first month. As an SEO in 2012, I’ve got to deliver big results – yesterday!
It really is a fascinating time to be in the SEO business. There have never been so many different ways to drive free traffic to your websites. Perhaps I misstated before.
Perhaps *this* is the Golden Age of SEO. We’re smarter as marketers. Our clients are smarter with their budgets. We’ve been around the block a few times. We know what works, and the Internet seems to be changing faster than ever before (Google’s Search Plus Your World, Twitter Brand Pages, Pinterest, etc.). It’s challenging. It’s always required effort and a smart approach. It’s not easy. But SEO is still SEO. It’s about content. It’s about links. It’s about connectedness. It’s about data and, most importantly, about driving results.
Let’s enjoy this age!
Article Author: Kerry Dean
Google today listed changes it made to its algorithm in January. As previously discussed, the biggest takeaway from that (at least in my opinion) was an increased focus on freshness through not only updates to the “Freshness Update,” but also through changes to universal search, which focus on the queries that deliver news results.
The company also addressed a recent Panda tweak:
High-quality sites algorithm improvements. [launch codenames “PPtl” and “Stitch”, project codename “Panda”] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda.
Google actually confirmed that this happened last week. Google reportedly said that there were no additional signals or actual changes to the algorithm, which would explain why there wasn’t a whole lot of fuss made about it, compared to Panda updates of the past.
Of course, there has been much more fuss about Google’s introduction of Search Plus Your World last month, which many have complained about with regards to its impact on Google search results relevancy.
Followers of the Panda saga may find this somewhat ironic, given that Google has spent nearly an entire year releasing Panda updates with the goal of improving search quality. Obviously this is not a goal Google is publicly backing away from, but some are questioning whether they’re placing a little more priority on making Google+ successful.
Article Author Chris Crum
Why are Google and WordPress so much in love? Why is WordPress the perfect platform for anyone who wants to learn search engine optimization? This week Alex Miranda, the Editor-in-Chief of the social media press release company PR Underground, spoke at the Hudson Valley WordPress Meetup group. The topic of discussion was the Google Freshness Update:
- Age of Content – Part of your freshness score is the age of content on your website. Google scores frequently updated content differently from content that does not change.
- Adding New Pages – Websites that add new pages at a higher rate may earn a higher freshness score than sites that add content less frequently.
- Changes to Important Content – Content above the fold is considered important.
- Ease of Use – If you know how to use email or a word processor, chances are you will have no problem updating your content.
- Speed – You can change content on any part of your site on the fly. You can even do it via your mobile phone with the WordPress App.
- Creating Pages & Posts – Creating a page is extremely simple. Every time you create a page or post, WordPress creates a permalink for that page.
This works perfectly with the Google Freshness Update because it is so easy to create new pages and update your content: Google is constantly indexing new pages, and you always have fresh content. Thank you, Google, for this update. <3
Query Deserves Freshness:
- The Query Deserves Freshness algorithm favors fresh content over old content.
- For hot, trending queries, QDF ranks fresh results ahead of older ones.
- News sites and blogs benefit most from this algorithm.
- Adding Fresh Content – Adding content is so easy that it’s addictive. Google loves fresh content and WordPress once again makes it easy to add.
- Plugins – There are many SEO plugins that will allow you to monitor your keywords and add proper title tags and meta descriptions. There are even plugins that will find keywords in your content for you. Three that come to mind are: All in One SEO, Rank Reporter, and Scribe SEO.
- Blogging – This is a major reason why we love you Google. WordPress is all about blogging and more!!!
It is almost as if Google’s Matt Cutts said, “Hey boss, let’s reward all those who use WordPress to blog.”
Google Authorship Program:
- Distinguish and validate your content in search results.
- There are a couple of steps which you have to take for this to happen.
- Coding – On WordPress, all you have to do is add this line of code at the top of your release, and it automatically creates your structured data in the correct format.
- Plugins – Even easier is adding the Allow REL= and HTML in Author Bios plugin. Then all you have to do is add the aforementioned line of code in your user bio and wait for Google to authenticate it. It’s that easy, my friends!!
So you see, my friends, it really looks like Google loves WordPress and has rewarded its users. This is huge for SEO beginners. There are many other factors I could point out. I think I will wait a bit and do some testing to see if I could beat my 5 minute time of ranking on top of Google with any keyword.
It’s a new year. Somebody needs to declare something dead.
Recently, I gave a presentation on Google’s Search+ in which I said ranking reports no longer matter because there is no such thing as consistent Google rankings.
What you see in the Google search results and what I see, for the exact same query, are likely to be two different sets of results, thanks to:
- Personalized search results based on our own search history
- Influenced search results based on our friends’ search histories
- Local search results
- Brand mentions in social media and on web pages
- Query deserves images, video, products, news or other types of search results
- Query deserves freshness
- Query deserves diversity
- Over 500 algorithm changes a year
In addition to all the other factors that Google rolls into its integrated universal search model, the uniform, link-based rankings as we once knew them are long dead.
As Google Web search and Search+ evolve, the results you and I see are more likely to differ. Useful ranking reports depend on consistency. Consistency is dead. We no longer have search engine rankings. We have search engine placements.
Okay, I am not really declaring ranking reports DOA. In fact, Google and Bing provide nice ranking reports in their respective webmaster tools.
On the plus side, these reports show counts for impressions and clicks, average rankings, and up or down movement. For a one-look rankings report, I like this format and wish I could pull the data with an API.
But I pull data from web analytics too, including:
- Exact match visits by each target keyword
- Phrase match visits by each target keyword
- Number of unique visitors from organic or non-paid search
- Number of keywords from organic or non-paid search
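The exact- and phrase-match counts above can be tallied from any keyword-level analytics export. A minimal sketch in Python; the sample rows and the helper name are hypothetical, since every analytics package labels these fields differently:

```python
# Sketch: tallying exact- and phrase-match organic visits for a target
# keyword from a hypothetical (keyword, visits) export.

def match_visits(rows, target):
    """Return (exact_visits, phrase_visits) for a target keyword."""
    target = target.lower().strip()
    # Exact match: the query is the target keyword, nothing more.
    exact = sum(v for kw, v in rows if kw.lower().strip() == target)
    # Phrase match: the target appears as a contiguous phrase in the query.
    phrase = sum(v for kw, v in rows if target in kw.lower())
    return exact, phrase

rows = [
    ("coffee house", 120),
    ("best coffee house in troy", 35),
    ("coffee grinder", 50),
]
print(match_visits(rows, "coffee house"))  # (120, 155)
```

Note that every exact-match visit also counts as a phrase-match visit, which is why the second number is the larger of the two.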
The problem with these webmaster tools and analytics reports is that they do not explain how or why different keywords appear in the search results without a lot of cross-referencing, checking short date ranges, and looking at the search results themselves.
See for yourself. Search Google for coffee house:
Search http://www.google.com/search?q=coffee+house. I am confident you will see local results with area cafes.
Now run this search: http://www.google.com/search?q=coffee+house&pws=0. Did your results change? The pws=0 turns off personalized results. If you are signed into Google, you should also visit https://www.google.com/history/ and remove, then pause, your Web history. I run all my searches with history turned off and pws=0.
Change the city to Troy, NY. Are any of the first page results the same?
Change the location to United States. Did the local listings and map disappear? Are you seeing different types of sites?
Differences in the search results because of a changing location are easy to understand even if they are not always easy to isolate. Some search marketing firms have even set up web proxies hosted in different cities to help them see the same results as people living in those communities.
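The depersonalized query above can also be built programmatically, which is handy when running the same check across many keywords. A small sketch using only the standard library; the pws=0 parameter comes from the article, while the helper name is mine:

```python
# Sketch: constructing a Google search URL with personalization turned
# off via the pws=0 parameter described in the article.
from urllib.parse import urlencode

def depersonalized_search_url(query):
    """Return a Google search URL with personalized results disabled."""
    return "http://www.google.com/search?" + urlencode({"q": query, "pws": 0})

print(depersonalized_search_url("coffee house"))
# http://www.google.com/search?q=coffee+house&pws=0
```

Using urlencode keeps multi-word queries safe (spaces become +), so the generated URL matches the hand-typed one above.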
The Social Network Impact
If your Google+ network or friends on other social media communities influence rankings, that’s more difficult to figure out.
If your next door neighbor links to a website from Google+ that may impact a handful of people, but what happens if AdAge or Lady Gaga mentions your brand or links to your site? What influence will that have and how do you find out?
One way is to look for activity on different social media sites.
- site:plus.google.com search engine land
- site:facebook.com search engine land
- site:twitter.com search engine land
- site:pinterest.com search engine land
- site:linkedin.com search engine land
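The list of site: queries above follows a fixed pattern, so generating it for any brand or keyword is a one-liner. A small sketch; the domain list mirrors the article, and nothing else is assumed:

```python
# Sketch: generating the site: queries from the article for any brand
# or keyword, one per major social media site.
SOCIAL_SITES = [
    "plus.google.com",
    "facebook.com",
    "twitter.com",
    "pinterest.com",
    "linkedin.com",
]

def social_site_queries(term):
    """Return a site:-restricted query string for each social site."""
    return [f"site:{domain} {term}" for domain in SOCIAL_SITES]

for q in social_site_queries("search engine land"):
    print(q)
```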
Search for brands and keywords then click on the results to get a feel for the conversations. The more massive the conversation and the more influential the writers, the more impact social is likely to have on search engine rankings.
Ask yourself how people discuss brands and keywords. Which brands get mentioned and linked to alongside keywords and why? Understanding the conversation will help you to plan your participation and inform your content on your social media accounts and on your website. This type of research may even help you with link building.
Another place to look for social media influence is in your web analytics. If links in social media are affecting search rankings, it is also likely that people are clicking on those links to visit your site. Search for referrals from all of the major social media sites.
If you cannot get this information from your analytics package, use the server logs for the website.
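Pulling social referrals out of raw server logs can be scripted. A sketch under the assumption that the log uses the common combined format, where the referrer is the second quoted field; the sample lines are invented:

```python
# Sketch: counting visits referred by major social sites from server
# log lines in combined format (referrer = second quoted field).
import re
from collections import Counter
from urllib.parse import urlparse

SOCIAL_DOMAINS = {"plus.google.com", "facebook.com", "twitter.com",
                  "pinterest.com", "linkedin.com"}

def social_referrals(log_lines):
    counts = Counter()
    for line in log_lines:
        quoted = re.findall(r'"([^"]*)"', line)  # request, referrer, agent
        if len(quoted) < 2:
            continue
        host = urlparse(quoted[1]).netloc.lower()
        host = host[4:] if host.startswith("www.") else host  # drop www.
        if host in SOCIAL_DOMAINS:
            counts[host] += 1
    return counts

logs = [
    '1.2.3.4 - - [30/Jan/2012] "GET / HTTP/1.1" 200 512 "http://twitter.com/somebody" "Mozilla"',
    '1.2.3.5 - - [30/Jan/2012] "GET / HTTP/1.1" 200 512 "http://www.facebook.com/page" "Mozilla"',
    '1.2.3.6 - - [30/Jan/2012] "GET / HTTP/1.1" 200 512 "-" "Mozilla"',
]
print(social_referrals(logs))  # Counter({'twitter.com': 1, 'facebook.com': 1})
```

Direct visits (referrer of "-") simply fall through, so the counts reflect only traffic arriving from the listed social sites.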
This is soft research. Visit the pages people are arriving from. Look for clues, not so much causes or correlations. The larger takeaway is that the old days of just looking at ranking reports and backlinks are over.
As search engines incorporate social media factors and collective intelligence more deeply into their ranking algorithms, awareness and understanding of different conversations becomes just as important as who is linking to whom.
Now over to you. How are you responding to Search+ and personalized results? How do you look for influences on the rankings? How are you using this information?
Article Author: Tom Schmitz
Google made a couple of AdWords-related announcements advertisers will want to take note of.
For one, they will soon offer ad group level impression share metrics for the search and display networks.
Advertisers will be able to use this data to better identify high performing ad groups that aren’t capturing the majority of available impressions, Google says.
Within the next few weeks, advertisers will start seeing three new columns that can be added to the ad groups tab:
- Impr. Share – the percentage of impressions you received divided by the estimated number of impressions you were eligible to receive.
- Lost IS (Rank) – the share of impressions lost due to your Ad Rank.
- Exact Match IS (search network only) – the percentage of impressions you received for searches that exactly matched your keyword divided by the estimated number of exact match impressions you were eligible to receive.
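The arithmetic behind these columns is simple division. A sketch with invented figures; only the formulas follow the definitions above:

```python
# Sketch of the impression share arithmetic. All figures are invented
# examples; only the ratio definitions come from the announcement.

def impression_share(received, eligible):
    """Impressions received divided by estimated eligible impressions."""
    return received / eligible

impressions = 8_000    # impressions actually received
eligible = 10_000      # impressions Google estimates you were eligible for
lost_to_rank = 2_000   # impressions lost because of Ad Rank

impr_share = impression_share(impressions, eligible)       # 0.8
lost_is_rank = impression_share(lost_to_rank, eligible)    # 0.2

# Exact Match IS considers only exact-match searches (search network only).
exact_received, exact_eligible = 3_000, 4_000
exact_match_is = impression_share(exact_received, exact_eligible)  # 0.75

print(f"{impr_share:.0%} {lost_is_rank:.0%} {exact_match_is:.0%}")  # 80% 20% 75%
```

In this example, Impr. Share plus Lost IS (Rank) accounts for all eligible impressions; in practice budget limits can claim a share as well.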
Additionally, Google is also planning an algorithm update to provide more accurate campaign impression share metrics. Changes will include: refined campaign-level stats and once-a-day updates.
“Since we are improving our algorithms, we will update all campaign-level impression share metrics back to May 2011,” says Katie Miller of Google’s Inside AdWords crew. “As a result, you will no longer be able to see campaign-level historical impression share metrics before May 2011.”
“In order to calculate your impression share metrics with a greater degree of accuracy, we will update all impression share metrics once per day (approximately noon Pacific Time [GMT-8]),” she says. “As a result, the impression share data that you see will not reflect impression share for the current day, and may not include the previous day’s impression share as well (depending on what time of the day you run your report).”
These changes will start rolling out globally on January 30.
In a separate announcement, Google announced that it has rolled out two improvements to Automated Rules in AdWords: an increase to the rule limit to 100 (from 10) and the ability to undo changes made by a rule.
“In the event that a rule doesn’t make the changes that you expect, you can easily undo them by clicking the ‘Undo’ button in the Logs table,” says product marketing manager Andrew Truong. “We hope that undo can help you feel more comfortable experimenting with Automated Rules. You can try out new strategies for a few days and quickly return to the state in which you started if you’re not happy with the performance.”
Article Author: Chris Crum