Wednesday, March 29, 2017

Your Daily SEO Fix: Keywords, Concepts, Page Optimization, and Happy NAPs

Posted by FeliciaCrawford

Howdy, readers! We're back with our last round of videos for this go of the Daily SEO Fix series. To recap, here are the other topics we've covered previously:

Today we'll be delving into more keyword and concept research, quick wins for on-page optimization, and a neat way to stay abreast of duplicates and inaccuracies in your local listings. We use Moz Pro, the MozBar, and Moz Local in this week's fixes.


Fix #1: Grouping and analyzing keywords by label to judge how well you're targeting a concept

The idea of "concepts over keywords" has been around for a little while now, but tracking rankings for a concept isn't quite as straightforward as it is for keywords. In this fix, Kristina shows you how to label groups of keywords to track and sort their rankings in Moz Pro so you can easily see how you're ranking for grouped terms, chopping and analyzing the data as you see fit.
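The grouping-by-label idea is easy to picture in code. Here's a minimal sketch (the keywords, labels, and rank positions are all hypothetical; Moz Pro handles this grouping for you in its interface) that rolls keyword rankings up to concept-level averages:

```python
# Roll keyword rankings up to concept level by label. All keywords,
# labels, and rank positions below are hypothetical.
from collections import defaultdict
from statistics import mean

rankings = [
    {"keyword": "running shoes", "label": "footwear",  "rank": 3},
    {"keyword": "trail runners", "label": "footwear",  "rank": 7},
    {"keyword": "rain jacket",   "label": "outerwear", "rank": 12},
    {"keyword": "down parka",    "label": "outerwear", "rank": 5},
]

by_label = defaultdict(list)
for row in rankings:
    by_label[row["label"]].append(row["rank"])

for label, ranks in sorted(by_label.items()):
    print(f"{label}: average rank {mean(ranks):.1f} across {len(ranks)} keywords")
```

Once rankings are grouped this way, you can chop the data however you like: averages per concept, best rank per concept, counts of page-one keywords per label, and so on.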


Fix #2: Adding alternate NAP details to uncover and clean up duplicate or inaccurate listings

If you work in local SEO, you know how important it is for listings to have an accurate NAP (name, address, phone number). When those details change for a business, it can wreak absolute havoc and confuse potential searchers. Jordan walks you through adding alternate NAP details in Moz Local to make sure you uncover and clean up old and/or duplicate listings, making closure requests a breeze. (This Whiteboard Friday is an excellent explanation of why that's really important; I like it so much that I link to it in the resources below, too. ;)
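Conceptually, matching listings against both your current and alternate NAP details is a normalize-and-compare problem. Here's a hedged sketch with made-up business details; Moz Local does this matching for you at scale:

```python
# Flag duplicate or inaccurate local listings by comparing each one's
# normalized NAP against the current NAP and known alternates. All
# business names, addresses, and phone numbers are made up.
import re

def normalize(nap):
    """Lowercase and strip punctuation/whitespace from each NAP field."""
    return tuple(re.sub(r"[^a-z0-9]", "", field.lower()) for field in nap)

current_nap = ("Roadside Cafe", "123 Main St", "555-0100")
alternate_naps = [("Road Side Cafe", "123 Main Street", "555-0199")]  # old details

listings = [
    ("Roadside Cafe", "123 Main St.", "555-0100"),
    ("Road Side Cafe", "123 Main Street", "555-0199"),
]

alternates = {normalize(n) for n in alternate_naps}
results = {}
for listing in listings:
    key = normalize(listing)
    if key == normalize(current_nap):
        status = "accurate"
    elif key in alternates:
        status = "stale duplicate (old NAP); request closure"
    else:
        status = "inaccurate; review"
    results[listing] = status
    print(listing[0], "->", status)
```

The second listing matches an alternate (old) NAP rather than the current one, which is exactly the kind of stale duplicate you'd want to submit a closure request for.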

Remember, you can always use the free Check Listing tool to see how your local listings and NAP are popping up on search engines:

Is my NAP accurate?


Fix #3: Research keywords and concepts to fuel content suggestions — on the fly

You're already spying on your competitors' sites; you might as well do some keyword research at the same time, right? Chiaryn walks you through how to use MozBar to get keyword and content suggestions and discover how highly ranking competitor sites are using those terms. (Plus a cameo from Lettie Pickles, star of our 2015 Happy Holidays post!)


Fix #4: Discover whether your pages are well-optimized as you browse — then fix them with these suggestions

A fine accompaniment to your on-the-go keyword research is on-the-go on-page optimization. (Try saying that five times fast.) Janisha gives you the low-down on how to check whether a page is well-optimized for a keyword and identify which fixes you should make (and how to prioritize them) using the SEO tool bar.
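To make the idea concrete, here's a toy version of an on-page check. The page content and check list are hypothetical, and the real SEO toolbar weighs many more signals, but it shows the basic pass/fix pattern:

```python
# Toy on-page check: does the target keyword appear in a handful of
# key page elements? Page content and URL are hypothetical; real
# tools weigh many more signals and their relative importance.
def on_page_checks(keyword, page):
    kw = keyword.lower()
    return {
        "keyword in <title>": kw in page["title"].lower(),
        "keyword in H1":      kw in page["h1"].lower(),
        "keyword in URL":     kw.replace(" ", "-") in page["url"].lower(),
        "keyword in body":    kw in page["body"].lower(),
    }

page = {
    "title": "Espresso Machines: A Buyer's Guide",
    "h1": "Choosing an Espresso Machine",
    "url": "https://example.com/espresso-machines",
    "body": "Our guide to espresso machines covers budget and pro models.",
}

checks = on_page_checks("espresso machines", page)
for name, passed in checks.items():
    print("PASS" if passed else "FIX ", name)
```

Here the H1 uses the singular "machine," so it fails the exact-phrase check; that's the kind of specific, prioritized fix the video walks through.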


Further reading & fond farewells

I've got a whole passel of links if you're interested in reading more educational content around these topics. And by "reading," I mean "watching," because I really stacked the deck with Whiteboard Fridays this time. Here you are:

And of course, if you need a better handle on all this SEO stuff and reading blog posts just doesn't cut the mustard, we now offer classes that cover all the essentials.

My sincere thanks to all of you tuning in to check out our Daily SEO Fix video series over the past couple of weeks — it's been fun writing to you and hearing from you in the comments! Be sure to keep those ideas and questions comin' — we're listening.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Friday, March 17, 2017

Ranking Multiple Domains to Own More SERP Real Estate - Whiteboard Friday

Posted by randfish

Is it better to rank higher in a single position frequently, or to own more of the SERP real estate consistently? The answer may vary. In today's Whiteboard Friday, Rand presents four questions you should ask to determine whether this strategy could work for you, shares some high-profile success cases, and explores the best ways to go about ranking more than one site at a time.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about ranking multiple domains so you can own a bunch of the SERP real estate and whether you should do that, how you should do that, and some ways to do that.

I'll show you an example, because I think that will help kick us off. So you are almost certainly familiar, if you've played around in the world of real estate SERPs, with Zillow and Trulia. Zillow started up here in Seattle. They bought Trulia a couple of years ago and have been doing pretty amazingly well. In fact, I was speaking at a real estate conference in New York recently, and my God, I did an example where I was searching for tons of cities plus homes for sale or plus real estate or houses, and Zillow and Trulia, along with a couple others, are in the top five for every single city I checked no matter how big or small. So very, very impressive SEO.

One of the things that a lot of SEOs have seen, not just with Zillow and Trulia but with a few others like them, is that they own multiple listings in the SERPs. They dominate the real estate here and, as a combined entity, get even more clicks than they would have if Zillow had simply redirected Trulia.com to Zillow when they bought Trulia. Here on Whiteboard Friday, at Moz, and across the SEO world, the usual recommendation when you buy another domain or combine entities is to 301 redirect, because it can help bring up the rankings.

The reason Zillow did not do that, and I think wisely so, is that they already dominated these SERPs so well that they figured pushing Trulia's rankings into their own and combining the two entities would, yes, probably move them from number two and three to number one in some places, but they already own number one in a ton of these. Trulia was almost always one or two or three. Why not own all of that? Why not own 66% of the top three consistently, rather than number one a little more frequently? I think that was probably the right move for them.

Questions to ask

As a result, many SEOs asked themselves, "Should I do something similar? Should I buy other domains, or should I start other domains? Should I run multiple sites and try and rank for many different keyword phrases or a few keywords that I care very, very deeply about?" The answer is, well, before you do that, before you make any call, ask yourself these four questions. The answers to them will help you determine whether you should follow in these footsteps.

1. Do I need to dominate multiple results for a keyword or set of keywords MORE than I need better global rankings or a larger set of keywords sending visits?

So first off, do you need to dominate multiple results for a keyword or a small set of keywords more than you need to improve global rankings? By global rankings, I mean rankings across all the keywords your site ranks for now or could potentially rank for — a larger set of keywords that send visits and traffic.

You kind of have to weigh these two things. It's either: Do I want two out of the top three results to be mine for this one keyword, or do I want these 10 keywords that I'm ranking for to broadly move up in rankings generally?
A lot of the time, this will bias you to go, "Wait a minute, no, the opportunity is not in these few keywords where I could dominate multiple positions. It's in moving up the global rankings and making my ability to rank for any set of keywords greater."

Even at Moz today, Moz does very well in the rankings for a lot of terms around SEO. But let's say, for example, that we were purchased by Search Engine Land or we bought Search Engine Land. If those two entities were combined (and granted, we do rank for many, many similar keywords), we would probably not keep them separate. We would probably combine them, because the opportunity is still greater in combination than it is in dominating multiple results the way Zillow and Trulia are. This is a pretty rare circumstance.

2. Will I cannibalize link equity opportunities with multiple sites? Can I get enough link equity & authority signals to rank both?

Second, are you going to cannibalize link equity opportunities with multiple sites, and do you have the ability to get enough equity and authority signals to rank both domains or all three or all four or whatever it is?

A challenge that many SEOs encounter is that building links and building up the authority to rank is actually the toughest part of the SEO equation. The keyword targeting and ranking multiple domains, that's nice to have, but first you've got to build up a site that's got enough link equity. If it is challenging to earn links, maybe the answer is, hey, we should combine all our efforts into one site. Remember, even though Zillow owns Trulia, since Trulia and Zillow are one entity, the links between them don't help the other one rank very much. It was already the case, before Zillow bought them, that Trulia and Zillow independently ranked. The two sites offer different experiences, some different listings, and all that kind of stuff.

There are reasons why Google keeps them separate and why Zillow keeps the two sites separate. But that's going to be really tough. If you're a smaller business or a smaller website starting out and you're trying to decide where to put your link equity efforts, it might lean a little more this way.

3. Should I use my own domain(s), should I buy an existing site that ranks, or should I do barnacle SEO?

Number three. Should you use your own domain if you decide that you need multiple domains ranking for a single keyword? A good example of this scenario is reputation management for your own brand name, for someone who works at your company, or for some particular product that you make. Whatever it is, you're very, very focused and you know, "Hey, this one keyword matters more than everything else that we do."

Okay. Now the question would be: Should you use your own domain or a new domain that you buy and register and start building up? Should you buy an existing domain, something that already ranks, or should you do barnacle SEO? So mysite2.com: that would basically be registering a new domain, building it up from scratch, growing that brand, and trying to build all the signals that you'll need.

You could buy a competitor that's already ranking in the search results, that already has equity and ranking ability. Or you could say, "Hey, we see that this Quora question is doing really well. Can we answer that question tremendously well?" Or, "We see that Medium can perform tremendously well here. You know what? We can write great posts on Medium." "We see that LinkedIn does really well in this sector. Great. We can do some publishing on LinkedIn." Or, "There's a list of companies on this page. We can make sure that we're the number-one listed company on that page." Okay. That kind of barnacle SEO, we did a Whiteboard Friday about that a few months ago, and you can check that out too.

4. Will my multi-domain strategy cost time/money that would be better spent on boosting my primary site's marketing? Will those efforts cause brand dilution or sacrifice potential brand equity?

And number four, last but not least, will your multi-domain strategy cost you time and money that would be better spent on boosting your primary site's marketing efforts? It is the case that you're going to sacrifice something if you're putting effort into a different website versus putting all your marketing efforts into one domain.

Now, one reason that people certainly do this is because they're trying riskier tactics with the second site. Another reason is because they've already dominated the rankings as much as they want, or because they're trying to build up multiple properties so that they can sell one off. Or they're already very, very good at link building in this space and at growing equity, those sorts of things.

But the other question you have to ask is: Will this cause brand dilution? Or is it going to sacrifice potential brand equity? One of the things that we've observed in the SEO world is that rankings alone do not make for revenue. It is absolutely the case that people are choosing which domains to click on and which domains to buy from and convert on based on the brand and their brand familiarity. When you're building up a second site, you've got to be building up a second brand. So that's an additional cost and effort.

Now, I don't want to rain on the entire parade here. Like we've said in a few of these, there are reasons why you might want to consider multiple domains and reasons why a multi-domain strategy can be effective for some folks. It's just that I think it might be a little less often and should be undertaken with more care and attention to detail and to all these questions than what some folks might be doing when they buy a bunch of domains and hope that they can just dominate the top 10 right out of the gate.

All right, everyone, look forward to your thoughts on multi-domain strategies, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Thursday, March 16, 2017

Giving Away the Farm: Proposal Development for New SEO Agencies

Posted by BrianChilds

There's a huge difference between making money from selling SEO and actually making a living — or making a difference, for that matter. A new marketing agency will quickly discover that surviving on $1,000 contracts is challenging. It takes time to learn the client and their customers, and poorly written contracts can lead to scope creep and dissatisfied clients.

It's common for agencies to look for ways to streamline operations to assist with scaling their business, but one area you don't want to streamline is the proposal research process. I actually suggest going in the opposite direction: create proposals that give away the farm.

Details matter, both to you and your prospective client

I know what you’re thinking: Wait a minute! I don’t want to do a bunch of work for free!

I too am really sensitive to the idea that a prospective client may attempt to be exploitative. I think it's a risk worth taking. Outlining the exact scope of services forces you to do in-depth research on your prospect’s website and business, to describe in detail what you're going to deliver. Finding tools and processes to scale the research process is great, but don’t skip it. Detailing your findings builds trust, establishes your team as a high-quality service provider, and will likely make you stand out amongst a landscape of standard-language proposals.

Be exceptional. Here's why I think this is particularly important for the proposal development process.

Avoid scope creep & unrealistic expectations

Just like the entrepreneur that doesn’t want to tell anyone their amazing idea without first obtaining an NDA, new SEO agencies may be inclined to obscure their deliverables in standard proposal language out of fear that their prospect will take their analysis and run. Generic proposal language is sometimes also used to reduce the time and effort involved in getting the contract out the door.

This may result in two unintended outcomes:

  1. Lack of specific deliverables can lead to contract scope creep.
  2. It can make you lazy and you end up walking into a minefield.

Companies that are willing to invest larger sums of money in SEO tend to have higher expectations, and this cuts both ways. Putting in the work to craft a detailed proposal not only shows that you actually care about their business, but it also helps manage the contract's inevitable growth when you're successful.

Misalignment of goals or timelines can sour a relationship quickly. Churn in your contracts is inevitable, but it's much easier to increase your annual revenue by retaining a client for a few more months than trying to go out and find a replacement. Monetizing your work effectively and setting expectations is an excellent way to make sure the relationship is built on firm ground.

Trust is key

Trust is foundational to SEO: building trustworthy sites, creating valuable and trustworthy content, becoming a trusted resource for your community that's worth linking to. Google rewards this kind of intent.

Trust is an ethos; as an SEO, you're a trust champion. You can build trust with a prospect by being transparent and providing overwhelming value in your proposal. Tell your clients exactly what they need to do based on what you discover in your research.

This approach also greases the skids a little when approaching the prospect for the first time. Imagine the difference between a first touch with your prospect when you request a chance to discuss research you’ve compiled, versus a call to simply talk about general SEO value. By developing an approach that feels less like a sales process, you can navigate around the psychological tripwires that make people put up barriers or question your trustworthiness.

This is also referred to as "consultative sales." Some best practices that business owners typically respond well to are:

  • Competitive research. A common question businesses will ask about SEO relates to keywords: What are my competitors ranking for? What keywords have they optimized their homepage for? One thing I like to do is plug the industry leader’s website into Open Site Explorer and show what content is generating the most links. Exporting the Top Pages report from OSE makes for a great leave-behind.
  • Top questions people are asking. Research forum questions that relate to the industry or products your prospect sells. When people ask questions on Yahoo Answers or Quora, they're often doing so because they can’t find a good answer using search. A couple of screenshots can spark a discussion around how your prospective client’s site can add value to those online discussions.

Yes, by creating a more detailed proposal you do run the risk that your target company will walk away with the analysis. But if you suspect that the company is untrustworthy, then I'd advise walking away before even building the analysis in the first place; just try getting paid on time from an untrustworthy company.

Insights can be worth more

By creating a very transparent, "give away the farm"-type document, SEOs empower themselves to have important discussions prior to signing a contract. Things like:

  • What are the business goals this company wants to focus on?
  • Who are the people they want to attract?
  • What products or pages are they focused on?

You’ll have to understand at least this much to set up appropriate targeting, so all the better to document this stuff beforehand. And remember, having these conversations is also an investment in your prospect’s time — and there's some psychology around getting your target company to invest in you. It's called "advancement" of the sale. By getting your prospect to agree to a small, clearly defined commitment, it pulls them further down the sales funnel.

In the case of research, you may choose to ask the client for permission to conduct further research and report on it at a specified time in the future. You can use this as an opportunity to anchor a price for what that research would cost, which frames the scope of service prices later on.

By giving away the farm, you'll start off the relationship as a trusted advisor. And even if you don’t get the job to do the SEO work itself, it's possible you can develop a retainer where you help your prospect manage digital marketing generally.

Prepping the farm for sale

It goes without saying, but making money from SEO requires having the right tools for the job. If you're brand-new to the craft, I suggest practicing by auditing a small site. (Try using the site audit template we provide in the site audit bootcamp.) Get comfortable with the tools, imagine what you would prioritize, and maybe even do some free work for a site to test out how long it takes to complete relatively small tasks.

Imagine you were going to approach that website and suggest changes. Ask yourself:

  • Who are they selling to?
  • What keywords and resources does this target user value?
  • What changes would you make that would improve search rank position for those terms?
  • What would you do first?
  • How long would it take? (In real human time, not starving-artist-who-never-sleeps time.)

Some of the tools that I find most helpful are:

  • Moz Pro Campaigns > Custom Reports. This is an easy one. Create a Moz Pro campaign (campaigns are projects that analyze the SEO performance of a website over time) and then select “Custom Reports” in the top-right of the Campaign interface. Select the modules you want to include — site crawl and keyword rankings against potential competitors are good ones — and then offer to send this report to your prospect for free. It's a lot harder for a customer to turn something off than it is to turn something on. Give away a custom report and then set up time to talk through the results on a weekly basis.
  • Builtwith.com. This free service allows you to investigate a number of attributes related to a website, including the marketing software installed. Similar to a WHOIS search, I use this to understand whether the prospect is overloaded with software or if they completely lack any marketing automation. This can be helpful for suggesting tools that will improve their insights immediately. Who better to help them implement those tools or provide a discount than you?
  • Keyword Explorer > Lists. Create a list in Keyword Explorer and look for the prevalence of SERP features. This can tell you a lot about what kinds of content are valuable to their potential visitor. Do images show up a lot? What about videos? These could be opportunities for your customer.
  • MozBar. Use the Page Analysis tab in MozBar to assess some of the website’s most important pages. Check page load speed in the General Attributes section. Also see if they have enticing titles and descriptions.
  • Site crawl. If you don’t have Moz Pro, I recommend downloading Screaming Frog. It can crawl up to 500 pages on a site for free and then allow you to export the results into a .csv file. Look for anything that could be blocking traffic to the site or reducing the chance that pages are getting indexed, such as 4XX series errors or an overly complex robots.txt file. Remedying these can be quick wins that provide a lot of value. If you start a Moz Pro campaign, you can see how these issues are reduced over time.
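As a concrete example of that last step, here's a small sketch of triaging a crawl export for 4XX errors. The column names ("Address", "Status Code") follow typical Screaming Frog exports but should be treated as assumptions; adjust them to your file:

```python
# Triage a crawl export: list URLs returning 4XX client errors.
# Column names follow a typical Screaming Frog CSV; treat them as
# assumptions and adjust to your export.
import csv
import io

# Stand-in for open("internal_all.csv") on a real export.
crawl_csv = io.StringIO(
    "Address,Status Code\n"
    "https://example.com/,200\n"
    "https://example.com/old-page,404\n"
    "https://example.com/contact,200\n"
    "https://example.com/gone,410\n"
)

broken = [
    row["Address"]
    for row in csv.DictReader(crawl_csv)
    if 400 <= int(row["Status Code"]) < 500
]
print(broken)
```

A list like this makes a tidy leave-behind too: each broken URL is a quick win you can point to in the proposal.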

Want to learn how to add SEO to your existing portfolio of marketing services?

Starting on April 4th, 2017, Moz is offering a 3-day training seminar on How to Add SEO to Your Agency. This class will be every Tuesday for 3 weeks and will cover some of the essentials for successfully bringing SEO into your portfolio.

Sign up for the seminar!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Your Daily SEO Fix: Link Building & Ranking Zero

Posted by FeliciaCrawford

Last week, we shared a series of four short videos intended to help you accomplish some easy wins using Moz Pro: Your Daily SEO Fix: The Keyword Research Edition. Week Two (that's this week!) is focused on link building, identifying opportunities to take over SERP features, and doing that all-important competitive research.

This time around, we're using a mix of Open Site Explorer, Fresh Web Explorer, and Moz Pro. Open Site Explorer has some free capabilities, so if you'd like to follow along...

Open OSE in a new tab!

If you're a Moz Pro subscriber, crack open your campaigns and settle in. If you'd like to see what all the fuss is about without committing, you can dip your toes in with a free 30-day trial. And now that that's out of the way, let's get started!


Fix #1: Link building & brand building via unlinked mentions

"Moz" is an SEO software company, yes, but it's also Morissey's nickname and short for "Mozambique." All three of those things get mentioned around the web a bunch on any given day, but if we want to identify link building opportunities just to our site, it could get confusing quick. Luckily, Jordan's here to explain how to quickly find unlinked mentions of your site or brand using Open Site Explorer and keep those pesky Smiths references out of your results.


Fix #2: Prioritizing and organizing your link building efforts

Link building requires more than just finding opportunities, of course. April shows how you can prioritize your efforts by identifying the most valuable linking opportunities in Open Site Explorer, then dives into how you can cultivate a continuous stream of fresh related content ripe for a link-back with Fresh Web Explorer.


Fix #3: Ranking in position zero with SERP features in Moz Pro

If you have keywords that aren't ranking in the first few results pages, don't despair — there's hope yet. There are tons of opportunities to rank above the first organic result with the prevalence of SERP features. In this video, Ellie shows how you can identify keywords that need some love, track SERP feature opportunities for them, filter your keywords to show only those that surface certain SERP features, and more.
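The filtering step can be illustrated with a small sketch (the keyword data is hypothetical; Moz Pro exposes this as a point-and-click filter): find tracked keywords that trigger a featured snippet but currently rank outside the top 10, i.e. position-zero opportunities:

```python
# Filter tracked keywords to position-zero opportunities: those that
# trigger a featured snippet but currently rank outside the top 10.
# All keyword data below is hypothetical.
keywords = [
    {"keyword": "what is seo",   "rank": 12, "features": {"featured_snippet", "paa"}},
    {"keyword": "seo tools",     "rank": 4,  "features": {"ads"}},
    {"keyword": "how to do seo", "rank": 15, "features": {"featured_snippet"}},
]

opportunities = [
    k["keyword"]
    for k in keywords
    if "featured_snippet" in k["features"] and k["rank"] > 10
]
print(opportunities)
```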


Fix #4: Gleaning insights from your competitors' backlink profiles

Remember April from Fix #2? She's back and ready to show you how to get the skinny on your competitors' juicy backlink profiles using both your Moz Pro campaign and Open Site Explorer.


One step beyond

That wraps up our latest week of fixes! We've got one last round coming at you next Thursday. As always, if you're curious and want to follow along, you can try it all out firsthand by taking a free trial of Moz Pro. We also offer several SEO bootcamp courses that can get you started on fundamentals if this whole SEO thing is pretty new to you.

If you're looking for some more meaty info on these topics, I've put together a short list of light reading for you:

Thanks for reading along, friends, and we'll see you again for the last installment of the Daily SEO Fix series next week!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, March 14, 2017

The State of Searcher Behavior Revealed Through 23 Remarkable Statistics

Posted by randfish

One of the marketing world's greatest frustrations has long been the lack of data from Google and other search engines about the behavior of users on their platforms. Occasionally, Google will divulge a nugget of bland, hard-to-interpret information about how they process more than X billion queries, or how many videos were uploaded to YouTube, or how many people have found travel information on Google in the last year. But these numbers aren't specific enough or well-sourced enough, nor do they provide enough detail, to be truly useful for all the applications we have.

Marketers need to know things like: How many searches happen each month across various platforms? Is Google losing market share to Amazon? Are people really starting more searches on YouTube than Bing? Is Google Images more or less popular than Google News? What percent of queries are phrased as questions? How many words are in the average query? Is it more or less on mobile?

These kinds of specifics help us know where to put our efforts, how to sell our managers, teams, and clients on SEO investments, and, when we have this data over time, we can truly understand how this industry that shapes our livelihoods is changing. Until now, this data has been somewhere between hard and impossible to estimate. But, thanks to clickstream data providers like Jumpshot (which helps power Moz's Keyword Explorer and many of our keyword-based metrics in Pro), we can get around Google's secrecy and see the data for ourselves!

Over the last 6 months, Russ Jones and I have been working with Jumpshot's Randy Antin, who's been absolutely amazing — answering our questions late at night, digging in with his team to get the numbers, and patiently waiting while Russ runs fancy T-Distributions on large datasets to make sure our estimates are as accurate as possible. If you need clickstream data of any kind, I can't recommend them enough.

If you're wondering, "Wait... I think I know what clickstream data is, but you should probably tell me, Rand, just so I know that you know," OK. :-) Clickstream monitoring means Jumpshot (and other companies like them — SimilarWeb, Clickstre.am, etc.) have software on the device that records all the pages visited in a browser session. They anonymize and aggregate this data (don't worry, your searches and visits are not tied to you or to your device), then make parts of it available for research or use in products or through APIs. They're not crawling Google or any other sites, but rather seeing the precise behavior of devices as people use them to surf or search the Internet.

Clickstream data is awesomely powerful, but when it comes to estimating searcher behavior, we need scale. Thankfully, Jumpshot can deliver here, too. Their US panel of Internet users is in the millions (they don't disclose the exact size, but it's between 2 and 10 million), so we can trust these numbers to reliably paint a representative picture. That said, there may still be biases in the data: it could be that certain demographics of Internet users are more or less likely to be in Jumpshot's panel, their mobile data is limited to Android (no iOS), and we know that some alternative kinds of searches aren't captured by their methodology. Still, there's amazing stuff here, and it's vastly more than we've been able to get any other way, so let's dive in.

23 Search Behavior Stats

Methodology: All of the data was collected from Jumpshot's multi-million user panel in October 2016. T-distribution scaling was applied to validate the estimates of overall searches across platforms. All other data is expressed as percentages. Jumpshot's panel includes mobile and desktop devices in similar proportions, though no devices are iOS, so users on Macs, iPhones, and iPads are not included.

#1: How many searches are *really* performed on Google.com each month?

On the devices and types of queries Jumpshot can analyze, there were an average of 3.4 searches/day/searcher. Using the T-Distribution scaling analysis on various sample set sizes of Jumpshot's data, Russ estimated that the most likely reality is that between 40–60 billion searches happen on Google.com in the US each month.

Here's more detail from Russ himself:

"...All of the graphs are non-linear in shape, which indicates that as the samples get bigger we are approaching correct numbers but not in a simple % relationship... I have given 3 variations based on the estimated number of searches you think happen in the US annually. I have seen wildly different estimates from 20 billion to 100 billion, so I gave a couple of options. My gut is to go with the 40 billion numbers, especially since once we reach the 100MM line for 40 and 60B, there is little to no increase for 1 billion keywords, which would indicate we have reached a point where each new keyword is searched just 1 time."

How does that compare to numbers Google's given? Well, in May of 2016, Google told Search Engine Land they "processed at least 2 trillion searches per year." Using our Jumpshot-based estimates, and assuming October of 2016 was a reasonably average month for search demand, we'd get to 480–720 billion annual searches. That's less than half of what Google claims, but Google's number is WORLDWIDE! Jumpshot's data here is only for the US. This suggests that, as Danny Sullivan pointed out in the SELand article, Google could well be handling much, much more than 2 trillion annual searches.
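The annualization above is simple arithmetic; here's a quick sketch. The monthly range comes from Russ's T-distribution estimate, and treating October as an average month is the post's assumption, not a measured fact:

```python
# Annualize the Jumpshot-based US estimate and compare it to Google's
# stated worldwide figure. Assumes October 2016 was a roughly average month.
monthly_low, monthly_high = 40e9, 60e9   # estimated US Google.com searches/month

annual_low = monthly_low * 12            # 480 billion/year
annual_high = monthly_high * 12          # 720 billion/year

# Google's stated worldwide volume: "at least 2 trillion searches per year"
worldwide = 2e12
print(f"US estimate: {annual_low / 1e9:.0f}-{annual_high / 1e9:.0f}B/year")
print(f"US share of the 2T floor: {annual_low / worldwide:.0%}-{annual_high / worldwide:.0%}")
```

That puts the US at roughly a quarter to a third of Google's stated worldwide floor, consistent with the "less than half" observation above.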

Note that we believe our 40–60 billion/month number is actually too low. Why? Voice searches, searches in the Google app and Google Home, higher search use on iOS (all four of which Jumpshot can't measure), October could be a lower-than-average month, some kinds of search partnerships, and automated searches that aren't coming from human beings on their devices could all mean our numbers are undercounting Google's actual US search traffic. In the future, we'll be able to measure interesting things like growth or shrinkage of search demand as we compare October 2016 vs other months.

#2: How long is the average Google search session?

From the time of the initial query to the loading of the search results page and the selection of any results, plus any back button clicks to those SERPs and selection of new results, the all-in average was just under 1 minute. If that seems long, remember that some search sessions may be upwards of an hour (like when I research all the best ryokans in Japan before planning a trip — I probably clicked 7 pages deep into the SERPs and opened 30 or more individual pages). Those long sessions are dragging up that average.

#3: What percent of users perform one or more searches on a given day?

This one blew my mind! Of the millions of active, US web users Jumpshot monitored in October 2016, only 15% performed at least one search on a given day. 45% performed at least one query in a week, and 68% performed one or more queries that month. To me, that says there's still a massive amount of search growth opportunity for Google. If they can make people more addicted to and more reliant on search, as well as shape the flow of information and the needs of people toward search engines, they are likely to have a lot more room to expand searches/searcher.

#4: What percent of Google searches result in a click?

Google is answering a lot of queries themselves. From searches like "Seattle Weather," to more complicated ones like "books by Kurt Vonnegut" or "how to remove raspberry stains?", Google is trying to save you that click — and it looks like they're succeeding.

66% of distinct search queries resulted in one or more clicks on Google's results. That means 34% of searches get no clicks at all. If we look at all search queries (not just distinct ones), those numbers shift to a straight 60%/40% split. I wouldn't be surprised to find that over time, we get closer and closer to Google solving half of search queries without a click. BTW — this is the all-in average, but I've broken down clicks vs. no-clicks on mobile vs. desktop in #19 below.

#5: What percent of clicks on Google search results go to AdWords/paid listings?

It's less than I thought, but perhaps not surprising given how aggressive Google's had to be with ad subtlety over the last few years. Of distinct search queries in Google, only 3.4% resulted in a click on an AdWords (paid) ad. If we expand that to all search queries, the number drops to 2.6%. Google's making a massive amount of money on a small fraction of the searches that come into their engine. No wonder they need to get creative (or, perhaps more accurately, sneaky) with hiding the ad indicator in the SERPs.

#6: What percent of clicks on Google search results go to Maps/local listings?

This is not measuring searches and clicks that start directly from maps.google.com or from the Google Maps app on a mobile device. We're talking here only about Google.com searches that result in a click on Google Maps. That number is 0.9% of Google search clicks, just under 1 in 100. We know from MozCast that local packs show up in ~15% of queries (though that may be biased by MozCast's keyword corpus).

#7: What percent of clicks on Google search results go to links in the Knowledge Graph?

Knowledge panels are hugely popular in Google's results — they show up in ~38% of MozCast's dataset. But they're not nearly as popular for search click activity, earning only ~0.5% of clicks.

I'm not totally surprised by that. Knowledge panels are, IMO, more about providing quick answers and details to searchers than they are about drawing the click themselves. If you see Knowledge Panels in your SERPs, don't panic too much that they're taking away your CTR opportunity. This made me realize that Keyword Explorer is probably overestimating the degree to which Knowledge Panels remove organic CTR (e.g. Alice Springs, which has only a Knowledge Panel next to 10 blue links, has a CTR opportunity of 64).

#8: What percent of clicks on Google search results go to image blocks?

Images are one of the big shockers of this report overall (more on that later). While MozCast has image blocks in ~11% of Google results, Jumpshot's data shows images earn 3% of all Google search clicks.

I think this happens because people are naturally drawn to images and because Google uses click data to specifically show images that earn the most engagement. If you're wondering why your perfectly optimized image isn't ranking as well in Google Images as you hoped, we've got strong suspicions and some case studies suggesting it might be because your visual doesn't draw the eye and the click the way others do.

If Google only shows compelling images and only shows the image block in search results when they know there's high demand for images (i.e. people search the web, then click the "image" tab at the top), then little wonder images earn strong clicks in Google's results.

#9: What percent of clicks on Google search results go to News/Top Stories results?

Gah! We don't know for now. This one was frustrating and couldn't be gathered due to Google's untimely switch from "News Results" to "Top Stories," some of which happened during the data collection period. We hope to have this in the summer, when we'll be collecting and comparing results again.

#10: What percent of clicks on Google search results go to Twitter block results?

I was expecting this one to be relatively small, and it is, though it slightly exceeded my expectations. MozCast has tweet blocks showing in ~7% of SERPs, and Jumpshot shows those tweets earning ~0.23% of all clicks.

My guess is that the tweets do very well for a small set of search queries, and tend to be shown less (or shown lower in the results) over time if they don't draw the click. As an example, search results for my name show the tweet block between organic position #1 and #2 (either my tweets are exciting or the rest of my results aren't). Compare that to David Mihm, who tweeted very seldom for a long while and has only recently been more active — his tweets sit between positions #4 and #5. Or contrast with Dr. Pete, whose tweets are above the #1 spot!

#11: What percent of clicks on Google search results go to YouTube?

Technically, there are rare occasions when a video from another provider (usually Vimeo) can appear in Google's SERPs directly. But more than 99% of videos in Google come from YouTube (which is anti-competitive IMO, but since Google pays off so many elected representatives, it's likely not an issue for them). Thus, we chose to study only YouTube rather than all video results.

MozCast shows videos in 6.3% of results, just below tweets. In Jumpshot's data, YouTube's engagement massively over-performed its raw visibility, drawing 1.8% of all search clicks. Clearly, for those searches with video intent behind them, YouTube is delivering well.

#12: What percent of clicks on Google search results go to personalized Gmail/Google Mail results?

I had no guess at all on this one, and it's rarely discussed in the SEO world because it's relatively difficult to influence and fairly obscure. We don't have tracking data via MozCast because these only show in personalized results for folks logged in to their Gmail accounts when searching, and Google chooses to only show them for certain kinds of queries.

Jumpshot, however, thanks to clickstream tracking, can see that 0.16% of search clicks go to Gmail or Google Mail following a query, only a little under the number of clicks to tweets.

#13: What percent of clicks on Google search results go to Google Shopping results?

The Google Shopping ads have become pretty compelling — the visuals are solid, the advertisers are clearly spending lots of effort on CTR optimization, and the results, not surprisingly, reflect this.

MozCast has Shopping results in 9% of queries, while clickstream data shows those results earning 0.55% of all search clicks.

#14: What percent of Google searches result in a click on a Google property?

Google has earned a reputation over the last few years of taking an immense amount of search traffic for themselves — from YouTube to Google Maps to Gmail to Google Books and the Google App Store on mobile, and even Google+, there's a strong case to be made that Google's eating into opportunity for 3rd parties with bets of their own that don't have to play by the rules.

Honestly, I'd have estimated this in the 20–30 percent range, so it surprised me to see that, from Jumpshot's data, all Google properties earned only 11.8% of clicks from distinct searches (only 8.4% across all searches). That's still significant, of course, and certainly bigger than it was 5 years ago, but given that we know Google's search volume has more than doubled in the last 5 years, we have to be intellectually honest and say that there's vastly more opportunity in the crowded-with-Google's-own-properties results today than there was in the cleaner-but-lower-demand SERPs of 5 years ago.

#15: What percent of all searches happen on any major search property in the US?

I asked Jumpshot to compare 10 distinct web properties, add together all the searches they receive combined, and share the percent distribution. The results are FASCINATING!

Here they are in order:

  1. Google.com 59.30%
  2. Google Images 26.79%
  3. YouTube.com 3.71%
  4. Yahoo! 2.47%
  5. Bing 2.25%
  6. Google Maps 2.09%
  7. Amazon.com 1.85%
  8. Facebook.com 0.69%
  9. DuckDuckGo 0.56%
  10. Google News 0.28%

I've also created a pie chart to help illustrate the breakdown:

Distribution of US Searches October 2016

If the Google Images data shocks you, you're not alone. I was blown away by the popularity of image search. Part of me wonders if Halloween could be responsible. We should know more when we re-collect and re-analyze this data for the summer.

Image search wasn't the only surprise, though. Bing and Yahoo! combine for not even 1/10th of Google.com's search volume. DuckDuckGo, despite their tiny footprint compared to Facebook, have almost as many searches as the social media giant. Amazon has almost as many searches as Bing. And YouTube.com's searches are nearly twice the size of Bing's (on web browsers only — remember that Jumpshot won't capture searches in the YouTube app on mobile, tablet, or TV devices).

For the future, I also want to look at data for Google Shopping, MSN, Pinterest, Twitter, LinkedIn, Gmail, Yandex, Baidu, and Reddit. My suspicion is that none of those have as many searches as those above, but I'd love to be surprised.

BTW — if you're questioning this data compared to Comscore or Nielsen, I'd just point out that Jumpshot's panel is vastly larger, and their methodology is much cleaner and more accurate, too (at least, IMO). They don't do things like group site searches on Microsoft-owned properties into Bing's search share or try to statistically sample and merge methodologies, and whereas Comscore has a *global* panel of 2 million, Jumpshot's *US-only* panel of devices is considerably larger.

#16: What's the distribution of search demand across keywords?

Let's go back to looking only at keyword searches on Google. Based on October's searches, the top 1MM queries account for about 25% of all searches, with the top 10MM queries accounting for about 45% and the top 1BB queries accounting for close to 90%. Jumpshot's kindly illustrated this for us:

The long tail is still very long indeed, with a huge amount of search volume taking place in keywords outside the top 10 million most-searched-for queries. In fact, almost 25% of all search volume happens outside the top 100 million keywords!
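Tabulating the cumulative shares reported above makes the tail easy to see. Note that the 75% figure for the top 100MM is implied by the "almost 25% outside" statement rather than reported directly:

```python
# Cumulative share of US Google search volume, October 2016 (Jumpshot)
cumulative_share = {          # top-N most-searched queries -> share of all volume
    1_000_000:     0.25,
    10_000_000:    0.45,
    100_000_000:   0.75,      # implied: ~25% of volume sits beyond the top 100MM
    1_000_000_000: 0.90,
}

for top_n, share in cumulative_share.items():
    print(f"top {top_n:>13,} queries: {share:.0%} of volume, {1 - share:.0%} in the tail")
```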

I illustrated this last summer with data from Russ' analysis based on Clickstre.am data, and it matches up fairly well (though not exactly; Jumpshot's panel is far larger).

#17: How many words does the average desktop vs. mobile searcher use in their queries?

According to Jumpshot, a typical searcher uses about 3 words in their search query. Desktop queries run slightly longer, thanks to a higher share of queries of 6 words or more (16% for desktop vs. 14% for mobile).

I was actually surprised to see how close desktop and mobile are. Clearly, there's not as much separation in query formation as some folks in our space have estimated (myself included).

#18: What percent of queries are phrased as questions?

For this data, Jumpshot used any queries that started with the typical "Who," "What," "Where," "When," "Why," and "How," as well as "Am" (e.g. Am I registered to vote?) and "Is" (e.g. Is it going to rain tomorrow?). The data showed that ~8% of search queries are phrased as questions.

#19: What is the difference in paid vs. organic CTR on mobile compared to desktop?

This is one of those data points I've been longing for over many years. We've always suspected CTR on mobile is lower than on desktop, and now it's confirmed.

For mobile devices, 40.9% of Google searches result in an organic click, 2% in a paid click, and 57.1% in no click at all. For desktop devices, 62.2% of Google searches result in an organic click, 2.8% in a paid click, and 35% in no click. That's a pretty big delta, and one that illustrates how much more opportunity there still is in SEO vs. PPC. SEO has ~20X more traffic opportunity than PPC on both mobile and desktop. If you've been arguing that mobile has killed SEO or that SERP features have killed SEO or, really, that anything at all has killed SEO, you should probably change that tune.
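The "~20X" figure falls straight out of the reported splits; a quick check:

```python
# Jumpshot October 2016: where Google searches end up, by device (percentages)
shares = {
    "mobile":  {"organic": 40.9, "paid": 2.0, "no_click": 57.1},
    "desktop": {"organic": 62.2, "paid": 2.8, "no_click": 35.0},
}

for device, s in shares.items():
    print(f"{device}: organic clicks are {s['organic'] / s['paid']:.1f}x paid clicks")
# mobile comes out around 20x, desktop around 22x
```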

#20: What percent of queries on Google result in the searcher changing their search terms without clicking any results?

You search. You don't find what you're seeking. So, you change your search terms, or maybe you click on one of Google's "Searches related to..." at the bottom of the page.

I've long wondered how often this pattern occurs, and what percent of search queries lead not to an answer, but to another search altogether. The answer is shockingly big: a full 18% of searches lead to a change in the search query!

No wonder Google has made related searches and "people also ask" such a big part of the search results in recent years.

#21: What percent of Google queries lead to more than one click on the results?

Some of us use ctrl+click to open up multiple tabs when searching. Others click one result, then click back and click another. Taken together, all the search behaviors that result in more than one click following a single search query in a session combine for 21% of searches.

#22: What percent of Google queries result in pogo-sticking (i.e. the searcher clicks a result, then bounces back to the search results page and chooses a different result)?

As SEOs, we know pogo-sticking is a bad thing for our sites, and that Google is likely using this data to reward pages that don't get many pogo-stickers and nudge down those that do. Altogether, Jumpshot's October data saw 8% of searches that followed this pattern of search > click > back to search > click a different result.

Over time, if Google succeeds at its mission of satisfying more searchers, we'd expect this number to go down. We'll watch for that the next time we collect results and see what happens.

#23: What percent of clicks on non-Google properties in the search results go to a domain in the top 100?

Many of us in the search and web marketing world have been worried about whether search and SEO are becoming "winner-take-all" markets. Thus, we asked Jumpshot to look at the distribution of clicks to the 100 domains that received the most Google search traffic (excluding Google itself) vs. those outside the top 100.

The results are somewhat relieving: 12.6% of all Google clicks go to the top 100 search-traffic-receiving domains. The other 87.4% are to sites in the chunky middle and long tail of the search-traffic curve.


Phew! That's an immense load of powerful data, and over time, as we measure and report on this with our Jumpshot partners, we're looking forward to sharing trends and additional numbers, too.

If you've got a question about searcher behavior or search/click patterns, please feel free to leave it in the comments. I'll work with Russ and Randy to prioritize those requests and make the data available. It's my goal to have updated numbers to share at this year's MozCon in July.


** The following questions and responses from Jumpshot can illustrate some of the data and methodology's limitations:

Rand: What search sources, if any, might be missed by Jumpshot's methodology?
Jumpshot: We only looked at Google.com, except for the one question that asked specifically about Amazon, YouTube, DuckDuckGo, etc.

Rand: Do you, for example, capture searches performed in all Google apps (maps, search app, Google phone native queries that go to the web, etc)?
Jumpshot: Nothing in-app, but anything that opens a mobile browser — yes.

Rand: Do you capture all voice searches?
Jumpshot: If it triggers a web browser either on desktop or on mobile, then yes.

Rand: Is Google Home included?
Jumpshot: No.

Rand: Are searches on incognito windows included?
Jumpshot: Yes, it should be; since the plug-in is at the device level, we track any URL regardless.

Rand: Would searches in certain types of browsers (desktop or mobile) not get counted?
Jumpshot: From a browser perspective, no. But remember we have no iOS data so any browser being used on that platform will not be recorded.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, March 13, 2017

Google Algorithmic Penalties Still Happen, Post-Penguin 4.0

Posted by MichaelC-15022

When Penguin 4.0 launched in September 2016, the story from Gary Illyes of Google was that Penguin now just devalued spammy links, rather than penalizing a site by adjusting the site's ranking downward, AKA a penalty.

Apparently for Penguin there is now "less need" for a disavow, according to a Facebook discussion between Gary Illyes and Barry Schwartz of Search Engine Land back in September. He suggested that webmasters can help Google find spammy sites by disavowing links they know are bad. He also mentioned that manual actions still happen — and so I think we can safely infer that the disavow file is still useful in manual penalty recovery.

But algorithmic penalties DO still exist. A client of mine, who'd in the past built a lot of really spammy links to one of their sites, had me take a look at their backlinks about 10 days ago and build a disavow file. There was no manual penalty indicated in Search Console, but they didn't rank at all for terms they were targeting — and they had a plenty strong backlink profile even after ignoring the spammy links.

I submitted the disavow file on March 2nd, 2017. Here's the picture of what happened to their traffic:

4 days after the disavow file submission, their traffic went from just a couple hundred visits/day from Google search to nearly 3,000.

Penguin might no longer be handing out penalties, but clearly there are still algorithmic penalties handed out by Google. And clearly, the disavow file still works on these algorithmic penalties.

Perhaps we just need to give them another animal name. (Personally, I like the Okapi... goes along with the black-and-white animal theme, and, like Google algorithmic penalties, hardly anyone knows they still exist.)

Image courtesy Chester Zoo on Flickr.

I look forward to animated comments from other SEOs and webmasters who might have been suspecting the same thing!



Rankings Correlation Study: Domain Authority vs. Branded Search Volume

Posted by Tom.Capper

A little over two weeks ago I had the pleasure of speaking at SearchLove San Diego. My presentation, Does Google Still Need Links, looked at the available evidence on how and to what extent Google is using links as a ranking factor in 2017, including the piece of research that I’m sharing here today.

One of the main points of my presentation was to argue that while links still do represent a useful source of information for Google’s ranking algorithm, Google now has many other sources, most of which they would never have dreamed of back when PageRank was conceived as a proxy for the popularity and authority of websites nearly 20 years ago.

Branded search volume is one such source of information, and one of the sources that is most accessible for us mere mortals, so I decided to take a deeper look at how it compared with a link-based metric. It also gives us some interesting insight into the KPIs we should be pursuing in our off-site marketing efforts — because brand awareness and link building are often conflicting goals.

For clarity, by branded search volume, I mean the monthly regional search volume for the brand of a ranking site. For example, for the page http://ift.tt/2lSmB5z, this would be the US monthly search volume for the term “walmart” (as given by Google Keyword Planner). I’ve written more about how I put together this dataset and dealt with edge cases below.

When picking my link-based metric for comparison, domain authority seemed a natural choice — it’s domain-level, which ought to be fair given that generally that’s the level of precision with which we can measure branded search volume, and it came out top in Moz’s study of domain-level link-based factors.

A note on correlation studies

Before I go any further, here’s a word of warning on correlation studies, including this one: They can easily miss the forest for the trees.

For example, the fact that domain authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following are likely:

  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings

That’s not to say that correlation studies are useless — but we should use them to inform our understanding and prompt further investigation, not as the last word on what is and isn’t a ranking factor.

Methodology

(Or skip straight to the results!)

The Moz study referenced above used the 800 sample keywords provided for each of the 22 top-level categories in Google Keyword Planner, then looked at the top 50 results for each of these. After de-duplication, this results in 16,521 queries. Moz looked at only web results (no images, answer boxes, etc.), ignored queries with fewer than 25 results in total, and, as far as I can tell, used desktop rankings.

I’ve taken a slightly different approach. I reached out to STAT to request a sample of ~5,000 non-branded keywords for the US market. Like Moz, I stripped out non-web results, but unlike Moz, I also stripped out anything with a baserank worse than 10 (baserank being STAT’s way of presenting the ranking of a search result when non-web results are excluded). You can see the STAT export here.

Moz used Mean Spearman correlations, which is a process that involves ranking variables for each keyword, then taking the average correlation across all keywords. I’ve also chosen this method, and I’ll explain why using the below example:

Keyword   | SERP Ranking Position | Ranking Site | Branded Search Volume of Ranking Site | Per Keyword Rank of Branded Search Volume
Keyword A | 1                     | example1.com | 100,000                               | 1
Keyword A | 2                     | example2.com | 10,000                                | 2
Keyword A | 3                     | example3.com | 1,000                                 | 3
Keyword A | 4                     | example4.com | 100                                   | 4
Keyword A | 5                     | example5.com | 10                                    | 5

For Keyword A, we have wildly varying branded search volumes in the top 5 search results. This means that search volume and rankings could never be particularly well-correlated, even though the results are perfectly sorted in order of search volume.

Moz’s approach avoids this problem by comparing the ranking position (the 2nd column in the table) with the column on the far right of the table — how each site ranks for the given variable.

In this case, correlating ranking directly with search volume would yield a correlation of (-)0.75. Correlating with ranked search volume yields a perfect correlation of 1.

This process is then repeated for every keyword in the sample (I counted desktop and mobile versions of the same keyword as two keywords), then the average correlation is taken.
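Here's a minimal sketch of that per-keyword ranked-correlation step, using the illustrative Keyword A numbers. This is pure Python with no tie handling, for brevity; the actual studies presumably used proper statistical tooling:

```python
from statistics import mean

def rank(values):
    """Rank values descending (1 = largest); no tie handling, for brevity."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def pearson(x, y):
    """Plain Pearson correlation; applied to ranks, it gives Spearman."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Keyword A's SERP: positions 1-5 and each ranking site's branded volume
positions = [1, 2, 3, 4, 5]
volumes = [100_000, 10_000, 1_000, 100, 10]

# Correlating position with raw volume: strongly negative, but imperfect
raw = pearson(positions, volumes)

# Spearman: correlate position with the *rank* of volume instead -> perfect 1.0
spearman = pearson(positions, rank(volumes))

# The study repeats this for every keyword, then averages: "Mean Spearman"
```

Even though the results are perfectly sorted by volume, the raw correlation is imperfect because the volumes vary wildly; ranking the variable first removes that distortion.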

Defining branded search volume

Initially, I thought that pulling branded search volume for every site in the sample would be as simple as looking up the search volume for their domain minus its subdomain and TLD (e.g. “walmart” for http://ift.tt/2lSmB5z). However, this proved surprisingly deficient. Take these examples:

  • www.cruise.co.uk
  • ecotalker.wordpress.com
  • www.sf.k12.sd.us

Are the brands for these sites “cruise,” “wordpress,” and “sd,” respectively? Clearly not. To figure out what the branded search term was, I started by taking each potential candidate from the URL, e.g., for ecotalker.wordpress.com:

  • Ecotalker
  • Ecotalker wordpress
  • Wordpress.com
  • Wordpress

I then worked out what the highest search volume term was for which the subdomain in question ranked first — which in this case is a tie between “Ecotalker” and “Ecotalker wordpress,” both of which show up as having zero volume.

I’m leaning fairly heavily on Google’s synonym matching in search volume lookup here to catch any edge-edge-cases — for example, I’m confident that “ecotalker.wordpress” would show up with the same search volume as “ecotalker wordpress.”

You can see the resulting dataset of subdomains with their DA and branded search volume here.

(Once again, I’ve used STAT to pull the search volumes in bulk.)

The results: Brand awareness > links

Here’s the main story: branded search volume is better correlated with rankings than domain authority is.

However, there’s a few other points of interest here. Firstly, neither of these variables has a particularly strong correlation with rankings — a perfect correlation would be 1, and I’m finding a correlation between domain authority and rankings of 0.071, and a correlation between branded search volume and rankings of 0.1. This is very low by the standards of the Moz study, which found a correlation of 0.26 between domain authority and rankings using the same statistical methods.

I think the biggest difference that accounts for this is Moz’s use of 50 web results per query, compared to my use of 10. If true, this would imply that domain authority has much more to do with what it takes to get you onto the front page than it has to do with ranking in the top few results once you’re there.

Another potential difference is in the types of keywords in the two samples. Moz's study has a fairly even breakdown of keywords between the 0–10k, 10k–20k, 20k–50k, and 50k+ buckets:

On the other hand, my keywords were more skewed towards the low end:

However, this doesn’t seem to be the cause of my lower correlation numbers. Take a look at the correlations for rankings for high volume keywords (10k+) only in my dataset:

Although the matchup between the two metrics gets a lot closer here, the overall correlations are still nowhere near as high as Moz’s, leading me to attribute that difference more to their use of 50 ranking positions than to the keywords themselves.

It’s worth noting that my sample size of high volume queries is only 980.

Regression analysis

Another way of looking at the relationship between two variables is to ask how much of the variation in one is explained by the other. For example, the average rank of a page in our sample is 5.5. If we have a specific page that ranks at position 7, and a model that predicts it will rank at 6, we have explained 33% of its variation from the average rank (for that particular page).
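That variance-explained metric (R-squared) can be sketched with toy numbers; the ranks below are invented for illustration, not taken from the study's data:

```python
# R-squared: the share of variation around the mean that a model accounts for
actual    = [7, 4, 2, 9, 5.5]    # observed ranking positions (toy data)
predicted = [6, 5, 3, 8, 5.0]    # some model's predicted positions

mean_rank = sum(actual) / len(actual)                            # 5.5, as in the post
ss_total = sum((a - mean_rank) ** 2 for a in actual)             # total variation
ss_resid = sum((a - p) ** 2 for a, p in zip(actual, predicted))  # left unexplained
r_squared = 1 - ss_resid / ss_total
print(f"R-squared = {r_squared:.2f}")
```

Here the toy model explains about 85% of the variation; the post's real models explain far less, yet (as noted below) remain highly statistically significant thanks to the sample size.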

Using the data above, I constructed a number of models to predict the rankings of pages in my sample, then charted the proportion of variance explained by those models below (you can read more about this metric, normally called the R-squared, here).

Some explanations:

  • Branded Search Volume of the ranking site - as discussed above
  • Log(Branded Search Volume) - Taking the log of the branded search volume for a fairer comparison with domain authority, where, for example, a DA 40 site is much more than twice as well linked to as a DA 20 site.
  • Ranked Branded Search Volume - How this site’s branded search volume compares to that of other sites ranking for the same keyword, as discussed above

Firstly, it’s worth noting that despite the very low R-squareds, all of the variables listed above were highly statistically significant — in the worst case scenario, within a one ten-millionth of a percent of being 100% significant. (In the best case scenario being a vigintillionth of a vigintillionth of a vigintillionth of a nonillionth of a percent away.)

However, the really interesting thing here is that including ranked domain authority and ranked branded search volume in the same model explains barely any more variation than just ranked branded search volume on its own.

To be clear: Nearly all of the variation in rankings that we can explain with reference to domain authority we could just as well explain with reference to branded search volume. On the other hand, the reverse is not true.

If you’d like to look into this data some more, the full set is here.

Nice data. Why should I care?

There are two main takeaways here:

  1. If you care about your domain authority because it’s correlated with rankings, then you should care at least as much about your branded search volume.
  2. The correlation between links and rankings might sometimes be a bit of a red herring — it could be that links are themselves merely correlated with some third factor which better explains rankings.

There are also a bunch of softer takeaways to be had here, particularly around how weak (if highly statistically significant) both sets of correlations were. This places even more emphasis on relevancy and intent, which presumably make up the rest of the picture.

If you’re trying to produce content to build links, or if you find yourself reading a post or watching a presentation around this or any other link building techniques in the near future, there are some interesting questions here to add to those posed by Tomas Vaitulevicius back in November. In particular, if you’re producing content to gain links and brand awareness, it might not be very good at either, so you need to figure out what’s right for you and how to measure it.

I’m not saying in any of this that “links are dead,” or anything of the sort — more that we ought to be a bit more critical about how, why, and when they’re important. In particular, I think that they might be of increasingly little importance on the first page of results for competitive terms, but I’d be interested in your thoughts in the comments below.

I’d also love to see others conduct similar analysis. As with any research, cross-checking and replication studies are an important step in the process.

Either way, I’ll be writing more around this topic in the near future, so watch this space!
