Thursday, March 16, 2017

Your Daily SEO Fix: Link Building & Ranking Zero

Posted by FeliciaCrawford

Last week, we shared a series of four short videos intended to help you accomplish some easy wins using Moz Pro: Your Daily SEO Fix: The Keyword Research Edition. Week Two (that's this week!) is focused on link building, identifying opportunities to take over SERP features, and doing that all-important competitive research.

This time around, we're using a mix of Open Site Explorer, Fresh Web Explorer, and Moz Pro. Open Site Explorer has some free capabilities, so if you'd like to follow along...

Open OSE in a new tab!

If you're a Moz Pro subscriber, crack open your campaigns and settle in. If you'd like to see what all the fuss is about without committing, you can dip your toes in with a free 30-day trial. And now that that's out of the way, let's get started!


Fix #1: Link building & brand building via unlinked mentions

"Moz" is an SEO software company, yes, but it's also Morrissey's nickname and short for "Mozambique." All three of those things get mentioned around the web a bunch on any given day, but if we want to identify link building opportunities specific to our site, things could get confusing quickly. Luckily, Jordan's here to explain how to quickly find unlinked mentions of your site or brand using Open Site Explorer and keep those pesky Smiths references out of your results.


Fix #2: Prioritizing and organizing your link building efforts

Link building requires more than just finding opportunities, of course. April shows how you can prioritize your efforts by identifying the most valuable linking opportunities in Open Site Explorer, then dives into how you can cultivate a continuous stream of fresh related content ripe for a link-back with Fresh Web Explorer.


Fix #3: Ranking in position zero with SERP features in Moz Pro

If you have keywords that aren't ranking in the first few results pages, don't despair — there's hope yet. There are tons of opportunities to rank above the first organic result with the prevalence of SERP features. In this video, Ellie shows how you can identify keywords that need some love, track SERP feature opportunities for them, filter your keywords to show only those that surface certain SERP features, and more.


Fix #4: Gleaning insights from your competitors' backlink profiles

Remember April from Fix #2? She's back and ready to show you how to get the skinny on your competitors' juicy backlink profiles using both your Moz Pro campaign and Open Site Explorer.


One step beyond

That wraps up our latest week of fixes! We've got one last round coming at you next Thursday. As always, if you're curious and want to follow along, you can try it all out firsthand by taking a free trial of Moz Pro. We also offer several SEO bootcamp courses that can get you started on fundamentals if this whole SEO thing is pretty new to you.

If you're looking for some more meaty info on these topics, I've put together a short list of light reading for you:

Thanks for reading along, friends, and we'll see you again for the last installment of the Daily SEO Fix series next week!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, March 14, 2017

The State of Searcher Behavior Revealed Through 23 Remarkable Statistics

Posted by randfish

One of the marketing world's greatest frustrations has long been the lack of data from Google and other search engines about the behavior of users on their platforms. Occasionally, Google will divulge a nugget of bland, hard-to-interpret information about how they process more than X billion queries, or how many videos were uploaded to YouTube, or how many people have found travel information on Google in the last year. But these numbers aren't specific enough or well-sourced enough, and they don't provide enough detail to be truly useful for all the applications we have.

Marketers need to know things like: How many searches happen each month across various platforms? Is Google losing market share to Amazon? Are people really starting more searches on YouTube than Bing? Is Google Images more or less popular than Google News? What percent of queries are phrased as questions? How many words are in the average query? Is it more or less on mobile?

These kinds of specifics help us know where to put our efforts, how to sell our managers, teams, and clients on SEO investments, and, when we have this data over time, we can truly understand how this industry that shapes our livelihoods is changing. Until now, this data has been somewhere between hard and impossible to estimate. But, thanks to clickstream data providers like Jumpshot (which helps power Moz's Keyword Explorer and many of our keyword-based metrics in Pro), we can get around Google's secrecy and see the data for ourselves!

Over the last 6 months, Russ Jones and I have been working with Jumpshot's Randy Antin, who's been absolutely amazing — answering our questions late at night, digging in with his team to get the numbers, and patiently waiting while Russ runs fancy T-Distributions on large datasets to make sure our estimates are as accurate as possible. If you need clickstream data of any kind, I can't recommend them enough.

If you're wondering, "Wait... I think I know what clickstream data is, but you should probably tell me, Rand, just so I know that you know," OK. :-) Clickstream monitoring means Jumpshot (and other companies like them — SimilarWeb, Clickstre.am, etc.) have software on the device that records all the pages visited in a browser session. They anonymize and aggregate this data (don't worry, your searches and visits are not tied to you or to your device), then make parts of it available for research or use in products or through APIs. They're not crawling Google or any other sites, but rather seeing the precise behavior of devices as people use them to surf or search the Internet.

Clickstream data is awesomely powerful, but when it comes to estimating searcher behavior, we need scale. Thankfully, Jumpshot can deliver here, too. Their US panel of Internet users is in the millions (they don't disclose the exact size, but it's between 2 and 10 million) so we can trust these numbers to reliably paint a representative picture. That said, there may still be biases in the data — it could be that certain demographics of Internet users are more or less likely to be in Jumpshot's panel, their mobile data is limited to Android (no iOS), and we know that some alternative kinds of searches aren't captured by their methodology**. Still, there's amazing stuff here, and it's vastly more than we've been able to get any other way, so let's dive in.

23 Search Behavior Stats

Methodology: All of the data was collected from Jumpshot's multi-million user panel in October 2016. T-distribution scaling was applied to validate the estimates of overall searches across platforms. All other data is expressed as percentages. Jumpshot's panel includes mobile and desktop devices in similar proportions, though no devices are iOS, so users on Macs, iPhones, and iPads are not included.

#1: How many searches are *really* performed on Google.com each month?

On the devices and types of queries Jumpshot can analyze, there were an average of 3.4 searches/day/searcher. Using the T-Distribution scaling analysis on various sample set sizes of Jumpshot's data, Russ estimated that the most likely reality is that between 40–60 billion searches happen on Google.com in the US each month.

Here's more detail from Russ himself:

"...All of the graphs are non-linear in shape, which indicates that as the samples get bigger we are approaching correct numbers but not in a simple % relationship... I have given 3 variations based on the estimated number of searches you think happen in the US annually. I have seen wildly different estimates from 20 billion to 100 billion, so I gave a couple of options. My gut is to go with the 40 billion numbers, especially since once we reach the 100MM line for 40 and 60B, there is little to no increase for 1 billion keywords, which would indicate we have reached a point where each new keyword is searched just 1 time."

How does that compare to numbers Google's given? Well, in May of 2016, Google told Search Engine Land they "processed at least 2 trillion searches per year." Using our Jumpshot-based estimates, and assuming October of 2016 was a reasonably average month for search demand, we'd get to 480–720 billion annual searches. That's less than half of what Google claims, but Google's number is WORLDWIDE! Jumpshot's data here is only for the US. This suggests that, as Danny Sullivan pointed out in the SELand article, Google could well be handling much, much more than 2 trillion annual searches.
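The arithmetic behind that annual estimate is simple enough to sanity-check; here's a quick sketch (figures are the estimates quoted above, with October assumed to be an average month):

```python
# Back-of-the-envelope check of the annualized estimate, using the
# Jumpshot-based figures quoted above.
monthly_low, monthly_high = 40e9, 60e9      # US searches/month on Google.com

annual_low = monthly_low * 12               # 480 billion searches/year
annual_high = monthly_high * 12             # 720 billion searches/year

# Implied US share of Google's stated "at least 2 trillion" worldwide searches
google_worldwide = 2e12
share_low = annual_low / google_worldwide   # 24%
share_high = annual_high / google_worldwide # 36%
```

Either way you slice it, the US would account for only a quarter to a third of Google's stated worldwide total, which is consistent with the point that Google's real figure is likely well above 2 trillion.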

Note that we believe our 40–60 billion/month number is actually too low. Why? Voice searches, searches in the Google app and Google Home, higher search use on iOS (all four of which Jumpshot can't measure), October could be a lower-than-average month, some kinds of search partnerships, and automated searches that aren't coming from human beings on their devices could all mean our numbers are undercounting Google's actual US search traffic. In the future, we'll be able to measure interesting things like growth or shrinkage of search demand as we compare October 2016 vs other months.

#2: How long is the average Google search session?

From the time of the initial query to the loading of the search results page and the selection of any results, plus any back button clicks to those SERPs and selection of new results, the all-in average was just under 1 minute. If that seems long, remember that some search sessions may be upwards of an hour (like when I research all the best ryokans in Japan before planning a trip — I probably clicked 7 pages deep into the SERPs and opened 30 or more individual pages). Those long sessions are dragging up that average.

#3: What percent of users perform one or more searches on a given day?

This one blew my mind! Of the millions of active, US web users Jumpshot monitored in October 2016, only 15% performed at least one search in a given day. 45% performed at least one query in a week, and 68% performed one or more queries that month. To me, that says there's still a massive amount of search growth opportunity for Google. If they can make people more addicted to and more reliant on search, as well as shape the flow of information and the needs of people toward search engines, they are likely to have a lot more room to expand searches/searcher.

#4: What percent of Google searches result in a click?

Google is answering a lot of queries themselves. From searches like "Seattle Weather," to more complicated ones like "books by Kurt Vonnegut" or "how to remove raspberry stains?", Google is trying to save you that click — and it looks like they're succeeding.

66% of distinct search queries resulted in one or more clicks on Google's results. That means 34% of searches get no clicks at all. If we look at all search queries (not just distinct ones), those numbers shift to a straight 60%/40% split. I wouldn't be surprised to find that over time, we get closer and closer to Google solving half of search queries without a click. BTW — this is the all-in average, but I've broken down clicks vs. no-clicks on mobile vs. desktop in #19 below.

#5: What percent of clicks on Google search results go to AdWords/paid listings?

It's less than I thought, but perhaps not surprising given how aggressive Google's had to be with ad subtlety over the last few years. Of distinct search queries in Google, only 3.4% resulted in a click on an AdWords (paid) ad. If we expand that to all search queries, the number drops to 2.6%. Google's making a massive amount of money on a small fraction of the searches that come into their engine. No wonder they need to get creative (or, perhaps more accurately, sneaky) with hiding the ad indicator in the SERPs.

#6: What percent of clicks on Google search results go to Maps/local listings?

This is not measuring searches and clicks that start directly from maps.google.com or from the Google Maps app on a mobile device. We're talking here only about Google.com searches that result in a click on Google Maps. That number is 0.9% of Google search clicks, just under 1 in 100. We know from MozCast that local packs show up in ~15% of queries (though that may be biased by MozCast's keyword corpus).

#7: What percent of clicks on Google search results go to links in the Knowledge Graph?

Knowledge panels are hugely popular in Google's results — they show up in ~38% of MozCast's dataset. But they're not nearly as popular for search click activity, earning only ~0.5% of clicks.

I'm not totally surprised by that. Knowledge panels are, IMO, more about providing quick answers and details to searchers than they are about drawing the click themselves. If you see Knowledge Panels in your SERPs, don't panic too much that they're taking away your CTR opportunity. This made me realize that Keyword Explorer is probably overestimating the degree to which Knowledge Panels remove organic CTR (e.g. Alice Springs, which has only a Knowledge Panel next to 10 blue links, has a CTR opportunity of 64).

#8: What percent of clicks on Google search results go to image blocks?

Images are one of the big shockers of this report overall (more on that later). While MozCast has image blocks in ~11% of Google results, Jumpshot's data shows images earn 3% of all Google search clicks.

I think this happens because people are naturally drawn to images and because Google uses click data to specifically show images that earn the most engagement. If you're wondering why your perfectly optimized image isn't ranking as well in Google Images as you hoped, we've got strong suspicions and some case studies suggesting it might be because your visual doesn't draw the eye and the click the way others do.

If Google only shows compelling images and only shows the image block in search results when they know there's high demand for images (i.e. people search the web, then click the "image" tab at the top), then little wonder images earn strong clicks in Google's results.

#9: What percent of clicks on Google search results go to News/Top Stories results?

Gah! We don't know for now. This one was frustrating and couldn't be gathered due to Google's untimely switch from "News Results" to "Top Stories," some of which happened during the data collection period. We hope to have this in the summer, when we'll be collecting and comparing results again.

#10: What percent of clicks on Google search results go to Twitter block results?

I was expecting this one to be relatively small, and it is, though it slightly exceeded my expectations. MozCast has tweet blocks showing in ~7% of SERPs, and Jumpshot shows those tweets earning ~0.23% of all clicks.

My guess is that the tweets do very well for a small set of search queries, and tend to be shown less (or shown lower in the results) over time if they don't draw the click. As an example, search results for my name show the tweet block between organic position #1 and #2 (either my tweets are exciting or the rest of my results aren't). Compare that to David Mihm, who tweeted very seldomly for a long while and has only recently been more active — his tweets sit between positions #4 and #5. Or contrast with Dr. Pete, whose tweets are above the #1 spot!

#11: What percent of clicks on Google search results go to YouTube?

Technically, there are rare occasions when a video from another provider (usually Vimeo) can appear in Google's SERPs directly. But more than 99% of videos in Google come from YouTube (which violates anti-competitive laws IMO, but since Google pays off so many elected representatives, it's likely not an issue for them). Thus, we chose to study only YouTube rather than all video results.

MozCast shows videos in 6.3% of results, just below tweets. In Jumpshot's data, YouTube's engagement massively over-performed its raw visibility, drawing 1.8% of all search clicks. Clearly, for those searches with video intent behind them, YouTube is delivering well.

#12: What percent of clicks on Google search results go to personalized Gmail/Google Mail results?

I had no guess at all on this one, and it's rarely discussed in the SEO world because it's relatively difficult to influence and fairly obscure. We don't have tracking data via MozCast because these only show in personalized results for folks logged in to their Gmail accounts when searching, and Google chooses to only show them for certain kinds of queries.

Jumpshot, however, thanks to clickstream tracking, can see that 0.16% of search clicks go to Gmail or Google Mail following a query, only a little under the number of clicks to tweets.

#13: What percent of clicks on Google search results go to Google Shopping results?

The Google Shopping ads have become pretty compelling — the visuals are solid, the advertisers are clearly spending lots of effort on CTR optimization, and the results, not surprisingly, reflect this.

MozCast has Shopping results in 9% of queries, while clickstream data shows those results earning 0.55% of all search clicks.

#14: What percent of Google searches result in a click on a Google property?

Google has earned a reputation over the last few years of taking an immense amount of search traffic for themselves — from YouTube to Google Maps to Gmail to Google Books and the Google App Store on mobile, and even Google+, there's a strong case to be made that Google's eating into opportunity for 3rd parties with bets of their own that don't have to play by the rules.

Honestly, I'd have estimated this in the 20–30 percent range, so it surprised me to see that, from Jumpshot's data, all Google properties earned only 11.8% of clicks from distinct searches (only 8.4% across all searches). That's still significant, of course, and certainly bigger than it was 5 years ago, but given that we know Google's search volume has more than doubled in the last 5 years, we have to be intellectually honest and say that there's vastly more opportunity in the crowded-with-Google's-own-properties results today than there was in the cleaner-but-lower-demand SERPs of 5 years ago.

#15: What percent of all searches happen on any major search property in the US?

I asked Jumpshot to compare 10 distinct web properties, add together all the searches they receive combined, and share the percent distribution. The results are FASCINATING!

Here they are in order:

  1. Google.com 59.30%
  2. Google Images 26.79%
  3. YouTube.com 3.71%
  4. Yahoo! 2.47%
  5. Bing 2.25%
  6. Google Maps 2.09%
  7. Amazon.com 1.85%
  8. Facebook.com 0.69%
  9. DuckDuckGo 0.56%
  10. Google News 0.28%

I've also created a pie chart to help illustrate the breakdown:

Distribution of US Searches October 2016

If the Google Images data shocks you, you're not alone. I was blown away by the popularity of image search. Part of me wonders if Halloween could be responsible. We should know more when we re-collect and re-analyze this data for the summer.

Images wasn't the only surprise, though. Bing and Yahoo! combine for not even 1/10th of Google.com's search volume. DuckDuckGo, despite its tiny footprint compared to Facebook, has almost as many searches as the social media giant. Amazon has almost as many searches as Bing. And YouTube.com's searches are nearly twice the size of Bing's (on web browsers only — remember that Jumpshot won't capture searches in the YouTube app on mobile, tablet, or TV devices).

For the future, I also want to look at data for Google Shopping, MSN, Pinterest, Twitter, LinkedIn, Gmail, Yandex, Baidu, and Reddit. My suspicion is that none of those have as many searches as those above, but I'd love to be surprised.

BTW — if you're questioning this data compared to Comscore or Nielsen, I'd just point out that Jumpshot's panel is vastly larger, and their methodology is much cleaner and more accurate, too (at least, IMO). They don't do things like group site searches on Microsoft-owned properties into Bing's search share or try to statistically sample and merge methodologies, and whereas Comscore has a *global* panel of 2 million, Jumpshot's *US-only* panel of devices is considerably larger.

#16: What's the distribution of search demand across keywords?

Let's go back to looking only at keyword searches on Google. Based on October's searches, the top 1MM queries account for about 25% of all searches, the top 10MM queries for about 45%, and the top 1B queries for close to 90%. Jumpshot's kindly illustrated this for us:

The long tail is still very long indeed, with a huge amount of search volume taking place in keywords outside the top 10 million most-searched-for queries. In fact, almost 25% of all search volume happens outside the top 100 million keywords!

I illustrated this last summer with data from Russ' analysis based on Clickstre.am data, and it matches up fairly well (though not exactly; Jumpshot's panel is far larger).
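The breakpoints above can be written as a small cumulative table (approximate figures from the text; the top-100MM share is implied by the "almost 25% outside" remark rather than directly reported):

```python
# Cumulative share of US Google search volume by query popularity
# (approximate figures from the October 2016 Jumpshot data above).
cumulative_share = {
    1_000_000: 0.25,      # top 1MM queries: ~25% of all searches
    10_000_000: 0.45,     # top 10MM queries: ~45%
    100_000_000: 0.75,    # implied: "almost 25%" happens outside the top 100MM
    1_000_000_000: 0.90,  # top 1B queries: ~90%
}

# Share of volume in the long tail beyond the top 10MM keywords
long_tail_share = 1 - cumulative_share[10_000_000]  # ~55%
```

Over half of all search volume sitting beyond the 10-millionth most popular keyword is the long tail in a nutshell.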

#17: How many words does the average desktop vs. mobile searcher use in their queries?

According to Jumpshot, a typical searcher uses about 3 words in their search query. Desktop queries are slightly longer on average, because desktop has a somewhat higher share of queries with 6 or more words (16% on desktop vs. 14% on mobile).

I was actually surprised to see how close desktop and mobile are. Clearly, there's not as much separation in query formation as some folks in our space have estimated (myself included).

#18: What percent of queries are phrased as questions?

For this data, Jumpshot used any queries that started with the typical "Who," "What," "Where," "When," "Why," and "How," as well as "Am" (e.g. Am I registered to vote?) and "Is" (e.g. Is it going to rain tomorrow?). The data showed that ~8% of search queries are phrased as questions.
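That prefix rule is easy to sketch in a few lines of Python (a hypothetical reimplementation of the method as described, not Jumpshot's actual code):

```python
# Question words from the methodology described above.
QUESTION_STARTS = {"who", "what", "where", "when", "why", "how", "am", "is"}

def is_question(query: str) -> bool:
    """A query counts as a question if its first word is a question word."""
    words = query.strip().lower().split()
    return bool(words) and words[0] in QUESTION_STARTS

# Toy illustration on a handful of queries:
queries = [
    "am i registered to vote",
    "seattle weather",
    "how to remove raspberry stains",
]
question_share = sum(is_question(q) for q in queries) / len(queries)  # 2 of 3
```

Applied across Jumpshot's full October panel, this kind of rule is what yields the ~8% figure.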

#19: What is the difference in paid vs. organic CTR on mobile compared to desktop?

This is one of those data points I've been longing for over many years. We've always suspected CTR on mobile is lower than on desktop, and now it's confirmed.

For mobile devices, 40.9% of Google searches result in an organic click, 2% in a paid click, and 57.1% in no click at all. For desktop devices, 62.2% of Google searches result in an organic click, 2.8% in a paid click, and 35% in no click. That's a pretty big delta, and one that illustrates how much more opportunity there still is in SEO vs. PPC. SEO has ~20X more traffic opportunity than PPC on both mobile and desktop. If you've been arguing that mobile has killed SEO or that SERP features have killed SEO or, really, that anything at all has killed SEO, you should probably change that tune.
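Those splits make the ~20X claim easy to verify (numbers are the percentages quoted above):

```python
# Click outcome per Google search, as percentages of all searches.
mobile = {"organic": 40.9, "paid": 2.0, "none": 57.1}
desktop = {"organic": 62.2, "paid": 2.8, "none": 35.0}

# Organic vs. paid traffic opportunity
mobile_ratio = mobile["organic"] / mobile["paid"]      # ~20.5x
desktop_ratio = desktop["organic"] / desktop["paid"]   # ~22.2x
```

On both device types, organic results draw roughly 20 times the clicks that paid results do.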

#20: What percent of queries on Google result in the searcher changing their search terms without clicking any results?

You search. You don't find what you're seeking. So, you change your search terms, or maybe you click on one of Google's "Searches related to..." at the bottom of the page.

I've long wondered how often this pattern occurs, and what percent of search queries lead not to an answer, but to another search altogether. The answer is shockingly big: a full 18% of searches lead to a change in the search query!

No wonder Google has made related searches and "people also ask" such a big part of the search results in recent years.

#21: What percent of Google queries lead to more than one click on the results?

Some of us use ctrl+click to open up multiple tabs when searching. Others click one result, then click back and click another. Taken together, all the search behaviors that result in more than one click following a single search query in a session combine for 21%. That's 21% of searches that lead to more than one click on Google's results.

#22: What percent of Google queries result in pogo-sticking (i.e. the searcher clicks a result, then bounces back to the search results page and chooses a different result)?

As SEOs, we know pogo-sticking is a bad thing for our sites, and that Google is likely using this data to reward pages that don't get many pogo-stickers and nudge down those who do. Altogether, Jumpshot's October data saw 8% of searches that followed this pattern of search > click > back to search > click a different result.

Over time, if Google's successful at their mission of successfully satisfying more searchers, we'd expect this to go down. We'll watch that the next time we collect results and see what happens.

#23: What percent of clicks on non-Google properties in the search results go to a domain in the top 100?

Many of us in the search and web marketing world have been worried about whether search and SEO are becoming "winner-take-all" markets. Thus, we asked Jumpshot to look at the distribution of clicks to the 100 domains that received the most Google search traffic (excluding Google itself) vs. those outside the top 100.

The results are somewhat relieving: 12.6% of all Google clicks go to the top 100 search-traffic-receiving domains. The other 87.4% are to sites in the chunky middle and long tail of the search-traffic curve.


Phew! That's an immense load of powerful data, and over time, as we measure and report on this with our Jumpshot partners, we're looking forward to sharing trends and additional numbers, too.

If you've got a question about searcher behavior or search/click patterns, please feel free to leave it in the comments. I'll work with Russ and Randy to prioritize those requests and make the data available. It's my goal to have updated numbers to share at this year's MozCon in July.


** The following questions and responses from Jumpshot can illustrate some of the data and methodology's limitations:

Rand: What search sources, if any, might be missed by Jumpshot's methodology?
Jumpshot: We only looked at Google.com, except for the one question that asked specifically about Amazon, YouTube, DuckDuckGo, etc.

Rand: Do you, for example, capture searches performed in all Google apps (maps, search app, Google phone native queries that go to the web, etc)?
Jumpshot: Nothing in-app, but anything that opens a mobile browser — yes.

Rand: Do you capture all voice searches?
Jumpshot: If it triggers a web browser either on desktop or on mobile, then yes.

Rand: Is Google Home included?
Jumpshot: No.

Rand: Are searches on incognito windows included?
Jumpshot: Yes, it should be; since the plug-in is at the device level, we track any URL regardless.

Rand: Would searches in certain types of browsers (desktop or mobile) not get counted?
Jumpshot: From a browser perspective, no. But remember we have no iOS data so any browser being used on that platform will not be recorded.



Monday, March 13, 2017

Google Algorithmic Penalties Still Happen, Post-Penguin 4.0

Posted by MichaelC-15022

When Penguin 4.0 launched in September 2016, the story from Gary Illyes of Google was that Penguin now just devalued spammy links, rather than penalizing a site by adjusting the site's ranking downward, AKA a penalty.

Apparently for Penguin there is now "less need" for a disavow, according to a Facebook discussion between Gary Illyes and Barry Schwartz of Search Engine Land back in September. He suggested that webmasters can help Google find spammy sites by disavowing links they know are bad. He also mentioned that manual actions still happen — and so I think we can safely infer that the disavow file is still useful in manual penalty recovery.

But algorithmic penalties DO still exist. A client of mine, who'd in the past built a lot of really spammy links to one of their sites, had me take a look at their backlinks about 10 days ago and build a disavow file. There was no manual penalty indicated in Search Console, but they didn't rank at all for terms they were targeting — and they had a plenty strong backlink profile even after ignoring the spammy links.

I submitted the disavow file on March 2nd, 2017. Here's the picture of what happened to their traffic:

4 days after the disavow file submission, their traffic went from just a couple hundred visits/day from Google search to nearly 3,000.

Penguin might no longer be handing out penalties, but clearly there are still algorithmic penalties handed out by Google. And clearly, the disavow file still works on these algorithmic penalties.

Perhaps we just need to give them another animal name. (Personally, I like the Okapi... goes along with the black-and-white animal theme, and, like Google algorithmic penalties, hardly anyone knows they still exist.)

Image courtesy Chester Zoo on Flickr.

I look forward to animated comments from other SEOs and webmasters who might have been suspecting the same thing!



Rankings Correlation Study: Domain Authority vs. Branded Search Volume

Posted by Tom.Capper

A little over two weeks ago I had the pleasure of speaking at SearchLove San Diego. My presentation, Does Google Still Need Links, looked at the available evidence on how and to what extent Google is using links as a ranking factor in 2017, including the piece of research that I’m sharing here today.

One of the main points of my presentation was to argue that while links still do represent a useful source of information for Google’s ranking algorithm, Google now has many other sources, most of which they would never have dreamed of back when PageRank was conceived as a proxy for the popularity and authority of websites nearly 20 years ago.

Branded search volume is one such source of information, and one of the sources that is most accessible for us mere mortals, so I decided to take a deeper look at how it compares with a link-based metric. It also gives us some interesting insight into the KPIs we should be pursuing in our off-site marketing efforts — because brand awareness and link building are often conflicting goals.

For clarity, by branded search volume, I mean the monthly regional search volume for the brand of a ranking site. For example, for the page http://ift.tt/2lSmB5z, this would be the US monthly search volume for the term “walmart” (as given by Google Keyword Planner). I’ve written more about how I put together this dataset and dealt with edge cases below.

When picking my link-based metric for comparison, domain authority seemed a natural choice — it’s domain-level, which ought to be fair given that generally that’s the level of precision with which we can measure branded search volume, and it came out top in Moz’s study of domain-level link-based factors.

A note on correlation studies

Before I go any further, here’s a word of warning on correlation studies, including this one: They can easily miss the forest for the trees.

For example, the fact that domain authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following is likely:

  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings

That’s not to say that correlation studies are useless — but we should use them to inform our understanding and prompt further investigation, not as the last word on what is and isn’t a ranking factor.

Methodology

(Or skip straight to the results!)

The Moz study referenced above used the 800 sample keywords provided in each of the 22 top-level categories in Google Keyword Planner, then looked at the top 50 results for each of these. After de-duplication, this results in 16,521 queries. Moz looked at only web results (no images, answer boxes, etc.), ignored queries with fewer than 25 results in total, and, as far as I can tell, used desktop rankings.

I’ve taken a slightly different approach. I reached out to STAT to request a sample of ~5,000 non-branded keywords for the US market. Like Moz, I stripped out non-web results, but unlike Moz, I also stripped out anything with a baserank worse than 10 (baserank being STAT’s way of presenting the ranking of a search result when non-web results are excluded). You can see the STAT export here.

Moz used mean Spearman correlations: for each keyword, the variables are converted to ranks and correlated, and the resulting correlations are then averaged across all keywords. I’ve chosen the same method, and I’ll explain why using the example below:

| Keyword | SERP Ranking Position | Ranking Site | Branded Search Volume of Ranking Site | Per Keyword Rank of Branded Search Volume |
| --- | --- | --- | --- | --- |
| Keyword A | 1 | example1.com | 100,000 | 1 |
| Keyword A | 2 | example2.com | 10,000 | 2 |
| Keyword A | 3 | example3.com | 1,000 | 3 |
| Keyword A | 4 | example4.com | 100 | 4 |
| Keyword A | 5 | example5.com | 10 | 5 |

For Keyword A, we have wildly varying branded search volumes in the top 5 search results. This means that search volume and rankings could never be particularly well-correlated, even though the results are perfectly sorted in order of search volume.

Moz’s approach avoids this problem by comparing the ranking position (the 2nd column in the table) with the column on the far right of the table — how each site ranks for the given variable.

In this case, correlating ranking directly with search volume would yield a correlation of (-)0.75. Correlating with ranked search volume yields a perfect correlation of 1.

This process is then repeated for every keyword in the sample (I counted desktop and mobile versions of the same keyword as two keywords), then the average correlation is taken.
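The per-keyword procedure can be sketched in a few lines of Python. This is an illustrative reimplementation rather than the exact scripts used for the study; the toy values are the Keyword A numbers from the table above.

```python
def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def to_ranks(values):
    """Rank values descending (rank 1 = largest); assumes no ties."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

def spearman(positions, values):
    """Spearman correlation = Pearson correlation of the ranked values."""
    return pearson(positions, to_ranks(values))

# Keyword A from the table above
positions = [1, 2, 3, 4, 5]
volumes = [100_000, 10_000, 1_000, 100, 10]

print(round(pearson(positions, volumes), 2))   # raw correlation: about -0.76
print(round(spearman(positions, volumes), 2))  # rank correlation: a perfect 1.0

# the study's statistic: the per-keyword Spearman correlation,
# averaged across every keyword in the sample
per_keyword = {"keyword a": (positions, volumes)}  # ...one entry per keyword
mean_spearman = (sum(spearman(p, v) for p, v in per_keyword.values())
                 / len(per_keyword))
```

Note how the rank transform rescues the correlation: the raw volumes are wildly skewed, but their ordering agrees perfectly with the SERP ordering.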

Defining branded search volume

Initially, I thought that pulling branded search volume for every site in the sample would be as simple as looking up the search volume for their domain minus its subdomain and TLD (e.g. “walmart” for http://ift.tt/2lSmB5z). However, this proved surprisingly deficient. Take these examples:

  • www.cruise.co.uk
  • ecotalker.wordpress.com
  • www.sf.k12.sd.us

Are the brands for these sites “cruise,” “wordpress,” and “sd,” respectively? Clearly not. To figure out what the branded search term was, I started by taking each potential candidate from the URL, e.g., for ecotalker.wordpress.com:

  • Ecotalker
  • Ecotalker wordpress
  • Wordpress.com
  • Wordpress

I then worked out what the highest search volume term was for which the subdomain in question ranked first — which in this case is a tie between “Ecotalker” and “Ecotalker wordpress,” both of which show up as having zero volume.

I’m leaning fairly heavily on Google’s synonym matching in search volume lookup here to catch any remaining edge cases — for example, I’m confident that “ecotalker.wordpress” would show up with the same search volume as “ecotalker wordpress.”
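A sketch of the candidate-generation step above. The function name `brand_candidates` and its `tld_labels` parameter are my own inventions for illustration, and the TLD handling is deliberately simplified: a production version would consult the Public Suffix List to deal with suffixes like .co.uk and .k12.sd.us automatically.

```python
def brand_candidates(hostname, tld_labels=1):
    """Generate candidate brand search terms for a (sub)domain.

    tld_labels is how many trailing labels make up the TLD
    (1 for ".com", 2 for ".co.uk"); a real implementation would
    look this up in the Public Suffix List instead of asking.
    """
    labels = hostname.lower().split(".")
    if labels and labels[0] == "www":
        labels = labels[1:]                 # "www" is never the brand
    core = labels[:-tld_labels]             # e.g. ["ecotalker", "wordpress"]
    tld = ".".join(labels[-tld_labels:])
    if len(core) >= 2:
        sub, dom = core[0], core[-1]
        return [sub, f"{sub} {dom}", f"{dom}.{tld}", dom]
    return core                             # bare domain: the only candidate

print(brand_candidates("ecotalker.wordpress.com"))
# ['ecotalker', 'ecotalker wordpress', 'wordpress.com', 'wordpress']
```

Each candidate would then be checked against search volume data, keeping the highest-volume term for which the subdomain in question ranks first, as described above.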

You can see the resulting dataset of subdomains with their DA and branded search volume here.

(Once again, I’ve used STAT to pull the search volumes in bulk.)

The results: Brand awareness > links

Here’s the main story: branded search volume is better correlated with rankings than domain authority is.

However, there are a few other points of interest here. Firstly, neither of these variables has a particularly strong correlation with rankings — a perfect correlation would be 1, and I’m finding a correlation between domain authority and rankings of 0.071, and a correlation between branded search volume and rankings of 0.1. This is very low by the standards of the Moz study, which found a correlation of 0.26 between domain authority and rankings using the same statistical methods.

I think the biggest difference that accounts for this is Moz’s use of 50 web results per query, compared to my use of 10. If true, this would imply that domain authority has much more to do with what it takes to get you onto the front page than it has to do with ranking in the top few results once you’re there.

Another potential difference is in the types of keywords in the two samples. Moz’s study has a fairly even breakdown of keywords between the 0–10k, 10k–20k, 20k–50k, and 50k+ buckets:

On the other hand, my keywords were more skewed towards the low end:

However, this doesn’t seem to be the cause of my lower correlation numbers. Take a look at the correlations for rankings for high volume keywords (10k+) only in my dataset:

Although the matchup between the two metrics gets a lot closer here, the overall correlations are still nowhere near as high as Moz’s, leading me to attribute that difference more to their use of 50 ranking positions than to the keywords themselves.

It’s worth noting that my sample size of high volume queries is only 980.

Regression analysis

Another way of looking at the relationship between two variables is to ask how much of the variation in one is explained by the other. For example, the average rank of a page in our sample is 5.5. If we have a specific page that ranks at position 7, and a model that predicts it will rank at 6, we have explained 33% of its variation from the average rank (for that particular page).

Using the data above, I constructed a number of models to predict the rankings of pages in my sample, then charted the proportion of variance explained by those models below (you can read more about this metric, normally called the R-squared, here).

Some explanations:

  • Branded Search Volume of the ranking site - as discussed above
  • Log(Branded Search Volume) - Taking the log of the branded search volume for a fairer comparison with domain authority, where, for example, a DA 40 site is much more than twice as well linked to as a DA 20 site.
  • Ranked Branded Search Volume - How this site’s branded search volume compares to that of other sites ranking for the same keyword, as discussed above
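As an illustration of the metric itself (not a reproduction of the models above), here is how R-squared falls out of a simple one-variable least-squares fit of rank against log branded search volume. The five data points are made up for the example.

```python
import math

def ols_fit(x, y):
    """Ordinary least squares for y ~ b0 + b1 * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((a - mx) * (b - my) for a, b in zip(x, y))
          / sum((a - mx) ** 2 for a in x))
    return my - b1 * mx, b1

def r_squared(actual, predicted):
    """Proportion of variance in `actual` explained by the predictions."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# hypothetical branded search volumes for the sites at positions 1-5
volumes = [120_000, 8_000, 15_000, 300, 40]
ranks = [1, 2, 3, 4, 5]

# log-transform, as discussed above, for a fairer comparison with DA
x = [math.log10(v) for v in volumes]
b0, b1 = ols_fit(x, ranks)
predictions = [b0 + b1 * xi for xi in x]
print(round(r_squared(ranks, predictions), 2))  # ~0.91 for this toy data
```

The real-world R-squareds reported below are far lower than this toy value, of course — five hand-picked points are much easier to explain than thousands of SERPs.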

Firstly, it’s worth noting that despite the very low R-squareds, all of the variables listed above were highly statistically significant — in the worst case, within one ten-millionth of a percent of 100% significance (and in the best case, a vigintillionth of a vigintillionth of a vigintillionth of a nonillionth of a percent away).

However, the really interesting thing here is that including ranked domain authority and ranked branded search volume in the same model explains barely any more variation than just ranked branded search volume on its own.

To be clear: Nearly all of the variation in rankings that we can explain with reference to domain authority we could just as well explain with reference to branded search volume. On the other hand, the reverse is not true.

If you’d like to look into this data some more, the full set is here.

Nice data. Why should I care?

There are two main takeaways here:

  1. If you care about your domain authority because it’s correlated with rankings, then you should care at least as much about your branded search volume.
  2. The correlation between links and rankings might sometimes be a bit of a red herring — it could be that links are themselves merely correlated with some third factor which better explains rankings.

There are also a bunch of softer takeaways to be had here, particularly around how weak (if highly statistically significant) both sets of correlations were. This places even more emphasis on relevancy and intent, which presumably make up the rest of the picture.

If you’re trying to produce content to build links, or if you find yourself reading a post or watching a presentation on this or any other link building technique in the near future, there are some interesting questions here to add to those posed by Tomas Vaitulevicius back in November. In particular, if you’re producing content to gain both links and brand awareness, it might not be very good at either, so you need to figure out what’s right for you and how to measure it.

I’m not saying in any of this that “links are dead,” or anything of the sort — more that we ought to be a bit more critical about how, why, and when they’re important. In particular, I think that they might be of increasingly little importance on the first page of results for competitive terms, but I’d be interested in your thoughts in the comments below.

I’d also love to see others conduct similar analysis. As with any research, cross-checking and replication studies are an important step in the process.

Either way, I’ll be writing more around this topic in the near future, so watch this space!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Friday, March 10, 2017

Better Alternatives to "Expert Roundup"-Style Content - Whiteboard Friday

Posted by randfish

You may be tempted to publish that newest round of answers you've gotten from industry experts, but hold off — there's a better way. In today's Whiteboard Friday, Rand explains why expert roundups just aren't the best use of your time and effort, and how to pivot your strategy to create similar content that'll make the juice worth the squeeze.

Alternatives to expert roundup style content

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to look at some better alternatives to the expert roundup-style content that's become extremely popular on the web. There are a few reasons why it's popular. So let's talk about why SEOs and content marketers do so many expert roundups, why this became a popular content format.

Why do SEOs and content marketers even use "expert roundups?"

Okay. It turns out if you've got a piece of content that's like "75 Experts Share Their Favorite Constitutional Law Cases," maybe you interviewed a bunch of constitutional law scholars and you put together this article, there's a bunch of nice things that you actually do get from this, which is why people use this format, right?

You kind of get automatic outreach, because if you talk to these people, you've had a connection with them. You've built a little bit of a relationship. There's now something of an incentive to share for these folks and the potential for a link. All of those are sort of elements that people are looking for, well, that marketers are looking for from their content.

The nice thing is you've got this long cadre of individuals who have contributed, and they create the content, which means you don't have to, saving you a bunch of time and energy. They become your amplifier so you can kind of sit back and relax when it comes time to broadcast it out there. You just tell them it's ready, and they go and push it. They lend your content credibility. So even if you don't have any credibility with your brand or with your website, they deliver it for you. You don't have to do that.

There are a few big problems with this kind of content.

Those are all really nice things. Don't get me wrong. I understand why. But there are some big, big problems with expert roundup-style content.

1. Like many easy-to-replicate tactics, expert roundups become WAY overdone.

First one, like many of the easy-to-replicate tactics, expert roundups got spammed to heck. They became way, way overdone. I get emails like this: "Dear Fishkin, I roundup. You write. Do this. Then share. Okay. Bye, Spammy McSpams-A-Lot."

Look, Mr. McSpams-A-Lot, I appreciate how often you think of me. I love that every day there are a couple of offers like this in my inbox. I try to contribute to less than one every two or three weeks and only the ones that look super credible and real interesting. But jeez, can you imagine if you are truly an expert, who can lend credibility and create lots of amplification, you're getting overwhelmed with these kinds of requests, and people are probably getting very tired of reading them, especially in certain market segments where they've become way too overdone.

2. It's hard for searchers to get valuable, useful info via this format — and search engines don't like it, either.

But even if it's the case that you can get all these experts to contribute and it's not overdone in your market space, there are two other big problems. One, the content format is awful, awful for trying to get valuable and useful information. It rarely actually satisfies either searchers or engines.

If you search for constitutional law cases and you see "75 Experts Share Their Favorite Constitutional Law Cases," you might click. But my god, have you gone through those types of content? Have you tried to read a lot of those roundups? They are usually awful, just terrible.

You might get a nugget here or there, but there's a bunch of contributions that are multiple paragraphs long and try to include links back to wherever the expert is trying to get their links going. There's a bunch of them that are short and meaningless. Many of them overlap.

It's annoying. It's bad. It's not well-curated. It's not well-put together. There are exceptions. Sometimes people put real effort into them and they get good, but most of the time these are real bad things, and you rarely see them in the search results.

BuzzSumo did a great analysis of content that gets shares and gets links and gets rankings. Guess what did not fall into it — expert roundups.

3. Roundups don't earn as many links, and the traffic spike from tweets is temporary.

Number three: the links that the creators want from these roundups, that they're hoping they're going to get, mostly don't materialize. What usually happens is you get a short traffic spike, some additional engagement, some additional activity on mostly Twitter, sometimes a little bit of Facebook or LinkedIn, but it's almost all social activity, and it's a very brief spike.

5 formats to try instead

So what are some better alternatives? What are some things we can do? Well, I've got five for you.

1. Surveys

First off, if you're going to be creating content that is around a roundup, why not do almost exactly the same process, but rather than asking a single question or a set of questions that people are replying to, ask them to fill out a short survey with a few data points, because then you can create awesome graphs and visuals, which have much stronger link earning potential. It's the same outreach effort, but for much more compelling content that often does a better job of ranking, is often more newsworthy and link worthy. I really, really like surveys, and I think that they can work tremendously well if you can put them together right.

2. Aggregations of public data

Second, let's say you go, "Oh, Rand, that would be great, but I want to survey people about this thing, and they won't give me the information that I'm looking for." Never fear. You can aggregate public data.

So a lot of these pieces of information that may be interesting to your audience, that you could use to create cool visuals, the graphs and charts and all that kind of thing and trend lines, are actually available on the web. All you need to do is cite those sources, pull in that data, build it yourself, and then you can outreach to the people who are behind these companies or these organizations or these individuals, and then say, "Hey, I made this based on public data. Can you correct any errors?" Now you've got the outreach, which can lead to the incentive to share and to build a link. Very cool.

3. Experiments and case studies

So this is taking a much smaller group, saying, "I'm only going to work with this one person or these couple of people, or I'm going to do it myself. Here's what Seattle's most influential law firm found when they challenged 10 state laws." Well, there you go. Now I've got an interesting, wholly formed case study. I only had to work with one expert, but chances are good that lots and lots of people will be interested in this. It's also excellent for newsworthiness. It often can get lots of press coverage in whatever industry you're in.

4. Seeking out controversial counter-opinions on a topic

Fourth, if you're going to do a roundup-style thing and you're going to collect multiple opinions, if you can find a few points or a single subject around which multiple experts have different opinions, that could be just two people, it could be four or five, it could be seven or eight, but you're basically trying to create this controversy.

You're saying like, "Here are these people on this side of this issue. Here are these people on this side of this issue, Wil Reynolds versus Rand Fishkin on link building." I think we did a presentation like that in Minneapolis last year or a couple years ago. It was super fun. Wil and I got up on stage, and we sort of debated with each other. There were no losers in that debate. It was great.

This leverages the emotional response you're seeking of conflict. It creates more engaging content by far, and there's more incentive for the parties who participate to link and share, because they're sort of showing off their opinion and trying to make counterpoints. You can get a lot of good things.

5. Not just text!

Number five. If you've decided, "You know what? None of these formats or any others work. I really, really want to do a roundup. I think it can work for me," okay. But do me a favor and try something that is not just text, not just text.

Muzli is a newsletter I subscribe to in the design world that does lots of roundup-style content, but the roundups are all visuals. They're visuals. They're like UI interactions and GIFs and animations and illustrations. I actually really love those. Those get great engagement, and they rank, by the way. They rank quite well. Many of the ones that they link to in the newsletter do well.

You can do this with visuals. You can do it with data. You could do it with revenue numbers. You could do it with tools. You could do it with products, whatever it is.

I would suggest thinking a little more broadly than, "Dear Fishkin, I roundup. You write." I think that there's a lot more opportunity outside of the pure expert roundup space, and I hope you'll share your creative ideas with us and the successes you've seen.

We look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Thursday, March 9, 2017

Your Daily SEO Fix: The Keyword Research Edition

Posted by FeliciaCrawford

Back in 2015, we had an epiphany. Every day, via every channel, you — our readers, subscribers, community members, and social followers — would ask us really good questions. (You're an incredibly intelligent, friendly, inquisitive bunch, you know that? It's humbling.) A lot of those questions were about how to accomplish your SEO goals, and it got us thinking.

Moz is an educational resource, it's true, but we also offer a suite of tools (both free and paid) that can help you achieve those goals. Why not provide a space for those two things to converge? And thus, the idea of the Daily SEO Fix was born: quick 1–3 minute videos shared throughout the week that feature Mozzers describing how to solve problems using the tools we know best.

It's two years later now, and both our tools and our industry have evolved. Time to revisit this idea, no?

Today's series of Daily SEO Fixes feature our keyword research tool, Keyword Explorer. Perhaps you've heard us mention it a couple times — we sure like it, and we think it could help you, too. And you don't have to be a subscriber to check this puppy out — anyone on the whole wide Internet can use it to research two queries a day for free. If you're logged into your Moz community account, you get five free queries.

Open Keyword Explorer in a new tab!

Queue it up in another browser tab to follow along, if you'd like!*

*Keep in mind that some features, such as lists, are only available when you're also a Moz Pro Medium subscriber or above. If you're bursting with curiosity, you can always check out the 30-day free trial, which features everything you'd see in a paid subscription... but for free. :)


Fix #1: Nitty-gritty keyword research

Let's get down to brass tacks: your keyword research. Janisha's here to walk you through...

  • Researching your keyword;
  • Determining whether it strikes the right balance of volume, difficulty, and opportunity;
  • How to quickly analyze the SERPs for your query and see what factors could be affecting your ranking opportunity;
  • Finding keyword suggestions ripe with promise; and
  • Organizing your newly discovered keywords into lists.

Fix #2: Finding question keywords to boost your content & win featured snippets

When you answer the questions searchers are actually asking, you've got way more opportunity to rank, earn qualified traffic to your site, and even win yourself a featured snippet or two. Brittani shows you how to broaden your page content by speaking to your audience's most burning questions.


Fix #3: Updating your keyword metrics on a whim

If you're hot on the trail of a good ranking, you don't have the time or patience to wait for your metrics to update on their own. Kristina shows you how to get that sweet, sweet, up-to-date data after you've organized a list of related keywords in Keyword Explorer.


Fix #4: Moving curated keyword lists to Moz Pro for long-term tracking

If you're interested in tracking the overall SEO progress of a site and digging into the nuts and bolts of your keyword data, you'll want to pay attention. Kristina's back to explain how to import your curated Keyword Explorer lists into a Moz Pro campaign to track long-term rankings for a specific site.


That's a wrap for Week 1!

There you have it — four ways to level up your keyword research and knock some to-dos off your list. We'll be back next Thursday with more fixes from a new group of Mozzers; keep an eye on our social channels for a sneak peek, and maybe try a free spin of Moz Pro if you'd like to follow along.

Curious about what else you can do with Keyword Explorer? Here are some fab resources:

And if you're fairly new to the game or looking for ways to grow your team members' SEO knowledge, be sure to check out our classes on introductory SEO, keyword research, site audits, link building, reporting, and more.

See you next week, friends!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Wednesday, March 8, 2017

SEO Rankings Drop: A Step-by-Step Guide to Recovery

Posted by KristinaKledzik

A few weeks ago, rankings for pages on a key section of my site dropped an average of a full position in one day. I’ve been an SEO for 7 years now, but I still ran around like a chicken with my head cut off, panicked that I wouldn’t be able to figure out my mistake. There are so many things that could’ve gone wrong: Did I or my team unintentionally mess with internal link equity? Did we lose links? Did one of Google’s now-constant algorithm updates screw me over?

Since the drop happened to a group of pages, I made the assumption it had to do with our site or page structure (it didn't). I wasted a good day focused on technical SEO. Once I realized my error, I decided to put together a guide to make sure that next time, I’ll do my research effectively. And you, my friends, will reap the rewards.

First, make sure there’s actually a rankings change

Okay, I have to start with this: before you go down this rabbit hole of rankings changes, make sure there was actually a rankings change. Your rankings tracker may not have localized properly, or have picked up on one of Google’s rankings experiments or personalization.

Find out:

  • Has organic traffic dropped to the affected page(s)?
    • We’re starting here because this is the most reliable data you have about your site. Google Search Console and rankings trackers are trying to look at what Google’s doing; your web analytics tool is just tracking user counts.
    • Compare organic traffic to the affected page(s) week-over-week both before and after the drop, making sure to compare similar days of the week.
    • Is the drop more significant than most week-over-week changes?
    • Is the drop over a holiday weekend? Is there any reason search volume could’ve dropped?
  • Does Google Search Console show a similar rankings drop?
    • Use the Search Analytics section to see clicks, impressions, and average position for a given keyword, page, or combo.
    • Does GSC show a similar rankings drop to what you saw in your rankings tracker? (Make sure to run the report with the selected keyword(s).)
  • Does your rankings tracker show a sustained rankings drop?
    • I recommend tracking rankings daily for your important keywords, so you’ll know if the rankings drop is sustained within a few days.
    • If you're looking for a tool recommendation, I'm loving Stat.

If you’ve just seen a drop in your rankings tool and your traffic and GSC clicks are still up, keep an eye on things and try not to panic. I’ve seen too many natural fluctuations to go to my boss as soon as I see an issue.
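The first check in the list (is the traffic drop bigger than normal week-over-week noise?) is easy to script against a daily sessions export. This is a sketch with made-up numbers; `wow_change` is a hypothetical helper, and in practice you'd feed it organic sessions for the affected pages only.

```python
from datetime import date, timedelta

def wow_change(daily_sessions, drop_date):
    """Week-over-week change in organic sessions around a suspected drop.

    Compares the 7 days starting at drop_date against the same weekdays
    one week earlier, so Mondays are compared with Mondays, and so on.
    daily_sessions maps date -> organic session count for affected pages.
    """
    after = sum(daily_sessions.get(drop_date + timedelta(days=d), 0)
                for d in range(7))
    before = sum(daily_sessions.get(drop_date - timedelta(days=7 - d), 0)
                 for d in range(7))
    return (after - before) / before if before else float("nan")

# made-up example: ~1,000 sessions/day before the drop, ~800 after
sessions = {date(2017, 3, 1) + timedelta(days=d): (1000 if d < 14 else 800)
            for d in range(28)}
print(round(wow_change(sessions, date(2017, 3, 15)), 2))  # -0.2
```

If the number you get is within the range of your site's ordinary weekly fluctuation, hold off on the panic and keep monitoring.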

But if you’re seeing that there’s a rankings change, start going through this guide.

Figure out what went wrong

1. Did Google update their algorithm?

Google rolls out new algorithm updates at least daily, most of them silently. The good news is, there are legions of SEOs dedicated to documenting those changes.

  • Are there any SEO articles or blogs talking about a change around the date you saw the change? Check out:
  • Do you have any SEO friends who have seen a change? Pro tip: Make friends with SEOs who run sites similar to yours, or in your industry. I can’t tell you how helpful it’s been to talk frankly about tests I’d like to run with SEOs who’ve run similar tests.

If this is your issue...

The bad news here is that if Google’s updated their algorithm, you're going to have to change your approach to SEO in one way or another.

Make sure you understand:

Your next move is to put together a strategy to either pull yourself out of this penalty, or at the very least to protect your site from the next one.

2. Did your site lose links?

Pull the lost links report from Ahrefs or Majestic. They’re the most reputable link counters out there, and their indexes are updated daily.

  • Has there been a noticeable site-wide link drop?
  • Has there been a noticeable link drop to the page or group of pages you’ve seen a rankings change for?
  • Has there been a noticeable link drop to pages on your site that link to the page or group of pages you’ve seen a rankings change for?
    • Run Screaming Frog on your site to find which pages link internally to the affected pages. Check internal link counts for pages one link away from affected pages.
  • Has there been a noticeable link drop to inbound links to the page or group of pages you’ve seen a rankings change for?
    • Use Ahrefs or Majestic to find the sites that link to your affected pages.
      • Have any of them suffered recent link drops?
      • Have they recently updated their site? Did that change their URLs, navigation structure, or on-page content?

If this is your issue...

The key here is to figure out who you lost links from and why, so you can try to regain or replace them.

  • Can you get the links back?
    • Do you have a relationship with the site owner who provided the links? Reaching out may help.
    • Were the links removed during a site update? Maybe it was accidental. Reach out and see if you can convince them to replace them.
    • Were the links removed and replaced with links to a different source? Investigate the new source — how can you make your links more appealing than theirs? Update your content and reach out to the linking site owner.
  • Can you convince your internal team to invest in new links to quickly replace the old ones?
    • Show your manager(s) how much a drop in link count affected your rankings and ask for the resources it’ll take to replace them.
    • This will be tricky if you were the one to build the now-lost links in the first place, so if you did, make sure you’ve put together a strategy to build longer-term ones next time.

3. Did you change the affected page(s)?

If you or your team changed the affected pages recently, Google may not think that they’re as relevant to the target keyword as they used to be.

  • Did you change the URL?
    • DO NOT CHANGE URLS. URLs act as unique identifiers for Google; a new URL means a new page, even if the content is the same.
  • Has the target keyword been removed from the page title, H1, or H2s?
  • Is the keyword density for the target keyword lower than it used to be?
  • Can Google read all of the content on the page?
    • Look at Google’s cache by searching for cache:http://ift.tt/2lWQyMG to see what Google sees.
  • Can Google access your site? Check Google Search Console for server and crawl reports.

If this is your issue…

Good news! You can probably revert your site and regain the traffic you’ve lost.

  • If you changed the URL, see if you can change it back. If not, make sure the old URL is 301 redirecting to the new URL.
  • If you changed the text on the page, try reverting it back to the old text. Wait until your rankings are back up, then try changing the text again, this time keeping keyword density in mind.
  • If Google can’t read all of the content on your page, THIS IS A BIG DEAL. Communicate that to your dev team. (I’ve found dev teams often undervalue the impact of SEO, but “Googlebot can’t read the page” is a pretty understandable, impactful problem.)

4. Did you change internal links to the affected page(s)?

If you or your team added or removed internal links, that could change the way link equity flows through your site, changing Google’s perceived value of the pages on your site.

  • Did you or your team recently update site navigation anywhere? Some common locations to check:
    • Top navigation
    • Side navigation
    • Footer navigation
    • Suggested products
    • Suggested blog posts
  • Did you or your team recently update key pages on your site that link to target pages? Some pages to check:
    • Homepage
    • Top category pages
    • Linkbait blog posts or articles
  • Did you or your team recently update anchor text on links to target pages? Does it still include the target keyword?

If this is your issue…

Figure out how many internal links have been removed from pointing to your affected pages. If you have access to the old version of your site, run Screaming Frog (or a similar crawler) on the new and old versions of your site so you can compare inbound link counts (referred to as inlinks in SF). If you don’t have access to the old version of your site, take a couple of hours to compare navigation changes and mark down wherever the new layout may have hurt the affected pages.
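One way to do that comparison, assuming you've exported the internal-URL report from both the old and new crawls to CSV. The `Address` and `Inlinks` column names match Screaming Frog's exports as I remember them; adjust them to whatever your version actually produces.

```python
import csv

def inlink_deltas(old_csv_path, new_csv_path,
                  url_col="Address", inlinks_col="Inlinks"):
    """Per-URL change in internal inbound link counts between two crawls."""
    def load(path):
        with open(path, newline="", encoding="utf-8") as f:
            return {row[url_col]: int(row[inlinks_col] or 0)
                    for row in csv.DictReader(f)}
    old, new = load(old_csv_path), load(new_csv_path)
    # URLs missing from a crawl count as having zero inlinks in it
    return {url: new.get(url, 0) - old.get(url, 0)
            for url in set(old) | set(new)}

def biggest_losers(deltas, n=20):
    """The n pages that lost the most internal links, worst first."""
    return sorted(deltas.items(), key=lambda item: item[1])[:n]
```

Sort ascending like this and, if internal linking really is the culprit, your affected pages should cluster near the top of the list.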

How you fix the problem depends on how much impact you have on the site structure. It’s best to fix the issue in the navigational structure of the site, but many of us SEOs are overruled by the UX team when it comes to primary navigation. If that’s the case for you, think about systematic ways to add links where you can control the content. Some common options:

  • In the product description
  • In blog posts
  • In the footer (as UX teams will generally admit, few people use the footer)

Keep in mind that removing links and adding them back later, or adding them in different places on the site, may not have the same effect as the original internal links. Keep an eye on your rankings, and add back more internal links than the affected pages lost, to make sure you regain your Google rankings.

5. Google’s user feedback says you should rank differently.

Google uses machine learning to help determine rankings, which means it's at least in part measuring the value of your pages by their click-through rate from the SERPs and by how long visitors stay on a page before returning to Google.

  • Did you recently add a popup that is increasing bounce rate?
  • Is the page taking longer to load?
    • Check server response time. People are likely to give up if nothing happens for a few seconds.
    • Check full page load. Have you added something that takes forever to load and is causing visitors to give up quickly?
  • Have you changed your page titles? Is that lowering CTR? (I optimized page titles in late November, and that one change moved the average rank of 500 pages up from 12 to 9. One would assume things can go in reverse.)

If this is your issue…

  • If the issue is a new popup, do your best to convince your marketing team to test a different type of popup. Some options:
    • Scroll popups
    • Timed popups
    • Exit popups
    • Stable banners at the top or bottom of the page (with a big CLICK ME button!)
  • If your page is taking longer to load, you’ll need the dev team. Quantify the value lost to fewer organic conversions since the rankings dropped, and you’ll have a pretty strong case for dev time.
  • If you’ve changed your page titles, change them back, quick! Mark this test as a dud, and make sure you learn from it before you run your next test.

6. Your competition made a change.

You may have changed rank not because you did anything, but because your competition got stronger or weaker. Use your ranking tool to identify competitors that gained or lost the most from your rankings change. Use a tool like Versionista (paid, but worth it) or Wayback Machine (free, but spotty data) to find changes in your competitors’ sites.
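If you're scripting this, the Wayback Machine exposes a free availability API that returns the archived snapshot closest to a given date, which you can then diff against the live page. The endpoint below is the public `archive.org/wayback/available` API; the helper names are my own.

```python
# Look up the archived snapshot of a competitor URL closest to a date
# via the Wayback Machine availability API, so you can diff it against
# the current version of the page.
import json
import urllib.parse
import urllib.request

def extract_snapshot_url(data):
    """Pull the snapshot URL out of an availability-API response dict."""
    closest = data.get("archived_snapshots", {}).get("closest")
    return closest["url"] if closest else None

def closest_snapshot(url, timestamp):
    """Return the archive URL closest to `timestamp` (YYYYMMDD), or None."""
    api = ("http://archive.org/wayback/available?"
           + urllib.parse.urlencode({"url": url, "timestamp": timestamp}))
    return extract_snapshot_url(json.load(urllib.request.urlopen(api)))
```

For example, `closest_snapshot("competitor.com/category", "20170101")` gets you their page as it looked around the start of the year. Coverage is spotty, as noted above, so expect `None` for some URLs.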

  • Which competitors gained or lost the most as your site’s rankings changed?
  • Has that competitor gained or lost inbound links? (Refer to #2 for detailed questions.)
  • Has that competitor changed the competing page? (Refer to #3 for detailed questions.)
  • Has that competitor changed their internal link structure? (Refer to #4 for detailed questions.)
  • Has that competitor started earning better click-through rates or dwell time from the SERPs? (Refer to #5 for detailed questions.)

If this is your issue…

You’re probably fuming, and your managers are probably fuming at you. But there’s an upside: you can learn what works from your competitors. They did the research and tested a change, and it paid off for them. Now you know the value! Imitate your competitor, but try to do it better than they did; otherwise you’ll always be playing catch-up.

Now you know what to do

You may still be panicking, but hopefully this post can guide you to some constructive solutions. I find that the best response to a drop in rankings is a good explanation and a plan.

And, to the Moz community of other brilliant SEOs: comment below if you see something I’ve missed!
