Tuesday, March 7, 2017

The Moz 2016 Annual Report

Posted by SarahBird

I have a longstanding tradition of boring Moz readers with our exhaustive annual reports (2012, 2013, 2014, 2015).

tradition fiddler.gif

If you’re avoiding sorting the recycling, going to the gym, or cleaning out your closet, I have got a *really* interesting post that needs your attention *right now*.

(Yeah. I know it’s March. But cut me some slack: I had pneumonia in Jan/Feb, so my life slid sideways for a while.)

Skip to your favorite parts:

Part 1: TL;DR

Part 2: Achievements unlocked

Part 3: Oh hai, elephant. Oh hai, room.

Part 4: More wood, fewer arrows

Part 5: Performance (metrics vomit)

Part 6: Inside Moz HQ

Part 7: Looking ahead


Part 1: TL;DR

We closed out 2016 with more customers and revenue than 2015. Our core SEO products are on a roll with frequent, impactful launches.

The year was not all butterflies and sunshine, though. Some of our initiatives failed to produce the results we needed. We made some tough calls (sunsetting some products and initiatives) and big changes (laying off a bunch of folks and reallocating resources). On a personal level, it was the most emotionally fraught time in my career.

Thank the gods, our hard work is paying off. Moz ended the year cashflow, EBITDA, and net income profitable (on a monthly basis), and with more can-do spirit than in years past. In fact, in the month of December we added a million dollars cash to the business.

We’re completely focused on our mission to simplify SEO for everyone through software, education, and community.


Part 2: Achievements unlocked

It blows my mind that we ended the year with over 36,000 customers from all over the world. We’ve got brands and agencies. We’ve got solopreneurs and Fortune 500s. We’ve got hundreds of thousands of people using the MozBar. A bunch of software companies integrate with our API. It’s humbling and awesome. We endeavor to be worthy of you!

Customers and Community.png

We were very busy last year. The pace and quality of development has never been better. The achievements captured below don’t even come close to listing everything. How many of these initiatives did you know about?


Part 3: Oh hai, elephant. Oh hai, room.

When a few really awful things happen, it can overshadow the great stuff you experience. That makes this a particularly hard annual report to write. 2016 was undoubtedly the most emotionally challenging year I’ve experienced at Moz.

It became clear that some of our strategic hypotheses were wrong. Pulling the plug on those projects and asking people I care deeply about to leave the company was heartbreaking. That’s what happened in August 2016.

Tolstoy Happy products and unhappy products.jpg

As Tolstoy wrote, “Happy products are all alike; every unhappy product is unhappy in its own way.” The hard stuff happened. Rehashing what went wrong deserves a couple chapters in a book, not a couple lines in a blog post. It shook us up hard.

And *yet*, I am determined not to let the hard stuff take away from the amazing, wonderful things we accomplished and experienced in 2016. There was a lot of good there, too.

Smarter people than me have said that progress doesn’t happen in a straight line; it zigs and zags. I’m proud of Mozzers; they rise to challenges. They lean into change and find the opportunity in it. They turn their compassion and determination up to 11. When the going gets tough, the tough get going.

beast mode q4-finish-strong.jpg

I’ve learned a lot about Moz and myself over the last year. I’m taking all those learnings with me into the next phase of Moz’s growth. Onwards.


Part 4: More wood, fewer arrows

At the start of 2016, our hypothesis was that our customers and community would purchase several inbound marketing tools from Moz, including SEO, local SEO, social analytics, and content marketing. The upside was market expansion. The downside was fewer resources to go around, and a much more complex brand and acquisition funnel.

By trimming our product lines, we could reallocate resources to initiatives showing more growth potential. We also simplified our mission, brand, and acquisition funnel.

It feels really good to be focusing on what we love: search. We want to be the best place to learn and do SEO.

Whenever someone wonders how to get found in search, we want them to go to Moz first. We aspire to be the best in the world at the core pillars of SEO: rankings, keywords, site audit and optimization, links, location data management.

SEO is dynamic and complex. By reducing our surface area, we can better achieve our goal of being the best. We’re putting more wood behind fewer arrows.

more wood fewer arrows.png


Part 5: Performance (metrics vomit)

Check out the infographic view of our data barf.

We ended the year at ~$42.6 million in gross revenue, amounting to ~12% annual growth. We had hoped for better at the start of the year. Moz Pro is still our economic engine, and Local drives new revenue and cashflow.

revenue for annual report 2016.png

Gross profit margin increased a hair to 74%, despite Moz Local being a larger share of our overall business. Product-only gross profit margin is a smidge higher at 76%. Partner relationships generally drag down the profit margin on that product line.

Our Cost of Revenue (COR) went up in raw numbers from the previous year, but it didn’t increase as much as revenue.

COR 2016.png

COR Pie Annual Report 2016.png

Total Operating Expenses came to about ~$41 million. Excluding the cost of the restructure we initiated in August, the shape and scale of our major expenses have remained remarkably stable.

2016 year in review major expenses.png

We landed at -$5.5 million in EBITDA, which was disappointingly below our plan. We were on target for our budgeted expenses. As we fell behind our revenue goals, it became clear we’d need to right-size our expenses to match the revenue reality. Hence, we made painful cuts.

EBITDA Annual Report 2016.png

Cash Burn Annual Report 2016.png

I’m happy/relieved/overjoyed to report that we were EBITDA positive by September, cashflow positive by October, and net income positive by November. Words can’t express how completely terrible it would have been to go through what we all went through, and *not* have achieved our business goals.

My mind was blown when we actually added a million in cash in December. I couldn’t have dared to dream that… Ha ha! They won’t all be like that! It was the confluence of a bunch of stuff, but man, it felt good.

one million dollars dr evil.jpg


Part 6: Inside Moz HQ

Thanks to you, dear reader, we have a thriving and opinionated community of marketers. It’s a great privilege to host so many great exchanges of ideas. Education and community are integral to our mission. After all, we were a blog before we were a tech company. Traffic continues to climb and social keeps us busy. We love to hear from you!

organic traffic 2016 annual report.png

social channels for annual report 2016.png

We added a bunch of folks to the Moz Local, Moz.com, and Customer Success teams in the last half of the year. But our headcount is still lower than last year because we asked a lot of talented people to leave when we sunsetted a bunch of projects last August. We’re leaner, and gaining momentum.

End of year headcount bar charg 2016 annual report.png

Moz is deeply committed to making tech a more inclusive industry. My vision is for Moz to be a place where people are constantly learning and doing their best work. We took a slight step back on our gender diversity gains in 2016. Ugh. We’re not doing much hiring in 2017, so it’s going to be challenging to make substantial progress. We made a slight improvement in the ratio of underrepresented minorities working at Moz, which is a positive boost.

Gender ratios annual report 2016.png

The tech industry has earned its reputation for being unwelcoming and myopic.

Mozzers work hard to make Moz a place where anyone can thrive. Moz isn’t perfect; we’re human and we screw up sometimes. But we pick ourselves up, dust off, and try again. We continue our partnership with Ada Academy, and we’ve deepened our relationship with Year Up. One of my particular passions is partnering with programs that expose girls and young women to STEM careers, such as Ignite Worldwide, Techbridge, and BigSisters.

I’m so proud of our charitable match program. We match Mozzer donations 150% up to $3k. Over the years, we’ve given over half a million dollars to charity. In 2016 alone, we gave $111,028. The ‘G’ in TAGFEE stands for ‘generous,’ and this is one of the ways we show it.

charitable donation match annual report 2016.png

One of our most beloved employee benefits is paid, PAID vacation. We give every employee up to $3,000 to spend on his or her vacation. This year, we spent over half a million dollars exploring the world and sucking the marrow out of life.

paid paid vacation annual report 2016.png


Part 7: Looking ahead

Dear reader, I don’t have to tell you that search has been critical for a long time.

This juggernaut of a channel is becoming *even more* important with the proliferation of search interfaces and devices. Mobile liberated search from the desktop by bringing it into the physical world. Now, watches, home devices, and automobiles are making search ubiquitous. In a world of ambient search, SEO becomes even more important.

SEO is more complicated and dynamic than in years past because the number of human interfaces, response types, and ranking signals keeps increasing. We here at Moz are wild about the complexity. We sink our teeth into it. It drives our mission: Simplify SEO for everyone through software, education, and community.

We’re very excited about the feature and experience improvements coming ahead. Thank you, dear reader, for sharing your feedback, inspiring us, and cheering us on. We look forward to exploring the future of search together.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, March 6, 2017

Infinite "People Also Ask" Boxes: A Glimpse at Google's Deep Learning Edges

Posted by BritneyMuller

A glimpse into Google's machine learning?

You’ve likely seen the People Also Ask (Related Questions) boxes in SERPs. These accordion-like question and answer boxes are Google’s way of saying, “Hey, you beautiful searcher, you! These questions also relate to your search... maybe you're interested in exploring these too? Kick off your shoes, stay a while!”

However, few people have come across infinite PAAs. These occur when you expand a PAA question box and 2 or 3 more related questions appear at the bottom. These infinite PAA lists can continue into the hundreds, and I've been lucky enough to come across 75+ of these gems!

So, grab a coffee and buckle up! I’d like to take you on a journey of my infinite PAA research, discoveries, machine learning hypothesis, and how you can find PAA opportunities.

Why PAAs should matter to you

PAAs have seen a 1,723% growth in SERPs since 7/31/15 via Mozcast! ← Tweet this stat!

Compare that to featured snippets, which have seen only 328% growth over the same period.

Research has also shown that a single PAA can show up in 21 unique SERPs! How 'bout dem apples?! PAA opportunities can take over some serious SERP real estate.

My infinite PAA obsession

These mini-FAQs within search results have fascinated me since Google started testing them in 2015. Then in November 2016, I discovered Google's PAA dynamic testing:

The above infinite PAA expanded into the hundreds! This became an obsession of mine as I began to notice them across multiple devices (for a variety of different searches) and coined them “PAA Black Holes.”

I began saving data from these infinite PAAs to see if I could find any patterns, explore how Google might be pulling this data, and dive deeper into how the questions/topics changed as a result of my expanding question boxes, etc.

After seeing a couple dozen infinite PAAs, I began to wonder whether this was actually a feature being tested for search, but several industry leaders assured me it was more likely a bug.

They were wrong.

Infinite People Also Ask boxes are live

Now integrated into U.S. SERPs (sorry, international friends, but get ready for this to potentially migrate your way), you can play with these on desktop & mobile:

Why does Google want people to spend more time on individual SERPs (instead of looking at several)? Could they charge more for advertisements on SERPs with these sticky, expansive PAAs? Might they eventually start putting ads in PAAs? These are the questions that follow me around like a shadow.

To get a better idea of the rise of PAAs, here's a timeline of my exploratory PAA research:

PAA timeline

April 17, 2015 - Google starts testing PAAs

July 29, 2015 - Dr. Pete gets Google to confirm preferred “Related Questions” name

Aug 15, 2015 - Google tests PAA Carousels on desktop

Dec 30, 2015 - Related Questions (PAAs) grow +500% in 5 months

Mar 11, 2016 - See another big uptick in Related Questions (PAAs) in Mozcast

Nov 11, 2016 - Robin Rozhon notices PAA Black Hole

Nov 23, 2016 - Brit notices PAA Black Hole

Nov 29, 2016 - STAT Analytics publishes a research study on PAAs

Dec 12, 2016 - Realized new PAA results would change based on expanded PAA

Dec 14, 2016 - Further proof PAAs dynamically load based on what you click

Dec 19, 2016 - Still seeing PAA Black Holes

Dec 22, 2016 - Discovered a single PAA result (not a 3-pack)

Jan 11, 2017 - Made a machine learning (TensorFlow) discovery and hypothesis!

Jan 22, 2017 - Discovered a PAA Black Hole on a phone

Jan 25, 2017 - Discovered a PAA Black Hole that maxed out at 9

Feb 10, 2017 - PAA Black Holes go live!

Feb 14, 2017 - Britney Muller is still oblivious to PAA Black Holes going live and continues to hypothesize how they are being populated via entity graph-based ML.


3 big infinite PAA discoveries:

#1 - Google caters to browsing patterns in real time

It took me a while to grasp that I can manipulate the newly populated question boxes based on what I choose to expand.

Below, I encourage more Vans-related PAAs by clicking “Can I put my vans in the washing machine?” Then, I encourage more “mildew”-related ones simply by clicking a “How do you get the mildew smell out of clothes” PAA above:

vans-paa.gif

Another example of this is when I clicked “organic SEO” at the very top of a 100+ PAA Black Hole (the gif would make you dizzy, so I took a screenshot instead). It altered my results from “how to clean leather” to “what is seo” and “what do you mean by organic search”:

leather to seo.png


#2 - There are dynamic dead ends

When I reach an exhaustive point in my PAA expansions (typically ~300+ deep), Google will resurface the first two PAAs, as if to say, “We aren’t sure what else to provide; are you interested in these again?”

Here is an example of that happening: I go from “mitosis”-related PAAs (~300 PAAs deep) to a repeat of the first two PAAs: “What is Alexa ranking based on?” and “What is the use of backlinks?”:

getting sick of me.gif

This reminds me of a story told by Google machine learning engineers: whenever an early ML model couldn’t identify a photograph, it would give a default ‘I don’t know’ answer: “Men talking on cell phone.” It could have been a picture of an elephant dancing; if the ML model wasn’t sure what it was, it would say “Men talking on cell phone.”

My gut tells me that Google reverts to the strongest edges for your original query (the first two PAAs) when it runs out of PAAs above a certain relational threshold.

If you keep pushing those limits, it will then repeat the third and fourth PAAs, and so on.


#3 - Expand & retract one question to explore the most closely related questions

This not only provides you with the most relevant PAAs to the query you're expanding and retracting, but if it’s in your wheelhouse, you can quickly discover other very relevant PAA opportunities.

Here I keep expanding and retracting "What is the definition of SEO?":

exhaust-paa.gif

Notice how “SEO” or “search engine optimization” is in every subsequent PAA!? This is no coincidence and has a lot to do with the entity graph.

First, let's better understand machine learning and why an entity-based, semi-supervised model is so relevant to search. I’ll then draw out what I think is happening with the above results (like a 5-year-old), and go over ways you can capture these opportunities! Woohoo!


Training data's role in machine learning

Mixups are commonplace in machine learning, mostly due to a lack of quality training data.

Machine Learning Process SEO

Well-labeled training data is typically the biggest requirement for training an accurate ML model.

Fairly recently, the voice search team at Google came across an overwhelming amount of EU voice data that was interpreted as “kdkdkdkd.” It was an obvious gap in their training data (who says “kdkdkdkd”?!), and the engineers had no idea what could be prompting that noise. Confused, they finally figured out that trains and subways were making the sound!

It’s a silly example, but by adding “kdkdkdkd” = trains/subways to its training data, Google is now able to account for these pesky inclusions.


Relational data to the rescue

Because we don’t always have enough training data to properly train an ML model, we look to relational data for help.

Example: If I showed you the following picture, you could gather a few things from it, right? Maybe that it appears to be a woman walking down a street, and that perhaps it’s fall, judging by her hat, scarf, and the leaves on the ground. But it’s hard to determine a whole lot else, right?

Screen Shot 2017-01-10 at 1.21.03 AM.png

What about now? Here are two other photos from the above photo’s timeline:

Screen Shot 2017-01-10 at 1.23.36 AM.pngScreen Shot 2017-01-10 at 1.23.47 AM.png

Aha! She appears to be a U.S. traveler visiting London (with her Canon T3i camera). Now we have some regional, demographic, and product understanding. It’s not a whole lot of extra information, but it provides much more context for the original cryptic photo, right?

Perhaps, if Google had integrated geo-relational data with their voice machine learning, they could have identified more quickly that these noises were occurring at the same geolocations. This is just an example; Google engineers are WAY smarter than me and have surely thought of much better solutions.


Google leverages entity graphs similarly for search

Google leverages relational data (in a very similar way to the example above) to form a better understanding of digital objects and provide the most relevant search results.

A kind of scary example of this is Google’s Expander: A large-scale ML platform to “exploit relationships between data objects.”

Screen Shot 2017-01-09 at 11.39.57 PM.png

Machine learning is typically “supervised” (training data is provided, which is more common) or “unsupervised” (no training data). Expander, however, is “semi-supervised,” meaning that it’s bridging the gap between provided and not-provided data. ← SEO pun intended!

Expander leverages a large, graph-based system to infer relationships between datasets. Ever wonder why you start getting ads about a product you started emailing your friend about?

Expander is bridging the gap between platforms to better understand online data and is only going to get better.
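Expander's internals aren't public, but graph-based semi-supervised learning is easy to sketch with label propagation: a few labeled nodes push their labels to unlabeled neighbors along weighted edges. Here's a toy Python sketch (the queries and weights are invented; this is not Google's implementation) showing how two seed labels can classify a whole graph:

```python
from collections import defaultdict

def propagate_labels(edges, seed_labels, iterations=10):
    """Toy label propagation: each round, every unlabeled node adopts
    the weighted-majority label of its already-labeled neighbors."""
    graph = defaultdict(list)
    for a, b, w in edges:          # undirected, weighted edges
        graph[a].append((b, w))
        graph[b].append((a, w))
    labels = dict(seed_labels)     # node -> label (the "supervised" part)
    for _ in range(iterations):
        updates = {}
        for node in graph:
            if node in seed_labels:          # never overwrite seeds
                continue
            votes = defaultdict(float)
            for neighbor, w in graph[node]:
                if neighbor in labels:
                    votes[labels[neighbor]] += w
            if votes:
                updates[node] = max(votes, key=votes.get)
        labels.update(updates)
    return labels

# Two labeled queries "seed" the graph; the rest are inferred
# purely from edge strength.
edges = [
    ("what is seo", "seo tutorial", 0.9),
    ("seo tutorial", "learn seo", 0.8),
    ("what is smo", "smo brace", 0.3),
    ("smo brace", "ankle support", 0.9),
]
seeds = {"what is seo": "search", "ankle support": "medical"}
print(propagate_labels(edges, seeds))
```

Seeds are never overwritten, which mirrors the "supervised" half of semi-supervised learning; every other node's label is inferred from its strongest connections.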


Relational entity graphs for search

Here is a slide from a Google I/O 2016 talk that showcases a relational word graph for search results:

Screen Shot 2017-01-09 at 11.24.47 PM.png

Slide from Breakthroughs in Machine Learning Google I/O 2016 video.

Solid edges represent stronger relationships between nodes than the dotted lines. The above example shows there is a strong relationship between “What are the traditions of halloween” and “halloween tradition,” which makes sense. People searching for either of those would each be satisfied by quality content about “halloween traditions.”

Edge strength can also be determined by distributional similarity, lexical similarity, similarity based on word embeddings, etc.
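Distributional and embedding similarity require trained models, but lexical similarity can be sketched in a few lines. Here's an illustrative Jaccard overlap with naive plural-stripping (a crude stand-in for a real stemmer), using the halloween pair from the slide plus an unrelated query:

```python
def _stem(word):
    # crude plural-stripping; real systems use proper stemmers or embeddings
    return word[:-1] if word.endswith("s") else word

def lexical_similarity(a, b):
    """Jaccard overlap of (crudely stemmed) word sets:
    one cheap proxy for edge strength between two queries."""
    ta = {_stem(w) for w in a.lower().split()}
    tb = {_stem(w) for w in b.lower().split()}
    return len(ta & tb) / len(ta | tb)

strong = lexical_similarity("what are the traditions of halloween",
                            "halloween tradition")
weak = lexical_similarity("what are the traditions of halloween",
                          "best ankle brace")
print(round(strong, 2), round(weak, 2))  # the halloween pair scores higher
```

A real system would blend several such measures into one edge weight; the point is only that closely related questions share much more surface vocabulary than unrelated ones.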


Infinite PAA machine learning hypothesis:

Google is providing additional PAAs based on the strongest relational edges to the expanded query.

You can see this occur again and again in infinite PAA datasets. When a word with two lexically similar senses overlaps the suggested PAAs, the topic changes because of it:

Screen Shot 2017-03-02 at 7.21.58 PM.png

The above topic change occurred through a series of small relational suggestions. A PAA above this screenshot was “What is SMO stands for?” (not a typo, just a neural network doing its best, people!), which led to "What is the meaning of SMO?", then to “What is a smo brace?” (for ankles).

This immediately made me think of the relational word graph and what I envision Google is doing:

I hope my parents hang this on their fridge.

My hypothesis is that the machine learning model computes that because I’m interested in “SMO,” I might also be interested in ankle brace “SMO.”
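Under that hypothesis, choosing the next PAAs is just a matter of ranking the expanded question's neighbors by edge weight. A minimal sketch (the SMO queries and weights here are invented for illustration):

```python
from collections import defaultdict

def suggest_next(graph, expanded, k=2):
    """Return the k neighbors with the strongest edges to the
    question the user just expanded."""
    neighbors = sorted(graph[expanded], key=lambda nw: nw[1], reverse=True)
    return [n for n, _ in neighbors[:k]]

# Build a small undirected, weighted question graph.
graph = defaultdict(list)
for a, b, w in [
    ("what is the meaning of smo", "what is a smo brace", 0.4),
    ("what is the meaning of smo", "what does smo stand for", 0.9),
    ("what is a smo brace", "how do smo braces work", 0.8),
]:
    graph[a].append((b, w))
    graph[b].append((a, w))

# Expanding the ankle-brace question pulls in brace-related neighbors first.
print(suggest_next(graph, "what is a smo brace"))
```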

There are ways for SEOs and digital marketers to leverage topical relevance and capture PAA opportunities.


4 ways to optimize for machine learning & expand your topical reach for PAAs:

Topical connections can always be made within your content, and by adding additional high-quality, topically related content, you can strengthen your content’s edges (and expand your SERP real estate). Here are some quick and easy ways to discover related topics:

#1: Quickly discover Related Topics via MozBar

MozBar is a free SEO browser add-on that allows you to do quick SEO analysis of web pages and SERPs. The On-Page Content Suggestions feature is a quick and simple way to find other topics related to your page.

Step 1: Activate MozBar on the page whose keyword reach you’re trying to expand, and click Page Optimization:

Beginner's Guide To SEO Mozbar.png

Step 2: Enter in the word you are trying to expand your keyword reach with:

seo-browser-tool.png

Step 3: Click On-Page Content Suggestions for your full list of related keyword topics.

seo-toolbar.png

Step 4: Evaluate which related keywords can be incorporated naturally into your current on-page content. In this case, it would be beneficial to incorporate “seo tutorial,” “seo tools,” and “seo strategy” into the Beginner’s Guide to SEO.

mozbar related keywords seo.png

Step 5: Some keywords, like “seo services” and “search engine ranking,” may seem like an awkward add to the page but are still relevant to the products/services you offer. Try adding these topics to a better-fit page, creating a new page, or putting together a strong FAQ with other topically related questions.


#2: Wikipedia page + SEOBook Keyword Density Checker*

Let’s say you're trying to expand your topical keywords in an industry you’re not very familiar with, like "roof repair." You can use this free hack to pull in frequent and related topics.

Step 1: Find and copy the roof Wikipedia page URL.

Step 2: Paste the URL into SEOBook’s Keyword Density Checker:

Screen Shot 2017-02-13 at 6.15.51 AM.png

Step 3: Hit submit and view the most commonly used words on the Wikipedia page:

Screen Shot 2017-02-13 at 6.12.59 AM.png

Step 4: You can dive even deeper (and often more topically related) by clicking on the "Links" tab to evaluate the anchor text of on-page Wikipedia links. If a subtopic is important enough, it will likely have another page to link to:

keyword density links.png

Step 5: Use any appropriate keyword discoveries to create stronger topic-based content ideas.

*This tactic was mentioned in Experts On The Wire episode on keyword research tools.
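If you'd rather script the counting step, the heart of a keyword density checker is just a stopword-filtered frequency count. A minimal offline sketch (the sample text stands in for the fetched Wikipedia page):

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "and", "or", "to", "is", "in", "are"}

def keyword_density(text, top=5):
    """Count non-stopword terms and their share of the text,
    mimicking what a keyword density checker reports."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    total = sum(counts.values())
    return [(w, c, round(c / total * 100, 1)) for w, c in counts.most_common(top)]

sample = (
    "A roof protects a building. Roof repair and roofing materials vary: "
    "shingles, metal roofing, and flat roof membranes all need maintenance."
)
print(keyword_density(sample, top=3))
```

The same function pointed at real page text surfaces the frequent, topically related terms the SEOBook tool shows.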


#3: Answer the Public

Answer the Public is a great free resource to discover questions around a particular topic. Just remember to change your country if you’re not seeking results from the UK (the default).

Step 1: Enter in your keyword/topic and select your country:

Screen Shot 2017-02-13 at 6.39.14 AM.png

Step 2: Explore the visualization of questions people are asking about your keyword:

Screen Shot 2017-02-13 at 6.40.12 AM.png

Doesn’t this person look like they’re admiring themselves in a mirror (or taking a selfie)? A magnifying glass doesn’t work from that distance, people!

Note: Not all questions will be relevant to your research, like “why roof of mouth hurts” and “why roof of mouth itches.”

Screen Shot 2017-02-13 at 6.40.43 AM.png

Step 3: Scroll back up to the top to export the data to CSV by clicking the big yellow button (top right corner):

Screen Shot 2017-02-13 at 12.32.56 PM.png

The magnifying glass looks much larger here... perhaps it would work at that distance?

Step 4: Clean up the data and upload the queries to your favorite keyword research tool (Moz Keyword Explorer, SEMRush, Google Keyword Planner, etc.) to discover search volume and SERP feature data, like featured snippets, reviews, related questions (PAA boxes), etc.

Note: Google’s Keyword Planner does not support SERP features data and provides vague, bucket-based search volume.


#4: Keyword research “only questions”

Moz Keyword Explorer provides an “only questions” filter to uncover potential PAA opportunities.

Step 1: Enter your keyword into KWE:

best keyword research tool.png

Step 2: Click Keyword Suggestions:

Screen Shot 2017-02-13 at 12.53.26 PM.png

Step 3: Filter by “are questions”:

Screen Shot 2017-02-13 at 12.51.59 PM.png

Pro tip: Find grouped question keyword opportunities by grouping keywords by “low lexical similarity” and ordering them from highest search volume to lowest:

Screen Shot 2017-02-13 at 12.52.28 PM.png

Step 4: Select keywords and add to a new or previous list:

Screen Shot 2017-02-13 at 1.14.49 PM.png

Step 5: Once in a list, KWE will tell you how many “related questions” (People Also Ask box) opportunities are in your list. In this case, we have 18:

Screen Shot 2017-02-13 at 1.27.03 PM.png

Step 6: Export your keyword list to a Campaign in Moz Pro:

Screen Shot 2017-02-13 at 1.33.15 PM.png

Step 7: Filter SERP Features by “Related Questions” to view PAA box opportunities:

Screen Shot 2017-02-13 at 1.35.02 PM.png

Step 8: Explore current PAA box opportunities and evaluate where you currently rank for “Related Questions” keywords. If you’re on page 1, you have a better chance of stealing a PAA box.

Also evaluate what other SERP features are present on these SERPs. Here, Dr. Pete tells me that I might be able to get a reviews rich snippet for “gutter installation.” Thanks, Dr. Pete!

Screen Shot 2017-02-13 at 1.36.02 PM.png
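If you have a raw keyword list outside of KWE, a rough stand-in for the "are questions" filter is to keep phrases that open with a question word (a naive heuristic, not Moz's actual logic):

```python
QUESTION_WORDS = ("what", "why", "how", "when", "where", "who", "which",
                  "can", "do", "does", "is", "are", "should")

def only_questions(keywords):
    """Keep phrases that start with a question word --
    a crude approximation of an 'only questions' filter."""
    return [k for k in keywords if k.lower().split()[0] in QUESTION_WORDS]

keywords = [
    "gutter installation",
    "how much does gutter installation cost",
    "best gutters",
    "do gutters need to be cleaned",
]
print(only_questions(keywords))
```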

Hopefully, this research can help energize you to do topical research of your own to grab some relevant PAAs! PAAs aren't going away anytime soon and I'm so excited for us to learn more about them.

Please share your PAA experiences, questions, or comments below.



Friday, March 3, 2017

Aren't 301s, 302s, and Canonicals All Basically the Same? - Whiteboard Friday

Posted by Dr-Pete

They say history repeats itself. In the case of the great 301 vs 302 vs rel=canonical debate, it repeats itself about every three months. In today's Whiteboard Friday, Dr. Pete explains how bots and humans experience pages differently depending on which solution you use, why it matters, and how each choice may be treated by Google.

Aren't 301s, 302s, and canonicals all basically the same?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans, it's Dr. Pete, your friendly neighborhood marketing scientist here at Moz, and I want to talk today about an issue that comes up probably about every three months since the beginning of SEO history. It's a question that looks something like this: Aren't 301s, 302s, and canonicals all basically the same?

So if you're busy and you need the short answer, it's, "No, they're not." But you may want the more nuanced approach. This popped up again about a week [month] ago, because John Mueller on the Webmaster Team at Google had posted about redirection for secure sites, and in it someone had said, "Oh, wait, 302s don't pass PageRank."

John said, "No. That's a myth. It's incorrect that 302s don't pass PR," which is a very short answer to a very long, technical question. So SEOs, of course, jumped on that, and it turned into, "301s and 302s are the same, cats are dogs, cakes are pie, up is down." We all did our freakout that happens four times a year.

So I want to get into why this is a difficult question, why these things are important, why they are different, and why they're different not just from a technical SEO perspective, but from the intent and why that matters.

I've talked to John a little bit. I'm not going to put words in his mouth, but I think 95% of this will be approved, and if you want to ask him, that's okay afterwards too.

Why is this such a difficult question?

So let's talk a little bit about classic 301, 302. So a 301 redirect situation is what we call a permanent redirect. What we're trying to accomplish is something like this. We have an old URL, URL A, and let's say for example a couple years ago Moz moved our entire site from seomoz.org to moz.com. That was a permanent change, and so we wanted to tell Google two things and all bots and browsers:

  1. First of all, send the people to the new URL, and, second,
  2. pass all the signals. All the equity, PR, ranking signals, whatever you want to call them, the authority, should go to the new page as well.

So people and bots should both end up on this new page.

A classic 302 situation is something like a one-day sale. So what we're saying is for some reason we have this main page with the product. We can't put the sale information on that page. We need a new URL. Maybe it's our CMS, maybe it's a political thing, doesn't matter. So we want to do a 302, a temporary redirect that says, "Hey, you know what? All the signals, all the ranking signals, the PR, for Google's sake keep the old page. That's the main one. But send people to this other page just for a couple of days, and then we're going to take that away."

So these do two different things. One of these tells the bots, "Hey, this is the new home," and the other one tells it, "Hey, stick around here. This is going to come back, but we want people to see the new thing."
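In code, the only difference is the status code the server sends, but the semantics diverge. A minimal sketch of the two situations above (the paths are invented for illustration):

```python
# Map old paths to (status, new_location): 301 for the permanent
# site move, 302 for the temporary one-day-sale page.
ROUTES = {
    "/old-home": (301, "/new-home"),       # permanent: pass all signals
    "/product": (302, "/product-sale"),    # temporary: keep signals on /product
}

def respond(path):
    """Return the (status, location) a server would send for `path`."""
    if path in ROUTES:
        return ROUTES[path]
    return (200, path)  # no redirect: serve the page itself

print(respond("/old-home"))  # (301, '/new-home')
print(respond("/product"))   # (302, '/product-sale')
```

Both entries send people to the new URL; only the status code tells bots whether the ranking signals should move too.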

So I think sometimes Google interprets our meaning and can change things around, and we get frustrated because we go, "Why are they doing that? Why don't they just listen to our signals?"

Why are these differentiations important?

The problem is this. In the real world, we end up with things like this, we have page W that 301s to page T that 302s to page F and page F rel=canonicals back to page W, and Google reads this and says, "W, T, F." What do we do?

We sent bad signals. We've done something that just doesn't make sense, and Google is forced to interpret us, and that's a very difficult thing. We do a lot of strange things. We'll set up 302s because that's what's in our CMS, that's what's easy in an Apache rewrite file. We forget to change it to a 301. Our devs don't know the difference, and so we end up with a lot of ambiguous situations, a lot of mixed signals, and Google is trying to help us. Sometimes they don't help us very well, but they just run into these problems a lot.

In this case, the bots have no idea where to go. The people are going to end up on that last page, but the bots are going to have to choose, and they're probably going to choose badly because our intent isn't clear.
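You can make the ambiguity concrete with a toy chain resolver: users follow every hop, a naive signal trail moves only on 301s, and the canonical adds a third hint. This is an illustrative model, not Google's actual logic:

```python
def resolve(url, redirects, canonicals, max_hops=10):
    """Follow 301/302 hops the way a browser would (users always land
    at the end), and separately track a naive guess at where ranking
    signals go: 301s move signals, 302s leave them behind."""
    landing, signals = url, url
    for _ in range(max_hops):
        if landing not in redirects:
            break
        status, target = redirects[landing]
        if status == 301:
            signals = target      # permanent: signals move with the user
        landing = target          # users follow both 301s and 302s
    # a rel=canonical on the landing page is one more (soft) hint
    canonical_hint = canonicals.get(landing)
    return landing, signals, canonical_hint

# The W -> T -> F -> (canonical) W tangle from the whiteboard:
redirects = {"W": (301, "T"), "T": (302, "F")}
canonicals = {"F": "W"}
print(resolve("W", redirects, canonicals))  # -> ('F', 'T', 'W')
```

The three answers disagree (people land on F, the naive signal trail stops at T, the canonical says W), which is exactly why Google is forced to guess.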

How are 301s, 302s, and rel=canonical different?

So there are a couple of situations I want to cover, because I think they're fairly common and I want to show that this is complex. Google can interpret our signals, but there's some rhyme and reason to how they do it.

1. Long-term 302s may be treated as 301s.

So the first one is that long-term 302s are probably going to be treated as 301s. A long-term 302 doesn't make any sense. If you set up a 302 and leave it for six months, Google is going to look at that and say, "You know what? I think you meant this to be permanent and you made a mistake. We're going to pass the ranking signals, and we're going to send people to Page B." I think that generally makes sense.

Some types of 302s just don't make sense at all. So if you're migrating from non-secure to secure, from HTTP to HTTPS, and you set up a 302, that's a signal that doesn't quite make sense. Why would you migrate temporarily? This is probably a permanent choice. In that case, and this is actually what John was addressing in this post originally, Google is probably going to look at that and say, "You know what? I think you meant a 301 here," and they're going to pass signals to the secure version. We know they prefer HTTPS anyway, so they're going to make that choice for you.
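Since the HTTPS move is permanent by nature, the redirect should say so explicitly. A minimal sketch, assuming a standard Apache mod_rewrite setup:

```apache
# Site-wide HTTP -> HTTPS migration: this is a permanent choice,
# so flag it as a 301 rather than relying on the 302 default.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

That removes any ambiguity Google would otherwise have to interpret away.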

If you're confused about where the signals are going, then look at the page that's ranking, because in most cases the page that Google chooses to rank is the one that's getting the ranking signals. It's the one that's getting the PR and the authority.

So if you have a case like this, a 302, and you leave it up permanently and you start to see that Page B is the one that's being indexed and ranking, then Page B is probably the one that's getting the ranking signals. So Google has interpreted this as a 301. If you leave a 302 up for six months and you see that Google is still taking people to Page A, then Page A is probably where the ranking signals are going.

So that can give you an indicator of what their decision is. It's a little hard to reverse that. But if you've left a 302 in place for six months, then I think you have to ask yourself, "What was my intent? What am I trying to accomplish here?"

Part of the problem with this is that when we ask this question, "Aren't 302s, 301s, canonicals all basically the same?" what we're really implying is, "Aren't they the same for SEO?" I think this is a legitimate but very dangerous question, because, yes, we need to know how the signals are passed and, yes, Google may pass ranking signals through any of these things. But for people they're very different, and this is important.

2. Rel=canonical is for bots, not people.


So I want to talk about rel=canonical briefly because rel=canonical is a bit different. We have Page A and Page B again, and we're going to canonical from Page A to Page B. What we're basically saying with this is, "Look, I want you, the bots, to consider Page B to be the main page. You know, for some reason I have to have these near duplicates. I have to have these other copies. But this is the main one. This is what I want to rank. But I want people to stay on Page A."
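In markup, rel=canonical is just a single link element in the head of the duplicate page. A sketch with placeholder URLs:

```html
<!-- In the <head> of Page A (the near-duplicate visitors actually see): -->
<link rel="canonical" href="https://example.com/page-b" />
<!-- Bots consolidate signals on Page B; people stay right here on Page A. -->
```

Unlike a redirect, nothing happens to the visitor at all; only the bots are told where the main copy lives.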

So this is entirely different from a 301 where I want people and bots to go to Page B. That's different from a 302, where I'm going to try to keep the bots where they are, but send people over here.

So look at it from a user perspective. In Q&amp;A, I have people ask all the time, "Well, I've heard that rel=canonical passes ranking signals. Which should I choose? Should I choose that or a 301? What's better for SEO?"

That's true. We do think rel=canonical generally passes ranking signals. But "what's better for SEO?" is the wrong question, because these are completely different user experiences: either you want people to stay on Page A, or you want people to go to Page B.

Why this matters, both for bots and for people

So I just want you to keep in mind, when you look at these three things, it's true that 302s can pass PageRank. But if you're in a situation where you want a permanent redirect, where you want people to go to Page B, you want bots to go to Page B, and you want Page B to rank, use the right signal. Don't confuse Google, because they may make bad choices. Some of your 302s may be treated as 301s, but that doesn't make them the same. And rel=canonical is a very, very different situation that essentially leaves people behind and sends the bots ahead.

So keep in mind what your use case actually is, keep in mind what your goals are, and don't get over-focused on the ranking signals themselves or the SEO uses, because all of these three things have different purposes.

So I hope that makes sense. If you have any questions or comments or you've seen anything weird actually happen on Google, please let us know and I'll be happy to address that. And until then, we'll see you next week.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!