"SEO Is Always Changing"... Or Is It?: Debunking the Myth and Getting Back to Basics
Posted by bridget.randolph
Recently I made the shift to freelancing full-time, and it’s led me to participate in a few online communities for entrepreneurs, freelancers, and small business owners. I've noticed a trend in the way many of them talk about SEO; specifically, the blocks they face in attempting to “do SEO” for their businesses. Again and again, the idea that "SEO is too hard to stay on top of... it’s always changing" came up as a major reason that people feel a) overwhelmed by SEO; b) intimidated by SEO; and c) uninformed about SEO.
And it’s not just non-SEOs who use this phrase. The concept of “the ever-changing landscape of SEO” is common within SEO circles as well. In fact, I’ve almost certainly used this phrase myself.
But is it actually true?
To answer that question, we have to separate the theory of search engine optimization from the various tactics which we as SEO professionals spend so much time debating and testing. The more I work with smaller businesses and individuals, the clearer it becomes to me that although the technology is always evolving and developing, and tactics (particularly those that attempt to trick Google rather than follow their guidelines) do need to adapt fairly rapidly, there are certain fundamentals of SEO that change very little over time, and which a non-specialist can easily understand.
The unchanging fundamentals of SEO
Google’s algorithm is based on an academia-inspired model of categorization and citations, which utilizes keywords as a way to decipher the topic of a page, and links from other sites (known as “backlinks”) to determine the relative authority of that site. Their methods and technology keep getting more sophisticated over time, but the principles have remained the same.
So what are these basic principles?
It comes down to answering the following questions:
- Can the search engine find your content? (Crawlability)
- How should the search engine organize and prioritize this content? (Site structure)
- What is your content about? (Keywords)
- How does the search engine know that your content provides trustworthy information about this topic? (Backlinks)
If your website is set up to help Google and other search engines answer these 4 questions, you will have covered the basic fundamentals of search engine optimization.
There is a lot more that you can do to optimize in all of these areas and beyond, but for businesses that are just starting out and/or on a tight budget, these are the baseline concepts you’ll need to know.
Crawlability
You could have the best content in the world, but it won’t drive any search traffic if the search engines can’t find it. This means that the crawlability of your site is one of the most important factors in ensuring a solid SEO foundation.
In order to find your content and rank it in the search results, a search engine needs to be able to:
- Access the content (at least the pages that you want to rank)
- Read the content
This is primarily a technical task, although it is related to having a good site structure (the next core area). You may need to adapt the code, and/or use an SEO plugin if your site runs on WordPress.
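If you’re comfortable with a little code, here’s a minimal sketch of a basic crawlability check, written in Python using only the standard library. The domain and page path are placeholders; substitute your own URLs:

```python
# Minimal crawlability check: (1) does robots.txt allow crawlers to access
# the page, and (2) does the page itself return a successful HTTP status?
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "https://www.example.com"          # placeholder domain
PAGE = SITE + "/services/seo-consulting"  # placeholder page you want to rank

# 1. Does robots.txt allow crawlers to access the page?
robots = urllib.robotparser.RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()
print("robots.txt allows Googlebot:", robots.can_fetch("Googlebot", PAGE))

# 2. Does the page resolve with a success status?
request = urllib.request.Request(PAGE, headers={"User-Agent": "crawl-check"})
try:
    with urllib.request.urlopen(request) as response:
        print("HTTP status:", response.status)  # 200 means the content can be read
except urllib.error.HTTPError as err:
    print("HTTP status:", err.code)  # e.g. 404 means crawlers can't read it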
For more in-depth guides to technical SEO and crawlability, check out the following posts:
- Find Your Site's Biggest Technical Flaws in 60 Minutes - Moz blog
- SEO Tools to Analyze Your Site Like Google Does - Hubspot blog
- What Web Dev Taught Me About SEO - Distilled blog
Site structure
In addition to making sure that your content is accessible and crawlable, it's also important to help search engines understand the hierarchy and relative importance of that content. It can be tempting to think that every page is equally important to rank, but failing to structure your site in a hierarchical way often dilutes the impact of your “money” pages. Instead, you should think about what the most important pages are, and structure the rest of your site around these.
When Google and other search engine crawlers visit a site, they attempt to navigate to the homepage and then follow every link. Googlebot assumes that the pages it sees the most are the most important pages. So a page that can be reached with a single click from the homepage, or that is linked to on every page (for example, in a top or side navigation bar, or in the site footer), will be seen more often, and Googlebot will therefore consider it more important. For less important pages, you’ll still need to link to them from somewhere for search engines to be able to see them, but you don’t need to emphasize them quite as frequently or keep them as close to the homepage.
The main question to ask is: Can search engines tell what your most important pages are, just by looking at the structure of your website? Google’s goal is to save users steps, so the easier you make it for them to find and prioritize your content, the more they’ll like it.
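To make the “click depth” idea concrete, here’s a toy Python sketch that walks a made-up internal link graph the way a crawler would, starting from the homepage, and reports how many clicks each page is from it. In practice you’d build the graph by crawling your own site (or exporting it from a crawling tool):

```python
# Toy illustration of "click depth": how many clicks a crawler needs to
# reach each page from the homepage. The link graph is invented for this example.
from collections import deque

links = {
    "/": ["/services", "/blog", "/contact"],  # homepage links
    "/services": ["/services/seo", "/services/ppc"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/contact": [],
    "/services/seo": [], "/services/ppc": [],
    "/blog/post-1": [], "/blog/post-2": [],
}

def click_depth(graph, start="/"):
    """Breadth-first search from the homepage, recording each page's depth."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:  # first time we reach this page
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

for page, depth in sorted(click_depth(links).items(), key=lambda x: x[1]):
    print(f"{depth} click(s) from homepage: {page}")
```

Pages sitting three or four clicks deep in a report like this are the ones a crawler (and a user) will struggle to treat as important.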
For more in-depth guides to good site structure, check out the following posts:
- Information Architecture for SEO - Moz (Whiteboard Friday)
- How to Create a Site Structure That Will Enhance SEO - Kissmetrics blog
- How to Create a Site Structure Google Will Love - Wordtracker
- The SEO Benefits of Developing a Solid Site Structure - Search Engine Land
Keywords
Once the content you create is accessible to crawlers, the next step is to make sure that you’re giving the search engines an accurate picture of what that content is about, to help them understand which search queries your pages would be relevant to. This is where keywords come into the mix.
We use keywords to tell the search engine what each page is about, so that they can rank our content for queries which are most relevant to our website. You might hear advice to use your keywords over and over again on a page in order to rank well. The problem with this approach is that it doesn’t always create a great experience for users, and over time Google has stopped ranking pages which it perceives as providing a poor user experience.
Instead, what Google is looking for in terms of keyword usage is that you:
- Answer the questions that real people actually have about your topic
- Use the terminology that real people (specifically, your target audience) actually use to refer to your topic
- Use the term in the way that Google thinks real people use it (this is often referred to as “user intent” or “searcher intent”).
You should only ever target one primary keyword (or phrase) per page. You can include “secondary” keywords, which are directly related to the primary keyword (think category vs. subcategory). I sometimes see people attempting to target too many topics with a single page, in an effort to widen the net. But it is better to separate these out so that there's a different page for each angle on the topic.
The easiest way to think about this is in physical terms. Search engines’ methods are roughly based on the concept of library card catalogs, and so we can imagine that Google is categorizing pages in a similar way to a library using the Dewey decimal system to categorize books. You might have a book categorized as Romance, subcategory Gothic Romance; but you wouldn’t be able to categorize it as Romance and also Horror, even though it might be related to both topics. You can’t have the same physical book on 2 different shelves in 2 different sections of the library. Keyword targeting works the same way: 1 primary topic per page.
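If you want to sanity-check this on a real page, here’s a small Python sketch (standard library only) that pulls the <title> and <h1> out of a page’s HTML and checks whether your primary keyword appears in them. The HTML and keyword below are invented for illustration:

```python
# Check whether a page's <title> and <h1> contain the primary keyword.
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collects the text inside <title> and <h1> tags."""
    def __init__(self):
        super().__init__()
        self.current = None
        self.found = {"title": "", "h1": ""}

    def handle_starttag(self, tag, attrs):
        if tag in self.found:
            self.current = tag

    def handle_endtag(self, tag):
        if tag == self.current:
            self.current = None

    def handle_data(self, data):
        if self.current:
            self.found[self.current] += data

html = """
<html><head><title>Gothic Romance Novels: A Reader's Guide</title></head>
<body><h1>The Best Gothic Romance Novels</h1><p>...</p></body></html>
"""

primary_keyword = "gothic romance"  # one primary topic per page
parser = HeadingExtractor()
parser.feed(html)
for tag, text in parser.found.items():
    print(f"<{tag}> contains '{primary_keyword}':", primary_keyword in text.lower())
```

If the keyword is missing from both, the page probably isn’t telling search engines clearly enough what it’s about.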
For more in-depth guides to keyword research and keyword targeting, check out the following posts:
- More than Keywords: 7 Concepts of Advanced On-Page SEO - Moz blog
- Keyword Research in 2016: Going Beyond Guesswork - Moz blog
- Guide to Keyword Research - Backlinko
- Complete Guide to Keyword Research for SEO - SearchEngineWatch
Backlinks
Another longstanding ranking factor is the number of links from other sites to your content, known as backlinks.
It’s not enough for you to say that you’re the expert in something, if no one else sees it that way. If you were looking for a new doctor, you wouldn’t just go with the guy who says “I’m the world’s best doctor.” But if a trusted friend told you that they loved their doctor and that they thought you’d like her too, you’d almost certainly make an appointment.
When other websites link to your site, it helps to answer the question: “Do other people see you as a trustworthy resource?” Google wants to provide correct and complete information in response to people’s queries. The more others trust your content, the more that indicates the value of that information and your authority as an expert.
When Google looks at a site’s backlinks, they are effectively doing the same thing that humans do when they read reviews and testimonials to decide which product to buy, which movie to see, or which restaurant to go to for dinner. If you haven’t worked with a product or business, other people’s reviews point you to what’s good and what’s not. In Google’s case, a link from another site serves as a vote of confidence for your content.
That being said, not all backlinks are treated equally when it comes to boosting your site’s rankings. They are weighted differently according to how Google perceives the quality and authority of the site that’s doing the linking. This can feel a little confusing, but when you think about it in the context of a recommendation, it becomes a lot easier to understand whether the backlinks your site is collecting are useful or not. After all, think about the last time you saw a movie. How did you choose what to see? Maybe you checked well-known critics’ reviews, checked Rotten Tomatoes, asked friends’ opinions, looked at Netflix’s suggestions list, or saw acquaintances posting about the film on social media.
When it comes to making a decision, who do you trust? As humans, we tend to use an (often unconscious) hierarchy of trust:
- Personalized recommendation: Close friends who know me well are most likely to recommend something I’ll like;
- Expert recommendation: Professional reviewers who are authorities on the art of film are likely to have a useful opinion, although it may not always totally match my personal taste;
- Popular recommendation: If a high percentage of random people liked the movie, this might mean it has a wide appeal and will likely be a good experience for me as well;
- Negative association: If someone is raving about a movie on social media and I know that they’re a terrible human with terrible taste... well, in the absence of other positive signals, that fact might actually influence me not to see the movie.
To bring this back to SEO, you can think about backlinks as the SEO version of reviews. And the same hierarchy comes into play.
- Personalized/contextual recommendation: For local businesses or niche markets, very specific websites, like a local city’s tourism site, a local business directory, or a very in-depth niche fan site, might be the equivalent of the “best friend recommendation”. They may not be an expert in what everyone likes, but they definitely know what works for you as an individual, and in some cases that’s more valuable.
- Expert recommendation: Well-known sites with a lot of inherent trust, like the BBC or Harvard University, are like the established movie critics. Broadly speaking they are the most trustworthy, but possibly lacking the context for a specific person’s needs. In the absence of a highly targeted type of content or service, these will be your strongest links.
- Popular recommendation: All things being equal, a lot of backlinks from a lot of different sites is seen as a signal that the content is relevant and useful.
- Negative association: Links that are placed via spam tactics, that you buy in bulk, or that sit on sites that look like garbage, are the website equivalent of that terrible person whose recommendation actually turns you off the movie.
If a site collects too many links from poor-quality sites, it could look like those links were bought rather than "earned" recommendations (similar to businesses paying people to write positive reviews). Google views the buying of links as a dishonest practice and a way of gaming their system, so if they believe that you are doing this intentionally, it may trigger a penalty. Even if they don’t cause a penalty, you won’t gain any real value from poor-quality links, so they’re certainly not something to aim for. Because of this, some people become very risk-averse about backlinks, even the ones that came to them naturally. But as long as you are getting links from other trustworthy sources, and these high-quality links make up a substantially higher percentage of your total, having a handful of lower-quality sites linking to you shouldn’t prevent you from benefiting from the high-quality ones.
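To make the hierarchy above concrete, here’s a deliberately simplified Python sketch of how differently weighted links might add up. The weights and link data are invented for illustration only; Google’s actual weighting is far more sophisticated and not publicly known. The point is simply that spammy links add nothing, while a few trusted ones carry most of the value:

```python
# Toy scoring model for the "hierarchy of trust" described above.
# Weights are invented; this is an illustration, not Google's algorithm.
WEIGHTS = {
    "expert": 10.0,      # e.g. a major university or news site
    "contextual": 6.0,   # e.g. a niche site highly relevant to yours
    "popular": 2.0,      # an ordinary, legitimate site
    "spammy": 0.0,       # bought or junk links add no value here
}

backlinks = [
    ("well-known news site", "expert"),
    ("local tourism directory", "contextual"),
    ("ordinary blog", "popular"),
    ("link farm #1", "spammy"),
    ("link farm #2", "spammy"),
]

score = sum(WEIGHTS[kind] for _, kind in backlinks)
print("Toy authority score:", score)  # the two spam links contribute nothing
```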
For more in-depth guides to backlinks, check out the following posts:
Theory of Links
- All Links are Not Created Equal: 10 Illustrations on Search Engines' Valuation of Links - Moz blog
- What Links Comply with Google's Guidelines - Moz (Whiteboard Friday)
Getting More Links
- What Is Linkbuilding? - Moz (Beginner's Guide to SEO)
- High-Value Tactics, Future-Proof Link Building - Moz (Whiteboard Friday)
- How to Create Content That Keeps Earning Links (Even After You Stop Promoting It) - Moz blog
- Targeted Link Building in 2016 - Moz (Whiteboard Friday)
- 7 Easy Local Link Building Tactics - Whitespark blog
- Guide to Linkbuilding - Backlinko
Mitigating Risk of Links
- Step-by-step Guide to a Manual Backlinks Audit - Search Engine Land
- Link Audit Guide for Effective Link Removals & Risk Mitigation - Moz blog
- How to Conduct a Backlink Audit in 45 Minutes - Neil Patel
Does anything about SEO actually change?
If SEO is really this simple, why do people talk about how it changes all the time? This is where we have to separate the theory of SEO from the tactics we use as SEO professionals to grow traffic and optimize for better rankings.
The fundamentals that we’ve covered here — crawlability, keywords, backlinks, and site structure — are the theory of SEO. But when it comes to actually making it work, you need to use tactics to optimize these areas. And this is where we see a lot of changes happening on a regular basis, because Google and the other search engines are constantly tweaking the way the algorithm understands and utilizes information from those four main areas in determining how a site’s content should rank on a results page.
The important thing to know is that, although the tactics which people use will change all the time, the goal for the search engine is always the same: to provide searchers with the information they need, as quickly and easily as possible. That means that whatever tactics and strategies you choose to pursue, the important thing is that they enable you to optimize for your main keywords, structure your site clearly, keep your site accessible, and get more backlinks from more sites, while still keeping the quality of the site and the backlinks high.
The quality test (EAT)
Because Google’s goal is to provide high-quality results, the changes that they make to the algorithm are designed to improve their ability to identify the highest quality content possible. Therefore, when tactics stop working (or worse, backfire and incur penalties), it is usually related to the fact that these tactics didn’t create high-quality outputs.
Like the fundamentals of SEO theory which we’ve already covered, the criteria that Google uses to determine whether a website or page is good quality haven’t changed all that much since the beginning. They’ve just gotten better at enforcing them. This means that you can use these criteria as a “sniff test” when considering whether a tactic is likely to be a sustainable approach long-term.
Google themselves refer to these criteria in their Search Quality Rating Guidelines with the acronym EAT, which stands for:
- Expertise
- Authoritativeness
- Trustworthiness
In order to be viewed as high-quality content (on your own site) or a high-quality link (from another site to your site), the content needs to tick at least one of these boxes.
Expertise
Does this content answer a question people have? Is it a *good* answer? Do you have a more in-depth degree of knowledge about this topic than most people?
This is why you will see people talk about Google penalizing “thin” content — that just refers to content which isn’t really worth having on its own page, because it doesn’t provide any real value to the reader.
Authority
Are you someone who is respected and cited by others who know something about this topic?
This is where the value of backlinks can come in. One way to demonstrate that you are an authority on a topic is if Google sees a lot of other reputable sources referring to your content as a source or resource.
Trust
Are you a reputable person or business? Can you be trusted to take good care of your users and their information?
Because trustworthiness is a factor in determining a site’s quality, Google has compiled a list of indicators which might mean a site is untrustworthy or spammy. These include things like a high proportion of ads to regular content, behavior that forces or manipulates users into taking actions they didn’t want to take, hiding some content and only showing it to search engines to manipulate rankings, not using a secure platform to take payment information, etc.
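One of those indicators, serving your site (and especially any payment pages) over a secure connection, is easy to check yourself. Here’s a minimal Python sketch that requests the plain-HTTP version of a placeholder domain and reports whether it ends up on HTTPS:

```python
# Check whether the insecure (http://) version of a site redirects to HTTPS.
import urllib.request

url = "http://www.example.com"  # placeholder: the insecure version of your domain
with urllib.request.urlopen(url) as response:
    final_url = response.geturl()  # urlopen follows any redirects automatically
    print("Final URL:", final_url)
    print("Redirects to HTTPS:", final_url.startswith("https://"))
```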
It’s always the same end goal
Yes, SEO can be technical, and yes, it can change rapidly. But at the end of the day, what doesn’t change is the end goal. Google and the other search engines make money through advertising, and in order to get more users to see (and click on) their ads, they have to provide a great user experience. Therefore, their goal is always going to be to give the searchers the best information they can, as easily as they can, so that people will keep using their service.
As long as you understand this, the theory of SEO is pretty straightforward. It’s just about making it easy for Google to answer these questions:
- What is your site about?
- What information does it provide?
- What service or function does it provide?
- How do we know that you’ll provide the best answer or product or service for our users’ needs?
- Does your content demonstrate Expertise, Authoritativeness, and/or Trustworthiness (EAT)?
This is why the fundamentals have changed so little, despite the fact that the industry, technology and tactics have transformed rapidly over time.
A brief caveat
My goal with this post is not to provide step-by-step instruction in how to “do SEO,” but rather to demystify the basic theory for those who find the topic too overwhelming to know where to start, or who believe that it’s too complicated to understand without years of study. With this goal in mind, I am intentionally taking a simplified and high-level perspective. This is not to dismiss the importance of an SEO expert in driving strategy and continuing to develop and maximize value from the search channel. My hope is that those business owners and entrepreneurs who currently feel overwhelmed by this topic can gain a better grasp on the way SEO works, and a greater confidence and ease in approaching their search strategy going forward.
I have provided a few in-depth resources for each of the key areas — but you will likely want to hire a specialist or consultant to assist with analysis and implementation (certainly if you want to develop your search strategy beyond simply the “table stakes” as Rand calls it, you will need a more nuanced understanding of the topic than I can provide in a single blog post).
At the end of the day, the ideas behind SEO are actually pretty simple — it’s the execution that can be more complex or simply time-consuming. That’s why it’s important to understand that theory — so that you can be more informed if and when you do decide to partner with someone who is offering that expertise. As long as you understand the basic concepts and end goal, you’ll be able to go into that process with confidence. Good luck!