Ranking factors are awesome, and sometimes a little bit dangerous.
That is, chasing minor ranking factors can lead to a dangerous waste of resources, while at the same time neglecting the holistic SEO that actually leads to higher rankings and better traffic.
Google uses 1000s of signals to rank web pages. Nobody knows what they all are (and anyone who claims otherwise is fibbing). In fact, since the rise of machine learning, not even Googlers can tell you all the elements that influence rankings and how they interact with one another in search results.
What matters is success.
SEO Success Factors are those elements you can take action on to improve your rankings, traffic, and visibility in Google search. Many Success Factors—but not all—are based on ranking factors, and a number of them deliver bigger results than others.
As an example, while meta descriptions aren’t a Google ranking factor, crafting well-composed descriptions that improve your click-through rate (CTR) can have a positive, outsized impact on your SEO efforts.
Using a combined analysis of dozens of sources, including:
Ranking factors studies
SEO Experimentation data
Expert opinion surveys
And statements from Google
… we’ve aggregated the most popular SEO Success Factors in one place. Our goal is to show you not only what works, but how to use these factors to improve your rankings and traffic.
Use this information to prioritize the most critical factors first while working your way toward less important elements.
10 Critical SEO Success Factors 2021
Sites at the top of Google search results typically score well in most, if not all, of these critical SEO success factors.
Content that Targets User Search Queries
Crawlable + Accessible to Search Engines
Quality & Quantity of Links
Satisfies User Intent
Uniqueness of Content
Expertise, Authoritativeness, and Trustworthiness (EAT)
Content Freshness
Click-through Rate (CTR)
Page Speed
Built for Multiple Devices
1. Content that Targets User Search Queries
In essence, this is the very heart of SEO. The magic that makes it work.
Instead of traditional marketing where you push your message upon a user (think television commercials), SEO allows you to deliver exactly the content the user is searching for, at exactly the moment they search for it.
To make this happen, you need data on what users are searching for. This is why almost all SEO starts with Keyword Research. Keyword Research takes many forms, but it typically consists of three types of data:
Query: What are people searching for?
Volume: How many are searching for it?
Difficulty: How competitive is it to rank for this term in Google?
Once you know what terms are worth pursuing, you can create content that targets those topics. We’ll cover this in later success factors.
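The three data points above can be combined into a simple prioritization model. The sketch below is purely illustrative: the keywords, volumes, difficulty scores, and the scoring formula are all hypothetical, not an industry-standard metric.

```python
# Minimal sketch of keyword research data. Each record holds the three
# data points discussed above: query, volume, and difficulty.
# All numbers are hypothetical.
keywords = [
    {"query": "running shoes", "volume": 90500, "difficulty": 85},
    {"query": "trail running shoes for women", "volume": 2400, "difficulty": 35},
    {"query": "best waterproof running shoes", "volume": 1900, "difficulty": 42},
]

def opportunity(kw):
    """Toy score: favor high volume and low difficulty."""
    return kw["volume"] * (100 - kw["difficulty"]) / 100

# Sort so the most promising terms come first.
ranked = sorted(keywords, key=opportunity, reverse=True)
for kw in ranked:
    print(kw["query"], round(opportunity(kw)))
```

Even a toy model like this forces the useful question: is a huge, hard keyword worth more to you than several small, easy ones?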
How to Leverage
There are 100s of guides on getting started with keyword research. A few of the best include:
When Stanford graduate students Larry Page and Sergey Brin devised the original PageRank algorithm in 1996, they had a novel idea: instead of ranking web pages based on the judgment of human editors, why not rank pages based on the number of links pointing at them from other web pages?
Links are votes.
Today’s Google goes far beyond simply counting the raw number of links a site receives. The elements that play into this popularity contest are believed to include trust, relevancy, position, and many more.
And while internal links (links from your own website) may not be as powerful as links from trusted external sites, internal linking and site architecture play an outsized role in your SEO efforts.
Although many speculate that the power of links has declined, or that Google may someday drop them altogether, Google statements and many experiments continue to prove their value as one of the strongest SEO success factors that we know.
A link is more than a navigational element. Each individual link on the web conveys multiple signals that Google can interpret for ranking purposes. These include:
The Authority + Trust of the Link
The relevance of the Link
How to Leverage
If you want to rank, you need good links—from both inside and outside of your site.
4. Satisfies User Intent
At this point, you’ve:
Targeted your content to relevant user search queries
Made your site crawlable and accessible to search engines
Obtained relevant links pointing to your content
But now comes perhaps the biggest question of all: does this content satisfy the user’s intent?
It’s not enough to target your content with keywords and phrases (more on this later). The important question for Google is: “does the content give the user the most satisfying answer they are looking for?”
Google doesn’t want to simply deliver answers to users; they want to deliver the best answers and experiences, ones that satisfy user intent without requiring additional searches.
SEOs refer to this in different ways: dwell time, the long click, return-to-SERP, and more. In essence, they all mean this: Does the user find the most satisfying answer to what they are looking for without searching further?
If the user has to click the back button, modify their search, or spend more time with results from other websites, this may be a sign that your content doesn’t deliver the best experience.
How to Leverage
Delivering content that satisfies user intent is one of the most challenging aspects of SEO, in part because it’s difficult to measure. That said, there are a number of practices that can improve your chances significantly.
Deliver content with the format and features Google expects. E.g., if the top-ranking sites for your keyword in Google all contain video results, it’s a good indication that users (and Google) are looking for videos to satisfy intent.
Answer the query as completely as possible, giving the user zero excuses to hit the back button. A good way to do this is to incorporate the answers to additional questions, i.e. “people also ask” into your content.
5. Uniqueness of Content
If your content is exactly the same as multiple other copies on the web, why should Google rank it above all the others?
Put another way—if your content isn’t uniquely valuable, it doesn’t mean Google will necessarily punish you for it, but it does make it much, much harder to rise to the top.
When your content isn’t unique, two things happen:
Google has to filter out all the duplicate content to deliver the best result – and there’s a good chance your content will be in the filtered group.
Duplicate content can’t target those unique topics and answers that the competition isn’t targeting
Duplicate content issues generally take two forms. The first is content that actually copies content from another site or page. The second is caused by duplicates of your own content, when two or more URLs serve the same (or very similar) content.
Content that directly copies content from another site is obviously problematic. But also consider the problems created with your own original content when you have two URLs like these (a hypothetical example):
example.com/product?id=42
example.com/product/?id=42&sessionid=abc123
In theory, both of these URLs may serve the exact same page. Not only does Google need to crawl each one (which could waste precious crawl budget), but the two pages may split link equity and other ranking signals. This makes it very difficult for Google to decide which page, if any at all, to show in search results.
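One common defensive tactic is to normalize URLs before comparing them, so that parameter-laden duplicates collapse into a single canonical address. Here is a minimal sketch using Python’s standard library; the URLs and the list of parameters worth keeping are hypothetical.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def normalize(url, allowed_params=("id",)):
    """Sketch: reduce a URL to a canonical form by dropping
    session/tracking parameters (anything not in allowed_params)
    and trailing slashes."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k in allowed_params]
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme, parts.netloc, path, urlencode(query), ""))

# Two hypothetical duplicate URLs from the example above:
a = "https://example.com/product?id=42&sessionid=abc123"
b = "https://example.com/product/?id=42&utm_source=news"

# Both collapse to the same canonical URL.
print(normalize(a))
print(normalize(a) == normalize(b))
```

In practice the same idea is expressed declaratively with canonical tags or redirects, but normalizing URLs like this is a quick way to audit how many duplicates a site is generating.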
How to Leverage
Make sure your content, including all text, offers unique value from all other sites and pages across the internet.
Control duplicate content on your own site. Tools and techniques include canonical tags, parameter handling, robots.txt, redirects, and more. These guides should help:
6. Expertise, Authoritativeness, and Trustworthiness (EAT)
You can target keywords, build links, and maybe even satisfy user intent – but do you have a site both Google and users should trust?
In the early days of SEO, it was easier to game the search engines, and low-quality pages flooded the results. To combat this, Google introduced Panda into its algorithm, which uses machine learning to separate high-quality pages from low-quality ones.
According to Google’s Search Quality Evaluator Guidelines, the principal qualities that define a high-quality page are Expertise, Authoritativeness, and Trustworthiness (EAT). It’s believed that Google uses the results from its human quality raters as training data in its machine learning algorithms.
Other qualities that define a high-quality page—according to evaluator guidelines—include:
A satisfying amount of high-quality content
A clear indication of who is responsible for the content, i.e., author information, “About Us” and contact information
A positive reputation
How to Leverage
A good place to start is Google’s own published Panda questions, which many assume form the basis of their machine learning model.
Since you likely don’t have an army of Search Quality Raters at your disposal, Distilled created a helpful Panda Survey that you can use with your own test group.
7. Content Freshness
Does fresh equal success?
Since the early days, Google has filed patent after patent on evaluating content freshness for its search results. This typically involves two parts.
First, Google devised a system known as “Query Deserves Freshness” to determine the types of search terms that most benefit from fresh results. Examples include:
News and current events – “Seattle protest” “Grammy Awards”
Recurring events – “Full moon” “World Cup Schedule”
Frequently Updated – “iPhone Specs” “Mac Reviews”
Topics with a recent spike in search volume or social media coverage
Other systems determine how fresh and relevant your content is for the search query. Signals which may indicate fresh content include:
Age of the content
Updates to the content, including which sections were updated (important or unimportant) and how much content was updated
How frequently your content is updated
How often people link to your content (lower link frequency may indicate content that has grown stale)
Changes in engagement metrics (worsening engagement may mean content is out of date)
Fresh content isn’t always better, but the idea is to deliver the most relevant content.
How to Leverage
Updating your content for the sake of freshness isn’t necessary, but keeping your content relevant and up-to-date is. Updating old content is also a way to earn new links and fresh engagement. A couple of resources to help:
8. Click-through Rate (CTR)
For ages, SEOs and Googlers have debated whether Click-through Rate is a ranking factor.
On one hand, SEO experiments show that when more people click your result in Google search, your rankings typically rise. Google, on the other hand, usually argues that CTR would be a “noisy” signal to use for ranking (without actually denying that they use it).
The truth? It doesn’t matter.
While there may be convincing evidence that earning a higher click-through rate may improve your rankings, it simply doesn’t matter whether Google uses it as a direct, or even indirect, ranking signal.
The truth is that when you improve your CTR, you get more traffic. Period.
There are potential downstream benefits as well. A higher CTR means more people are looking at your content, which means more potential shares, more potential links, and more potential opportunities for engagement. All of these may have a positive influence on your Google rankings, either directly or indirectly.
How to Leverage
Fortunately, influencing CTR is one of the areas you have the most control over as an SEO. That is, at least you have the opportunity to experiment!
The three main tools you have to help raise CTR are:
Title tags
Meta descriptions
Structured data (rich snippets)
The good news is that all of these are under your direct control, at least partially. Google may choose to display your information however they want in search results, but you can greatly influence this with your copy and structured data markup.
9. Page Speed
For many, improving page speed can be a technically challenging exercise. Plus, there are so many factors (time to first byte, waterfalls, download time, etc.) that it can be difficult to know what to optimize for.
Another easy option for improving speed is utilizing a Content Delivery Network (CDN) which can speed up delivery of images and files to your users. Cloudflare offers a highly recommended starter plan which is both free and easy.
10. Built for Multiple Devices
How big is the device you are reading this on?
Whether it’s a desktop computer, tablet, or phone, hopefully everything is formatted okay and you can read it with ease.
Today, most modern content is built to be multi-device and mobile-friendly, but this factor can still play a huge role in influencing your SEO performance.
Yes, mobile-friendliness is an actual Google ranking factor. But more than that, your mobile (and desktop) design can influence engagement, sharing, satisfaction, and quality metrics. A poor mobile experience can have a negative cascading effect on your rankings and visibility down the line.
Likewise, a positive experience can help.
Google recently rolled out Mobile First Indexing. The upshot is that Google will now mostly rank your pages based on your mobile site. So if your mobile site is missing content from your desktop site, you may be in trouble.
How to Leverage
Put simply: at a minimum, you should have a mobile-friendly site. Most modern platforms today do this by default with ease, but some older and many custom Content Management Systems may struggle.
Additionally, with Google’s Mobile First Index, you should make sure your mobile site doesn’t strip out important content.
Not only are the following factors important, but combined, they are hugely important. It’s possible not every factor will apply to your site, but the vast majority will have a big influence on your rankings and visibility.
1. Layout and Design
While there’s no single “right” way to design and organize a page, a handful of guidelines go a long way in helping to get your message across, and to rank.
Prominent main content – typically above the fold
Helpful supplemental content
Clean UX and design
Legible Text (at least 16px)
2. Use of Relevant Keywords, Phrases, & Topics
If you want to target users who search using specific words and phrases, it’s 100% helpful to include those phrases and topics in key sections of your site.
In the early days, “keyword stuffing” might have been your entire SEO strategy. Today, the importance of individual keywords has fallen dramatically, while Google has gotten amazingly good at figuring out the meaning of your content, including synonyms, variations, topics, entities, and intent.
Keyword stuffing is out, but it’s still helpful to include phrases, synonyms, and topics that your audience is looking for in key places. These areas include:
It’s totally possible to rank in Google without using images or videos.
That said, sites that use different forms of visual media, on average, outrank sites that don’t.
Images and media can make content more engaging and shareable, and the data they add through optimization can be used by search engines to help with rankings for relevant terms.
How to Leverage
Aside from adding original and unique images and video to your text, it’s important to ensure the media both adds to the user experience and follows basic optimization guidelines. Here are a couple of guides covering both video and image SEO:
In 2009, Google turned on personalized search for everyone.
If you visit a site, and Google thinks you had a good experience, they are more likely to surface that site again in your search results.
Other factors that influence personalization include:
Mobile App use
… and many more (rumored to be over 2,000 signals)
The basics of personalization are two-fold:
The more a user interacts with your website + brand, the more likely Google may surface you in that user’s search results
The more relevant you are to a user’s immediate need (i.e., location + intent), the more likely Google is to connect you
How to Leverage
For websites, increasing engagement through all available channels is the key to increasing visibility through personalized search. This means positively engaging your visitors/customers through all points of possible contact including your website, emails, app installs, and even your physical location, when applicable.
To put it simply: make sure they want to engage with you again!
Link Authority may encompass many things. While we don’t know all the ingredients in Google’s secret sauce, Link Authority is mostly thought of as the raw power of all the links pointing to a page.
The more high-quality links a page has pointed to it, the more authority it has to pass to other pages through links. For example, Wikipedia has significantly more link authority than a typical personal blog.
Trust is another aspect of authority worth considering. A spam website may have a ton of links pointing at it (through manipulation), but because Google wants to factor out spam, it can take the trust of the website into consideration.
It may do this by starting with a set of trusted seed sites (government, universities, newspapers, etc.) and measuring the number of link “hops” between sites. Sites further away from trusted seed sites are less likely to be trusted.
How to Leverage
Many types of links are valuable, but generally, you want to seek links from trusted, high authority sites.
Many SEO tool providers offer link metrics. Both Moz and Majestic do so (although limited at the free level):
Moz: Important metrics are Domain Authority and Page Authority
Majestic: Important metrics are Trust Flow and Citation Flow
If you want to watch SEOs argue, bring up social sharing as a ranking factor.
But we’re not going to argue today because this isn’t about ranking factors. This is about success factors.
On one hand, Google has consistently stated they don’t use raw social counts in their ranking algorithms. On the other hand, study after study shows a strong correlation between social sharing and higher rankings.
It doesn’t matter how it works. It’s widely agreed that broad social sharing can help a page to rank, either directly or, more likely, indirectly.
The reason is simple: regardless of algorithms, social sharing can put your content in front of more eyeballs. Marketing 101. Content with wide social sharing enjoys greater distribution in front of influencers, and social sharing aggregators can often create downstream links to the page.
How to Leverage
No magic bullets here. The key is to build your social influence and create content that encourages social sharing. It’s not a strategy everyone can master, but it’s powerful.
8. Page Structure
Similar to Layout and Design, Page Structure helps you organize your content in ways that Google can better understand and display in search results.
How to Leverage
Best practices for Page Structure include:
Organize your content into a clear hierarchy of titles, headings (h1, h2, h3, etc.), body text, lists, and tables
Use subheadings when called for
Group thematically related ideas together, i.e., within the same content block, paragraph, list, etc.
9. Content Depth
SEOs have been debating this for years, and emphatically stress that content doesn’t need to be long to rank. Case in point: consider this page from the US Government, which ranks for 1000s of searches a month. It doesn’t need extra content because it satisfies the user intent.
On the other hand, all things being equal, longer content tends to rank higher. Nearly every SEO correlation study ever performed confirms this.
Why? Longer content typically:
Has a better shot at demonstrating topical relevance
Has a better chance of containing content that satisfies the user’s query
Is likely to contain more images and video
Tends, on average, to earn more links and shares than shorter content
How to Leverage
In reality, it’s not the length of your content but the depth of your content that matters. Content that fully explores a topic, including related questions, is likely to earn more visibility than content that only lightly covers a subject.
10. Topical Authority
It’s typically hard to rank for a competitive search query with a single-page website.
Sites that do well in Google search results tend to publish regularly in specific areas of expertise over time, building content, links, and authority.
We don’t know if Google uses a specific authority metric in its algorithm, but several possible mechanisms may make this concept work:
Sites that rank well for a particular topic often find it much easier to rank new content on that same topic, even with few external links.
How to Leverage
Become an authority on a topic by covering the topic thoroughly over time, creating multiple posts/pages, and earning topically relevant links.
Organize your topic using smart URL structures, i.e., if you were writing about “link building,” place all your link building posts in the /link-building/ folder, so that Google can associate these posts together.
While Structured Data isn’t an explicit Google ranking factor, it can add content+context to your page to not only help it to rank, but also become more visible in search results.
Structured Data is the content behind your content, and provides search engines with explicit information as to what your content is about.
For example, Product Schema is extremely popular in eCommerce. With Product Schema, you can define aspects of your product such as price, photos, and availability, so that there is absolutely no confusion for search engines. Other popular schemas include:
WebPage / Website
Rating / Review
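As an illustration, here is roughly what a minimal Product markup looks like when serialized as JSON-LD. The product details are hypothetical, and real markup should be validated with Google’s structured data testing tools.

```python
import json

# Hypothetical product data; field names follow schema.org's Product type.
product_schema = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "image": "https://example.com/img/shoe.jpg",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Embed the markup in the page as a JSON-LD script block.
snippet = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(
    product_schema, indent=2
)
print(snippet)
```

JSON-LD is generally the easiest format to maintain because it lives in one script block instead of being woven through your HTML.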
The SEO agency Distilled has repeatedly shown through testing how adding various types of structured data has helped in rankings. (It doesn’t help in all cases, but often does.)
Aside from adding clarity to your content, the other way Structured Data improves your visibility is by triggering Rich Snippets in search results, which we covered earlier.
How to Leverage
First of all, if you are unfamiliar with Structured Data, how to implement it, or the types of content you should mark up, here are a couple of resources to get you started:
If you serve various versions of your website across different countries or languages, hreflang implementation is a must.
Hreflang doesn’t necessarily boost your rankings (though many report that it does exactly this in many situations) but it helps Google to show the right content to the right audience, which can have a profound impact on your search performance.
Check out these case studies, all with search visibility improvements:
Hreflang isn’t appropriate for all circumstances (e.g., if you only target web content to a single language and country), but if you want to expand across dialects and borders, it offers a great opportunity.
How to Leverage
Hreflang implementation is straightforward, but the specifics can be daunting even to the most experienced SEO. After you get everything set up, be sure to test using the Sistrix tool below and to check Google Search Console for hreflang errors.
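The core of an hreflang implementation is a reciprocal set of link tags: every language version of a page must list every other version, plus itself. A small sketch of generating that set; the URLs and language codes here are hypothetical.

```python
# Hypothetical alternate versions of one page, keyed by language-region code.
alternates = {
    "en-us": "https://example.com/page",
    "en-gb": "https://example.com/uk/page",
    "de-de": "https://example.com/de/seite",
}

def hreflang_tags(alternates, default="en-us"):
    """Build the full set of hreflang link tags for one page."""
    tags = [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in sorted(alternates.items())
    ]
    # x-default tells Google which version to show users who match no listed locale.
    tags.append(
        '<link rel="alternate" hreflang="x-default" href="%s" />' % alternates[default]
    )
    return tags

# Every language version must carry this SAME full set of tags,
# or the annotations won't validate as reciprocal.
for tag in hreflang_tags(alternates):
    print(tag)
```

Generating the tags from one data source, as above, is the easiest way to keep the reciprocity requirement from drifting out of sync.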
It sounds like a chicken and egg scenario, but it’s actually much more than that: When more people search for your website and/or brand, you tend to show up more frequently in search results.
There are several different ways searchers can search for your brand and associate it with specific keywords:
Brand + Keyword: nike running shoes
Brand + Navigation: running shoes nike.com
Many marketers have wisely observed that when the volume of these branded searches increases, the brand tends to show up even when searchers aren’t specifically looking for it.
Part of the reason, many SEOs believe, is that brand searches create an entity relationship between the brand and the keyword, i.e., “nike” is an entity associated with “running shoes,” so it tends to show up more when running shoes are searched.
The other reason is much more obvious: brand searches influence Google’s auto-suggest. The more people search for “running shoes nike,” the more likely Google will suggest that exact phrase when people search for “running shoes.”
How to Leverage
Building your brand so that searchers actually seek you out by name (and don’t simply stumble upon you in search results) is one of the strongest drivers of traffic there is. It also helps build a defensive “moat” around your business. The more people looking for your website, the less likely Google is to push your site down in the rankings.
Building up your brand is hard work, but there are clever shortcuts many savvy marketers have tried. For example:
A television commercial that encourages viewers to search Google for “Chicken by Bob” can create branded search around those terms.
An email campaign that includes a link to a specific Google search containing the brand + the keyword.
Google likes to say that Accelerated Mobile Pages (AMP) isn’t a ranking factor. And SEOs equally like to point out that AMP is indeed a visibility factor.
The primary advantages that AMP affords are:
Pages become eligible for Google News Carousels and other features
Faster load times, which can improve engagement
Mobile search results are flagged with the AMP symbol, which can increase CTR
For publishers, these advantages can add up quickly. A Google-sponsored Forrester study showed a 10% year-over-year increase in traffic and a 60% increase in pages per visit for AMP sites.
AMP, despite the advantages, isn’t for everyone. Publishers may find that the technical standards are hard to maintain, and some non-news sites may find that any traffic gains aren’t worth the trouble.
How to Leverage
Setting up AMP can be tricky for smaller sites, but available tools get better each month. Check out these resources:
Even so, the SEO benefit from HTTPS is bigger than the sum of its parts.
Part of the reason is the way browsers display HTTPS sites as secure. Currently, most browsers grace secure sites with a trusted green bar, while Chrome aggressively labels certain pages “insecure” if they don’t have HTTPS.
Going forward, browsers may do away with green bars altogether and have stated plans to issue warnings for all non-secure pages. In that case, non-HTTPS pages might soon be considered a negative ranking factor by default.
For now, it’s a good idea to implement HTTPS.
How to Leverage
Fortunately, making your site secure through HTTPS is easier than ever. Many web hosts even offer it free through services like Let’s Encrypt and Cloudflare.
20 Influential Factors
Influential success factors aren’t always required for successful SEO, but they can give you the edge over the search competition. Frequently, they make a huge difference.
User Friendly URLs
Quality of Supplemental Content
Real World Business Information
Linking to High-Quality Sources
Website & Business Reputation
Use of Semantic Content
Multiple Sources of Traffic
Accurate & Consistent Registration Info
1. User-Friendly URLs
Good URLs can make a big difference in all areas of SEO: crawling, ranking, CTR, and sharing. A few rules for URLs that SEOs have found true through testing and performance:
Shorter URLs > Longer URLs
Keywords in URLs help, but not too many
1-2 folder levels, tops
Generally avoid numbers and special characters, and keep parameters to a minimum
2. URL Parameters
Google provides a specialized tool within Search Console to deal with URL parameters. While the URL Parameter Tool is very powerful, configuring it correctly can be confusing for anyone other than expert users. For many marketers, it’s simply easier and safer to control parameters using rel=canonical.
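A canonical tag is just a single link element pointing at the clean, parameter-free URL. A minimal sketch of producing one with Python’s stdlib; the URL is hypothetical, and real sites may need to preserve some parameters (pagination, for example) rather than stripping everything.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    """Sketch: strip every query parameter and point the canonical
    at the clean URL. A simplification; keep meaningful parameters
    on real sites."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return '<link rel="canonical" href="%s" />' % clean

print(canonical_tag("https://example.com/shoes?utm_source=news&sort=price"))
```

The tag goes in the head of every parameterized variant, telling Google which single URL should collect the ranking signals.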
3. Quality of Supplemental Content
Main Content is the part of a webpage that focuses on the main goal of the page. It can be a blog, product, news article, video or other content.
When we think of “content,” we typically think of Main Content.
Supplemental Content is the extra part of a webpage that sits outside of, or is only loosely related to, the main goal of the page.
Google’s Search Quality Evaluators Guidelines spend a lot of time discussing Supplemental Content. The quality of Supplemental Content can have an impact on user experience and how Google evaluates the page.
Low-quality Supplemental Content, such as annoying ads and poor navigation, works to frustrate the user, is often unrelated to the Main Content, and can lower engagement.
Good Supplemental Content adds value for the user, encourages exploration, and improves engagement. This can include:
4. Real World Business Information
Websites associated with verifiable, real-world businesses often have a leg up in Google Search Results. There are a number of reasons for this:
5. Robots.txt
While robots.txt is often the first choice to bend bots to your will, it’s also a very blunt tool that lacks the refinement of robots meta directives, canonical tags, HTTP headers, or other methods of robots control.
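Python’s standard library ships a robots.txt parser, which is handy for sanity-checking rules before you deploy them. A small sketch with hypothetical rules; note that real search engines may resolve conflicting rules slightly differently than Python’s parser does.

```python
import urllib.robotparser

# Hypothetical robots.txt rules: block the site's internal search pages.
rules = [
    "User-agent: *",
    "Disallow: /search/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Blocked: anything under /search/
print(rp.can_fetch("Googlebot", "https://example.com/search/results"))  # False
# Allowed: everything else
print(rp.can_fetch("Googlebot", "https://example.com/products"))        # True
```

Running your planned rules through a parser like this catches the classic blunt-tool accident: a Disallow pattern that quietly blocks far more than you intended.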
6. Link Velocity
A common scenario: You build and launch your excellent content, earn a bunch of links and sail into rankings and traffic glory. Then, over time, your rankings start to slide. Eventually, even though you have far more links than your competitors, your content is relegated to the 3rd page of Google.
Link Velocity is often thought of as a Freshness Factor. The concept is that the rate at which sites link to you can indicate how fresh and relevant your content is.
If your link velocity slows down or stops altogether, your content may no longer be fresh or worthy of ranking. If your link velocity speeds up over time, it may indicate your content deserves to be pushed up in search results.
7. Sitemaps
While sitemaps probably aren’t an actual ranking factor, using them effectively can play a role in SEO success, especially for larger sites.
Sitemaps can help Google find and prioritize content on your site, in ways that may help them to do it faster or more efficiently than they otherwise would.
Aside from cataloging the important pages on your site, sitemaps can also highlight specialized content, including:
Alternate language pages (hreflang)
While most people think of XML sitemaps, HTML sitemaps can add additional value. For a cool example, check out the New York Times HTML sitemap.
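An XML sitemap is simple enough to sketch with the standard library. The URLs and dates below are hypothetical; real sitemaps follow the sitemaps.org protocol and are usually generated by your CMS or a plugin.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

# The sitemaps.org namespace required on the root element.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = Element("urlset", xmlns=NS)

# Hypothetical pages with their last-modified dates.
for loc, lastmod in [
    ("https://example.com/", "2021-01-15"),
    ("https://example.com/blog/seo-guide", "2021-02-02"),
]:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "lastmod").text = lastmod  # can help Google prioritize recrawls

xml = tostring(urlset, encoding="unicode")
print(xml)
```

Once generated, the file is typically served at /sitemap.xml and submitted through Google Search Console.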
8. Linking Out to High-Quality Sources
In the past, Google has both indicated that external links are a ranking factor…
“In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.” – Matt Cutts
While at other times, Google implies something more opaque:
“Our point of view, external links to other sites, so links from your site to other people’s sites isn’t specifically a ranking factor. But it can bring value to your content and that in turn can be relevant for us in search. And whether or not they are not followed, doesn’t really matter.” – John Mueller
Sometimes, SEOs are scared of linking out, fearing they’ll transfer authority elsewhere. The truth is, Google wants to reward sites that offer good experiences.
9. Site Architecture
Site Architecture in SEO refers to the organization of your website and navigation, and how the pages are linked together. An example would be “Homepage > Categories > Products,” along with how all of these different elements link to one another.
A clear Site Architecture not only helps Google with crawling, but also helps distribute topical link authority while helping users with navigation. All of these elements are important.
Cross-linking to related categories/products/pages
Flat architecture: No more than three clicks to the deepest level
10. Pagination
Pagination is another aspect that assists Google with crawling and indexing your site, though it is not itself a direct ranking factor.
When you have multiple pages that run in sequence—such as category pages that list products or blog posts—proper pagination setup can help search engines to find all the relevant content, and signal that these pages are related to one another.
Pagination is especially important when the sequential page system isn’t clear, like a page with infinite scroll.
11. Domain Age
On one hand, Google has clearly stated that Domain Age isn’t specifically a ranking factor.
On the other hand, Google has also stated that they may look at the date when they first crawled a website, or the age of links pointing to a site.
SEOs have often noted the difficulty in getting a newer site to rank for competitive terms, and called this period the “Sandbox.”
So while there’s nothing specifically stopping a new site from ranking, it’s typically easier for older sites to rank because they’ve had longer to build up link and authority signals. In fact, almost all SEO correlation studies find a relationship between older domains and higher rankings.
12. Website & Business Reputation
In Local SEO, we know that citations and reviews make up two huge pieces of the ranking puzzle. It’s important for a business to get high-quality citations, and it’s important to earn authentic positive reviews to rank higher.
The concept of reputation extends to regular search results as well. Google’s Quality Rater Guidelines consistently instruct raters to evaluate sites on perceived reputation, and we know Google has incorporated customer-experience analysis into its algorithm.
For any online business, real-world customer signals can have a positive or negative effect on rankings. This is especially relevant now that Google can measure, through phones, what stores people visit and how long they stay.
Earning quality citations, positive reviews, and delivering excellent real world and online experiences can make a difference to your online visibility.
13. Use of Semantic Content
Previously, we discussed the use of relevant keywords as an important ranking factor. But what about the rest of the content, i.e., the words, phrases, and topics Google expects to find within a specific piece of text?
SEOs debate, test, and argue about specific methods for determining what Google may use to score topic “aboutness,” or relevance. These include TF-IDF, LDA, Proof Keywords, Co-Occurrence, Entity Salience and other concepts.
These concepts, while different, all predict that specific keywords and topics are more commonly found together than other keywords/topics.
For example, if your search topic were “apple watch,” you’d expect to find certain keywords and phrases more often than others, such as:
… and more
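To illustrate the general idea (not Google’s actual method), here’s a toy co-occurrence sketch: it measures how often candidate terms appear across a set of documents about a topic. The mini-documents and terms are made up:

```python
from collections import Counter

def cooccurrence_scores(topic_docs, terms):
    """Fraction of topic documents each candidate term appears in."""
    counts = Counter()
    for doc in topic_docs:
        words = set(doc.lower().split())
        for term in terms:
            if term in words:
                counts[term] += 1
    return {t: counts[t] / len(topic_docs) for t in terms}

docs = [
    "apple watch band and charger review",
    "best apple watch apps for fitness",
    "apple watch battery and charger tips",
]
scores = cooccurrence_scores(docs, ["charger", "fitness", "banana"])
```

Terms that score high across a topic corpus (like “charger” here) are the kind of semantically related vocabulary readers, and search engines, expect to see; terms that score zero (like “banana”) are off-topic.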
14. Domain Extension
When people first get into SEO, they often ask if they should use a .com or .org (or .marketing!) for their domain name. The general answer is that you should choose a domain extension best suited to your business and branding, but it’s a little more nuanced than that.
Google tends to treat all generic Top Level Domains (gTLDs) equally. Examples of gTLDs include:
.london, .tokyo, .vegas – Region-specific extensions that are treated as gTLDs
In the end, most evidence indicates that the domain extension you use doesn’t really matter, with one huge exception: use of Country Code Top-Level Domains (ccTLDs).
ccTLDs—such as .hk, .au, .fr, and .in—are used by Google to geo-target your website to a specific region. This means that by using a ccTLD, you may influence your site’s ability to show up in search on a region-by-region basis.
Typically, if your website serves users in multiple countries, it’s better to use a gTLD and specify language/region variations using hreflang.
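For illustration, here’s a minimal sketch that generates hreflang annotations for a set of language/region variants (the URLs and helper function are hypothetical; in practice these tags go in each page’s head section or in your sitemap):

```python
def hreflang_tags(variants):
    """Build <link rel="alternate" hreflang="..."> tags for language/region variants."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants.items()
    ]

variants = {
    "en-us": "https://example.com/us/",
    "fr-fr": "https://example.com/fr/",
    "x-default": "https://example.com/",   # fallback for unmatched locales
}
tags = hreflang_tags(variants)
```

Each variant page should carry the full set of tags, including a self-reference, so the annotations are reciprocal.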
15. Server Location
Along with international targeting signals such as hreflang and ccTLDs, the location of your server may influence your search visibility within a specific region.
Using a server close to your visitors has two distinct advantages:
While information over the internet can travel close to the speed of light, there are often multiple relays, switches, and cables the signal must pass through, especially over long distances. Having a server nearby can dramatically improve page load speed, which can benefit your SEO all around.
Google may use your server’s location as a signal for geo-targeting.
Today, use of Content Delivery Networks (CDNs) and cloud infrastructure can dampen these effects by delivering content to locations all over the world. That said, if you serve users in a specific area, it can be helpful to have your server physically located nearby.
16. Web Traffic
Does Google use other sources of web traffic as a ranking signal?
Nobody knows, and it doesn’t matter.
We know that Google watches and records what websites people visit and bookmark through the Chrome browser. It’s not a stretch to believe that they could use this information in their ranking algorithms.
In fact, there have been many experiments and much anecdotal evidence of sites seeing a boost in Google rankings after increasing site traffic through other means, such as Facebook ads, TV commercials, and viral Reddit posts.
Regardless of whether it’s an actual ranking factor, an increase in both direct and referral traffic helps your visibility in multiple ways. More eyeballs on your content can lead to more links and shares, which directly helps your SEO.
Furthermore, diversifying your traffic beyond Google makes you more resilient to the ups and downs of Google’s algorithm.
17. Readability
SEOs often focus on Reading Level as a possible ranking factor, but that misses the point.
Readability of your content is far more important.
Google is good at parsing words and sentences, and we know that low-quality or gibberish text tends to get demoted in search results. Conversely, content that is too dense or advanced may not appeal to a wide portion of the web.
Your content should match your audience. Legal and medical journals should use appropriate jargon and syntax. Speaking to your audience is your best bet for matching user intent.
Most of the popular web is both scannable and written at about a 7th-to-8th-grade level. In fact, this is the type of text that tends to earn the most featured snippets. In many cases, it’s also the type of content that earns links and shares.
Finally, readability better engages your audience and simply helps your content perform better.
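If you want a rough sense of grade level, the Flesch-Kincaid formula is a common approximation: 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59. The sketch below uses a naive vowel-group syllable counter, so treat the output as an estimate only:

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    """Flesch-Kincaid Grade Level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

grade = fk_grade("Keep your writing simple. Short sentences help readers scan.")
```

A score in the 7–8 range matches the “popular web” target mentioned above; dedicated libraries (or word-processor readability checkers) give more accurate counts than this naive syllable heuristic.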
18. Factual Accuracy
There’s plenty of evidence that Google wants to rank sites that are factually correct over websites that play fast and loose with the truth.
First, there’s a Google research paper that defines Knowledge-Based Trust, a method of scoring websites on the correctness of their factual information rather than on external factors like links.
Then, in 2017, Google updated its Quality Rater Guidelines to include instructions for rating pages with “demonstrably inaccurate content” as Lowest Quality. Generally, when Google includes something in its Search Quality Rater Guidelines, it suggests they are trying to solve for it algorithmically.
Finally, Googlers have recently made several statements indicating their desire to fact check web pages.
So not only can it pay to have factually accurate content, it can potentially hurt to have factually inaccurate content.
19. Author Reputation
We’re reaching the end of the list of Influential Ranking Factors, and here you find Author Reputation. Why is it near the end? To be honest, nobody knows how big of a deal it is.
The idea that Google wants to know who authored a page, and use that information for ranking websites, has been around a long time. In fact, most of this information comes from Google itself.
A Google patent titled “Agent Rank” describes a system for scoring authors not only on reputation (based on how often the author’s work is cited by others) but on subject expertise as well. So if you become an author-authority on “car parts,” Google may rank your content on car parts higher. Most people today refer to this as Author Rank.
Google executives and engineers have frequently been quoted discussing scoring content based on authorship.
Authorship photos were supposed to be a step in this direction. Sadly, the photos were discontinued, but many believe Author Rank lived on.
Finally, Google’s Search Quality Raters guidelines state that High-Quality Content makes clear “Who (what individual, company, business, foundation, etc.) created the content on the page you are evaluating.”
In the real world, it’s difficult to observe the impact of any Author Rank, but it could be significant.
In any case, it’s best practice to make it clear who created your content.
If it’s an individual author, you can include an author byline and author page on the site. The author can also build up expertise by writing authoritative content around the web, and linking author profiles.
If content is authored by a company or group, make sure your About and Contact information is clear and up to date, so it’s obvious who is responsible for your content.
To be honest, this factor should be listed in the “Could be Important” category.
Many SEOs over the years have speculated about the impact of private WHOIS domain registration. There are, in fact, a lot of reasons to speculate that Google monitors domain ownership and might care about WHOIS information.
For example, this Google patent: “… illegitimate domains may be identified. For instance, search engine 125 may monitor whether physically correct address information exists over a period of time, whether contact information for the domain changes relatively often, whether there is a relatively high number of changes between different name servers and hosting companies”
Older statements from Matt Cutts: “…having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”
An older Google patent suggests that registering your domain for a longer period could help distinguish spam sites from legitimate sites. The logic behind the patent is that spam sites are rarely registered for more than a year at a time.
While Domain Expiration Length could legitimately be a ranking factor, in reality any effect is very small. Moz’s 2015 Ranking Factors study noted a correlation of 0.04 between expiration length and rankings, which isn’t considered significant.
On the flip side, too many ads—of any kind, AdSense or not—can hurt your rankings. It’s not unusual to see a slight negative correlation between rankings and the amount of ad space. In 2012, Google announced its Top Heavy algorithm, which demoted sites with too many ads above the fold, and followed with its Intrusive Interstitial update in 2016.
4. Keyword Density
This is a tough one to include in “Not Important” factors because yes, you typically need to use relevant keywords, phrases, and topics in your content to rank.
That said, focusing on keyword density—the idea that your content should contain a minimum threshold percentage of exact-match keywords—doesn’t typically lead to optimal results. It’s not that keyword usage is unimportant, but rather that there are far better ways to optimize.
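If you’re curious anyway, keyword density is trivial to compute, which hints at why it’s such a weak lever. A quick sketch (the sample text and phrase are made up):

```python
import re

def keyword_density(text, phrase):
    """Percentage of words accounted for by exact-match occurrences of the phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        words[i:i + n] == phrase_words          # exact-match window comparison
        for i in range(len(words) - n + 1)
    )
    return 100.0 * hits * n / len(words)

density = keyword_density(
    "Apple watch bands: find the best apple watch band for you.", "apple watch"
)
```

Notice the metric says nothing about whether the page actually satisfies the query; that’s why chasing a density threshold misses the point.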
5. W3C Validation
What’s much, MUCH more important is whether Google and web browsers can properly render the page. If you want to focus on what matters, run your URLs through Google’s URL Inspection tool (formerly Fetch and Render).
And if you want to be concerned about W3C standards, it’s far more helpful to users to focus on accessibility standards.
6. Multiple H1 Tags
Like W3C validation, SEOs often get obsessed with proper heading structure. One, and only one, H1 heading tag per page is the rule. And it is not to be broken.
Except it doesn’t matter.
What’s more important is having clear, large text near the beginning of the page that serves as a headline. Moz discovered this through experimentation, and it holds true today.
7. SEO Site Submission
If you want to spot cheap, typically worthless SEO services, see if they offer anything resembling “Search Engine Submission Services.”
This was a popular service back in the day for telling different search engines about your website, but it is worthless today.
Certainly, if you have a brand new site, it’s helpful to let Google and other search engines know, but there are far better ways of doing it.
For all types of search listings—including websites, businesses, digital content, and more—Google offers a variety of ways to get included: Get your content on Google
8. Dedicated IP
Another old SEO myth that was likely propagated by web hosting companies looking to upsell services.
This was more of a problem in the past, when Google might find a web server full of spam websites; any other site sharing the same IP could look suspicious. If you are concerned that this is your situation (it typically isn’t), the simple solution is to find a more reputable hosting provider.
While there were historical advantages to hosting your site with a unique IP in some edge cases, today many hosting companies don’t even offer it as an option.
9. .edu and .gov Backlinks
Let’s be perfectly clear: links from .edu and .gov domains can be fantastic—especially when they are targeted, relevant, and high-authority. For years, SEOs have pursued .edu and .gov backlinks as if they were the Holy Grail.
The myth is that .edu and .gov links automatically carry more weight, or that, because they sit closer to trusted “seed” sites, they pass more trust authority.
Sadly, neither the evidence nor the data supports these claims.
Yes, .edu and .gov links carry value, and you can build an entirely successful backlink strategy on them. But Google has frequently denied that there is anything inherently special about them.
The data supports this as well. Correlation studies using data from Moz, Ahrefs, and SearchMetrics all show that while the total number of .edu and .gov backlinks does correlate with higher rankings, it correlates more weakly than the total number of all backlinks.
Both .edu and .gov links can be bad, as well. In fact, many webmasters report receiving “Unnatural Link” warnings from Google after abusing scholarship link building practices.
Put another way: .edu and .gov links can help you to rank, but there’s absolutely nothing special about them.
Build ‘em if you can, but no need to fret about it.
25 Negative SEO Factors 2021
Not all success factors are positive. The ones listed below possess the power to damage your visibility. Proceed with caution.
Meta Noindex Errors
Hidden Text or Links
Piracy / DMCA Notices
Rich Snippet Spam
Blocking Important JS / CSS Files
Overly Long / Complex URLs
Linking to Bad Neighborhoods
Aggressive Ads / Intrusive Interstitials
Over-Optimized Anchor Text
High Quantity of Crawl Errors
False / Misleading / Offensive
Porn / Explicit Content
The whole point of Google is to deliver the opposite of spam, so producing spam isn’t going to help you rank.
Spam takes many different forms.
Content that adds little value, uniqueness, or substance qualifies as thin content.
Especially damning to Google are thin affiliate sites, which exist solely to funnel visitors to affiliate offers while adding little extra value.
4. Non-unique Content
While Google doesn’t penalize duplicate content, content that is non-unique can get filtered from search results.
Aside from following the advice earlier in this guide on duplicate content, it’s best practice to include anywhere from 2-3 sentences to a few hundred words of unique content on each page to give it a chance of ranking.
5. Cloaking
Cloaking is generally defined as the practice of showing certain content to users while showing different content to search engines, typically for sneaky reasons.
Don’t do this.
Sometimes cloaking is fine if done for the right reason, but these are usually edge cases.
Whether you run a pirate site or not, the number of valid DMCA copyright removal notices your site receives can lower your search rankings.
A single notice or two likely won’t hurt much, but a large number of such removal requests will likely hurt.
12. Rich Snippet Spam
Rich snippets are awesome when you want to make your search results stand out. Which is probably the reason so many people try to earn them with false information such as fake reviews and falsified event markup.
In almost all SEO correlation studies, total URL length correlates with lower rankings. The same is true of the number of digits and special characters in the URL, e.g., https://example.com/887600o!jshfj#jklsing0098019-874
The converse is also true: shorter, cleaner URLs tend to rank slightly better.
Correlation is not causation, but this could be caused by:
Deep folder structures, often far removed from high authority pages
People being less likely to copy and share long, complex URLs
Whatever the reason, it often pays to keep your URLs clean and tidy.
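Those guidelines can be turned into a simple URL lint. The thresholds below are arbitrary illustrations, not rules Google has published:

```python
from urllib.parse import urlparse

def url_warnings(url, max_length=75, max_depth=4):
    """Flag common URL-hygiene issues; thresholds are illustrative only."""
    path = urlparse(url).path
    warnings = []
    if len(url) > max_length:
        warnings.append("long URL")
    if len([seg for seg in path.split("/") if seg]) > max_depth:
        warnings.append("deep folder structure")
    if any(ch in path for ch in "!%,$"):
        warnings.append("special characters")
    if sum(ch.isdigit() for ch in path) > 8:
        warnings.append("many digits")
    return warnings

clean = url_warnings("https://example.com/blog/seo-tips")
messy = url_warnings("https://example.com/887600o!jshfj/a/b/c/d/e/jklsing0098019-874")
```

Running this across a sitemap export is a quick way to surface the worst offenders before a URL cleanup project.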
15. Linking to Bad Neighborhoods
Just like Google’s system rewards sites that link out to high-quality sources, the system also “trusts sites less when they link to spammy sites or bad neighborhoods” (source).
A bad neighborhood is one filled with spam sites, or sites that have been penalized. Gambling, shady pharmaceutical sites, and porn are often targets. Linking to these sites aggressively can put a serious dent in your search traffic.
16. Slow Speed
We’ve covered the multiple outsized effects of making your site fast, but the opposite is also true. Slow sites can be degraded in search results.
17. Aggressive Ads / Intrusive Interstitials
Feel free to use ads, but don’t be pushy about it.
18. Over-Optimized Anchor Text
Anchor text over-optimization is too much of a good thing.
Yes, it helps to have keywords in the anchor text of the links pointing to your site. Even exact match anchor text.
But SEOs long ago observed that when your anchor text becomes “too” optimized, rankings begin to fall. It’s as if Google can tell that your link profile doesn’t look natural, so they demote you in search results.
The best advice is to have a diverse mix of anchor text in the links pointing at your site, and even to avoid dictating anchor text when asking for links.
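One way to sanity-check your link profile is to look at the distribution of anchor texts: if a single exact-match phrase dominates, that’s the over-optimized pattern. A small sketch with hypothetical anchors:

```python
from collections import Counter

def anchor_distribution(anchors):
    """Percentage share of each anchor text across all backlinks."""
    counts = Counter(a.lower().strip() for a in anchors)
    total = len(anchors)
    return {anchor: 100.0 * n / total for anchor, n in counts.most_common()}

# Hypothetical backlink anchors pulled from a link-research tool export
anchors = ["Example Co", "example.com", "buy blue widgets", "buy blue widgets",
           "click here", "buy blue widgets", "Example Co", "https://example.com"]
dist = anchor_distribution(anchors)
top_anchor, top_share = next(iter(dist.items()))   # most common anchor and its share
```

Healthy profiles are usually dominated by branded and naked-URL anchors; a commercial phrase like “buy blue widgets” leading the distribution is the warning sign.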
19. High Quantity of Crawl Errors
A large number of crawl errors can be a sign of trouble. When 404s result from broken links, either internal or external, they can disrupt the flow of link equity, create frustrating user experiences, and depress rankings below their real potential.
Not every error needs to be addressed, but a rash of bad crawl errors can spell trouble.
20. False / Misleading / Offensive
To crack down on Fake News, Google recently updated its algorithm to demote content that promotes “misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories” (source).
Google uses its army of Search Quality Raters to flag false and misleading information, and uses that data to train its machine learning algorithms to detect fake, misleading, or offensive content.
21. Porn / Explicit Content
This one’s pretty obvious, but if Google determines your site contains porn or explicit adult content, it will be mostly hidden in search results except for a very narrow range of search queries, and completely hidden with Safe Search turned on.
22. Redirect Chains
301 redirects are awesome! They get people and search robots where they need to go, and they even pass PageRank.
Seventeen 301 redirects chained together, with a 302 at the end, is not awesome.
In general, Google is known to follow only five redirects before giving up (your mileage may vary). Long redirect chains are also prone to break easily, leak PageRank, and generally depress rankings. For best results, shorten redirects to the fewest number of hops possible.
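That five-hop behavior can be modeled with a resolver that follows a redirect map and gives up past a hop limit. The limit and URLs below are illustrative:

```python
def resolve(url, redirects, max_hops=5):
    """Follow redirects until a final URL, a loop, or the hop limit."""
    seen = {url}
    hops = 0
    while url in redirects:
        if hops >= max_hops:
            return None          # crawler gives up on long chains
        url = redirects[url]
        hops += 1
        if url in seen:
            return None          # redirect loop
        seen.add(url)
    return url

chain = {"/old": "/old2", "/old2": "/old3", "/old3": "/new"}
final = resolve("/old", chain)          # 3 hops -> "/new"
long_chain = {f"/r{i}": f"/r{i+1}" for i in range(10)}
gave_up = resolve("/r0", long_chain)    # more than 5 hops -> None
```

The fix implied by the model is simple: point every legacy URL directly at its final destination so each chain collapses to a single hop.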
23. UGC Spam
It’s terrific when you produce great content, but not so much if you let users submit spam to your website.
User-generated spam can include spam comments, forum postings, and accounts on free hosts. The general rule is, if you host it on your site, you’re responsible for it. If you let your users run astray, it can hurt your rankings.
24. Bad Domain History
Even if your site is squeaky clean (it is, right?), you may still suffer in search results if you have a long history of spam or penalties. Once everything is cleaned up, it can take time for Google to recrawl your content and reassess your site’s value.
SEOs often report that this process can take many months or more.
If your site has a bad domain history, it’s typically best to do a link audit and disavow, resolve any penalties, create new content, submit sitemaps, fix technical issues, and build new links.
If any of these negative factors impact you, do the hard work of cleaning up; your rankings can return, hopefully sooner rather than later.
We offer professional search engine optimization (SEO) services. Contact us to learn how we can increase your online visibility!