SEO For SMEs: Why Can't You Find Us On Google?

Wondering why your website can't be found on Google? There are many possible causes. This article covers some of the most common, along with tips on how to solve or avoid each problem.

Before troubleshooting, make sure that you have covered the following basics. This may not solve the problem, but it makes it easier to narrow down.

SEO For SMEs Checklist Audit

Advanced users can add the following steps:

  • Get access to the cPanel or backend of your website and look at your log files.
  • If you already have access, make sure that page views by users and bots appear in your log files.

Once these points are ticked off, we can move on to the possible reasons for your lack of visibility on Google.

1. WordPress prevents search engines from indexing

If the box next to “Discourage search engines from indexing this site” is checked in WordPress (under Settings → Reading), the CMS instructs search engines such as Google and Bing not to index your website – via its virtual robots.txt file or, in newer WordPress versions, a “noindex” meta tag. This setting is often switched on during development and then forgotten at launch.

Even if the website has been online for days, weeks or years, this is the first thing I check when a site owner complains about a lack of visibility on Google.

2. Robots.txt blocks search engines

If you see the following code in your robots.txt, you are signaling search engines not to crawl your site.

User-agent: *
Disallow: /

You can see the contents of your Robots.txt file if you enter your domain followed by /robots.txt in the address bar of your browser.
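If you'd rather check this programmatically, Python's standard library can parse a robots.txt for you. Here is a minimal sketch; the rules and the example.com URL are placeholders:

```python
# Sketch: check whether a robots.txt would block all crawlers.
from urllib.robotparser import RobotFileParser

def is_fully_blocked(robots_txt: str, url: str = "https://example.com/") -> bool:
    """Return True if the given robots.txt blocks generic crawlers from `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("*", url)

blocking_rules = "User-agent: *\nDisallow: /"
open_rules = "User-agent: *\nDisallow:"

print(is_fully_blocked(blocking_rules))  # True: everything is disallowed
print(is_fully_blocked(open_rules))      # False: an empty Disallow allows everything
```

In a real audit you would download your live robots.txt first and feed its contents into the same function.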

How does this happen? There are several possibilities:

  • The WordPress setting mentioned above keeps search engines from indexing.
  • The robots.txt file was created during an earlier phase of the website's development, and the disallow rule survived the go-live.
  • Errors creep in due to misunderstandings about how the robots.txt file works.

For example, I often see a line like this in Robots.txt files:

User-agent: *
Disallow:

It means that you want your site to be crawled by all search engines. So far, so good – but with just a single additional character, it becomes an instruction to all search engines to stay away. I usually recommend leaving this line out of robots.txt entirely, because I worry that someone will come along later, add a slash, and make the site disappear from Google and Bing.


3. The meta tag “Noindex” is set

Like points 1 and 2, the “noindex” meta tag is a surefire way to exclude a page from Google's search results. It can be added to the page header by a WordPress plugin or manually:

<meta name="robots" content="noindex"/>

Check for this problem as follows:

  • Open the website in the browser of your choice
  • Right-click on the page and select “View Page Source” or similar.
  • Search for “robots”
  • Search for “noindex”
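If you want to automate this check across many pages, a small script can do the scanning. This is a rough sketch using only Python's standard library; in practice you would fetch the live HTML of each page first:

```python
# Sketch: scan an HTML document for a robots meta tag containing "noindex".
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.noindex

page = '<html><head><meta name="robots" content="noindex"/></head><body></body></html>'
print(has_noindex(page))  # True
```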

4. Problems with technical changes or website migration

Have you recently moved your website to a new CMS, changed the theme or hosting provider, switched from HTTP to HTTPS, or carried out another major migration? Have you changed many URLs and set up redirects – or forgotten to? Such changes affect how your website looks to search engines and can confuse the robots.

Here you will find checklists for migrations from HTTP to HTTPS as well as website migrations that will help you think about everything and avoid mistakes. They are also suitable for troubleshooting afterwards.

If you have completed a migration or technical change, it is quite possible that you have overlooked one or the other point.

5. JavaScript & Mega Menus

These are two different problems, but they often occur together. If all of your navigation is programmed in JavaScript, search engines may find it difficult to follow the links, so parts of your website may not be indexed.

If the entire website relies on JavaScript, the problems can be even greater.

Mega menus undoubtedly have their advantages, but they can also cause problems and should not be missing from this list. If crawlers don't seem to be getting through your website properly, check whether sprawling menu code is the reason.

6. Redirects

One of the most common sources of problems after a website migration is redirects. Missing redirects and redirect chains are particularly common. If you move to a new domain or assign new URLs to your pages without creating redirects, crawlers will not find the new addresses.

If you're wondering what a redirect chain is – that's simple: it's the name for several 301 or 302 redirects strung together in a row.
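To make the idea concrete, here is a small sketch that follows a hypothetical map of redirects and reports the full hop sequence. In practice you would build such a map from crawl data or server logs:

```python
# Sketch: follow a map of redirects to spot chains and loops.
# The `hops` mapping is hypothetical example data.
def redirect_chain(redirects: dict, url: str, limit: int = 10) -> list:
    """Return the full hop sequence starting at `url`."""
    chain = [url]
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        if url in chain:          # loop detected, stop here
            chain.append(url)
            break
        chain.append(url)
    return chain

hops = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
}
print(redirect_chain(hops, "http://example.com/old"))
# ['http://example.com/old', 'https://example.com/old', 'https://example.com/new']
```

Any chain longer than two entries means a visitor (or crawler) needed more than one hop to reach the final URL – a candidate for consolidation into a single redirect.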

Here is a discussion of comments John Mueller of Google made in late 2018 on redirects, PageRank, and reasons to avoid redirect chains.

7. Redirects with and without WWW

Although this strictly falls under point 6, it deserves a point of its own. I once had a SaaS provider as a customer. The platform naturally had a login page. Some customers used white-label variants of the service, and a long series of checks in the source code ensured that the right interface was served with every request. Cookies, referrer information and other criteria helped.

A normal user never saw that this page often went through a series of redirects. These included status codes 404, 302, 301 and more – in short, it was a mess. The product team struggled with the fact that, despite all their efforts, the login page was not included in the Google index.

As you can imagine, the reason was that Google had no idea what the valid URL for this login page looked like.

What happens to the versions of your homepage with and without www, each under HTTP and HTTPS? Do they all resolve to one address, or does the user bounce around like a tennis ball during a rally?

If you don't know, I recommend testing it quickly and easily: enter your domain, select “Canonical domain check” and click “Check Status”.

Ideally, you will see that three of the four addresses on your domain redirect to the fourth.
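The four variants to test can be generated mechanically. A minimal sketch (example.com is a placeholder; each URL would then be requested, e.g. with curl -I, to see where it redirects):

```python
# Sketch: build the four homepage variants that should all resolve
# to one canonical address.
def homepage_variants(domain: str) -> list:
    return [
        f"http://{domain}/",
        f"http://www.{domain}/",
        f"https://{domain}/",
        f"https://www.{domain}/",
    ]

for url in homepage_variants("example.com"):
    print(url)  # request each of these and note where it redirects
```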

8. Deep Content

Deep content – content that is many clicks away from the home page – is usually measured in “clicks from the homepage” or “click depth”. It is most often found in large online shops or blogs, and pagination is one common cause.

The Semrush Audit offers a convenient way to determine the click depth of your website and the content concerned. The tool tells you which pages are five, six, seven and sometimes even more than ten clicks away from the home page.
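Under the hood, click depth is just a breadth-first search over your internal link graph. Here is a minimal sketch with a hypothetical mini-site; tools like the Semrush Audit build this graph for you by crawling:

```python
# Sketch: compute click depth with a breadth-first search over an
# internal link graph. The `site` graph is hypothetical example data.
from collections import deque

def click_depths(links: dict, home: str) -> dict:
    """Map each reachable page to its minimum number of clicks from `home`."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

site = {
    "/": ["/blog/", "/shop/"],
    "/blog/": ["/blog/page/2/"],
    "/blog/page/2/": ["/blog/page/3/"],
    "/blog/page/3/": ["/blog/old-post/"],
}
print(click_depths(site, "/"))  # "/blog/old-post/" ends up 4 clicks deep
```

Note how a pure pagination chain pushes the oldest post deeper with every new page – exactly the pattern described above.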



Google Search Console


As mentioned, pagination often creates deep content; the pagination of this SEO blog is an example.


In the SEO blog, however, the content can be reached in various ways – tags, author pages, internal links in blog articles and more. On many websites these other internal linking paths are not consistently implemented, and the site relies on just the home page and perhaps a few category pages to link thousands or even hundreds of thousands of products and pages. So if much of your content doesn't show up on Google or Bing and sits five, six or more clicks away from the homepage, you should conduct an audit and revise your internal linking.

9. Page load time

I have never seen loading time make an entire website disappear from Google or Bing, but I often see certain pages or page types affected. And the fact that I've never seen a whole site's visibility suffer from long loading times doesn't mean it can't happen.

If your server does not respond to search engine robot queries, or only slowly, this can lower your rankings. Perhaps more importantly, slow pages generate fewer conversions and sales.

There are many tools available to test the loading time of your website, including a report in Google Search Console, Google Analytics, and Lighthouse in Chrome.


10. Facet Search

If you haven't found the problem yet and deep or thin content doesn't seem to be the answer, it may be because your online store uses faceted search. Faceted search doesn't only cause problems on e-commerce sites, but that is where they are most common.

What is faceted search and why might search engines have problems with it? Faceted search lets users narrow results according to several criteria, or facets – for clothing, say, size, color and brand. For robots, this can result in dead ends and duplicate content, while the many possible combinations put a heavy strain on the crawl budget. And these are just examples of the negative effects of a poorly implemented faceted search.

In fact, faceted search can be extremely useful for users and crawlers alike. But here are some of the issues that can affect crawling and ranking:

  • They use URL parameters that appear in different orders, making the same content look like different pages to Google. The following three URLs could all serve the same result list, for example:
    • /mens/?type=shoes&color=red&size=10
    • /mens/?color=red&size=10&type=shoes
    • /mens/?type=shoes&size=10&color=red
  • Multiple paths, i.e. page navigation vs. search
    • If you click through to a mountable, Smart TV-compatible television set at Best Buy, you end up at a URL containing a string like pcmcat220700050011_categoryid%24abcat0101001&type=page&usc=All%20Categories – and that URL is very well placed on Google.
  • Limited crawl depth – limiting the options of a faceted search can also limit your chances of ranking and prevent you from showing all your strengths. Example:
    • Grainger's faceted navigation leads to a filtered URL containing %7CHigh+Speed+Steel&filters=attrs, but if you search for “High Speed Steel Counterbores with Built-In Pilots” on Google, you will find a similar but not identical page at -port-tools/counterbores-with-built-in-pilots.
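One way to tame the URL-parameter problem from the first bullet is to sort query parameters into a single canonical order. Here is a sketch using Python's standard library, with the same illustrative URLs as above:

```python
# Sketch: sort query parameters so that equivalent faceted URLs
# collapse to one canonical form. Paths and parameters are illustrative.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def canonicalize(url: str) -> str:
    parts = urlparse(url)
    params = sorted(parse_qsl(parts.query))
    return urlunparse(parts._replace(query=urlencode(params)))

urls = [
    "/mens/?type=shoes&color=red&size=10",
    "/mens/?color=red&size=10&type=shoes",
    "/mens/?type=shoes&size=10&color=red",
]
print({canonicalize(u) for u in urls})  # all three collapse to one URL
```

The same canonical URL is then a natural candidate for the canonical tag listed below.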

Although every case is different, the solutions often include:

  • Canonical tags
  • Disallow via meta tags
  • Disallow via robots.txt
  • Nofollow attributes on internal links
  • JavaScript and other methods of hiding links

11. Your website was penalized by Google

In a way, this is the opposite of problem no. 12 below. In this case, there are not too few links to your pages but too many of the wrong kind – toxic links. Your site could also have been hacked and removed from the index for that reason. More likely, however, your website simply has too many toxic backlinks.

So how do you find out whether Google has penalized you? Google (or Bing) should let you know. But if you are not sure, you can find information about your website here:

  • Google Search Console
  • Bing Webmaster Tools
  • The email inbox you use for Google Search Console – search for “google” and “penalty” to check whether you missed a notification.

12. Your site has no backlinks

If your website is new, you probably don't have any or very few backlinks. Building a solid backlink profile takes time. Use the Google Search Console to find out how many incoming links your website has.

Also read: How to Build SEO Backlinks

Even without backlinks, your site will be crawled by Google and Bing, but you are unlikely to be found for competitive keywords. So if your website is technically sound and has findable, optimized content but no backlinks, this certainly plays a role in its lack of visibility.

I have often seen – and achieved – good rankings for websites without backlinks. But this is not possible for every website in every industry.

13. Your website lacks content (and context)

When visual content and textual information compete against each other, texts often lose out. As a result, many websites are visually appealing, but leave little space for content and context.  

Imagine an art gallery where the walls and floors are white to make the art pieces stand out. Next to each piece is a small placard with the name of the artist and the title of the work. If your website is structured similarly, context is missing – and search engines need context to understand the art, the artist, and why the work was included in the exhibition.

Often the entire content of a page sits in an image accompanied by little text, and sometimes what the company offers is never really explained.

Test your website or individual pages by copying only the main text and showing it to someone. Ask the person what the site is about and whether anything needs clarification, and have them ask questions about your company and its offers based on the content alone. If your pages provide little content and context, the test person's answers will probably make that clear.

If you run a fish restaurant but never mention fish or your dishes, neither users nor search engines will know what your page is about. If your product descriptions are identical to those on every other site, and none of them contains more than 30 words, search engines have no way to distinguish your page from competitors' or to get an accurate picture of your offer.
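A quick way to spot such thin or copied descriptions is to compare word counts and duplicates across your pages. A sketch with hypothetical data; in practice you would export the descriptions from your shop or CMS:

```python
# Sketch: flag product descriptions that are too short or reused verbatim.
# The `pages` dict is hypothetical example data.
from collections import Counter

def thin_or_duplicate(descriptions: dict, min_words: int = 30) -> dict:
    """Return pages whose description has fewer than `min_words` words
    or appears word-for-word on another page."""
    counts = Counter(descriptions.values())
    return {
        url: text
        for url, text in descriptions.items()
        if len(text.split()) < min_words or counts[text] > 1
    }

pages = {
    "/fish-soup": "Delicious fish soup.",
    "/fried-cod": "Delicious fish soup.",   # duplicated text
    "/about":     " ".join(["word"] * 40),  # long enough and unique
}
print(sorted(thin_or_duplicate(pages)))  # ['/fish-soup', '/fried-cod']
```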

14. Your keywords are competitive

The problem is often not that a website is not visible, but that it is not found with the keywords that are important in the specific niche or for which the boss would like to be visible. Our in-depth keyword research guide helps you better define your target keywords (and manage management's expectations).

That doesn't mean you will never have a chance to rank for the highly competitive keywords in your industry. But if your website is new, you need to build that visibility, starting with your branded and less competitive long-tail keywords.

