If you haven’t verified your site with Search Console yet, some data can take several days to populate, often several weeks before you can effectively use some of these tools (e.g. crawl errors and search analytics). But you can still optimize your site in many ways even if you just set it up.
Copy the HTML tag from Search Console and delete everything outside the quotation marks (including the quotation marks)
Paste the remaining code into Rank Math (General Settings → Webmaster Tools)
Save changes in Rank Math plugin
Click ‘verify’ in Google Search Console
It will take a few days for some data to populate in Search Console
The same HTML tag verification process can be used for Bing + Yandex
Google Search Console
If you’re not using Rank Math, Google offers plenty of alternative methods to verify your site.
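For reference, the HTML tag method just means placing a meta tag like the one below in your site’s head section. The content value here is a placeholder; yours is the unique token Search Console gives you, and Rank Math only wants the code inside the quotation marks:

```html
<!-- Example only: replace the content value with the token from Search Console -->
<meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
```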
2. Structured Data
The most common form of structured data is rich snippets, which add “extra information” to your snippets in the form of review stars, recipe information, event information, and other data types. Here’s a gallery showcasing a few different types of rich snippets and rich cards.
Popular Rich Snippet Plugins
WP Rich Snippets – premium plugin ($69 – $499) with robust functionality; it’s what I use on my site. The design is nicer and the settings are more robust than the free All In One Schema plugin. Add-ons let you add even more functionality, like user reviews, comparison tables, and the ability to use the markup anywhere in your content (not just the top or bottom of a post).
Checking For Errors
When you’re done adding rich snippets to content, you can use the structured data section of Google Search Console to keep track of pages you’ve marked up, check for errors, and see ratings. You can also use Google’s structured data testing tool to test a single page, whereas Search Console shows all pages…
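Under the hood, a rich snippet plugin is just printing JSON-LD structured data into your page. As a rough sketch (the product name and numbers are placeholders, not real data), review-star markup looks something like this inside a script tag of type application/ld+json:

```json
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Product",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
```

A plugin generates this for you from the fields you fill in, which is why the testing tool is useful for confirming the output is valid.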
3. Rich Cards
Rich cards are an updated form of rich snippets. They are a left-to-right carousel display and can only be done with recipes and movies in US English mobile search results. I don’t think there are any WordPress plugins that support rich cards as these are relatively new, so you would need to follow the markup from the gallery, but I’m hoping a plugin will come out soon.
4. Data Highlighter
Data highlighter is an alternative to using a rich snippet plugin. It’s a “point and click” tool you can use to highlight page titles, dates, and other required fields needed to show rich snippets in search results. Once you do this with a couple pages or posts, Google will apply these patterns to your entire site… so you should only need to do this with a few pieces of content.
5. HTML Improvement
Tells you whether your snippets (SEO titles + meta descriptions) are too short, too long, or duplicated. Rank Math has a “progress bar” that turns orange (bad) or green (good) depending on length, so as long as you write these long enough to be green you should be fine. If you haven’t been, the HTML improvements report will tell you which pages need to be fixed.
To prevent errors in the future, make sure your ‘length progress bar’ in Rank Math is green…
Rank Math SEO
Use Rank Math’s Bulk Editing To Fix These
You can bulk edit your SEO titles + meta descriptions in Rank Math SEO → Title & Meta → Post Types → Post → Enable Bulk Editing. Keep in mind the bulk editor doesn’t show the “length progress bar” or that page’s focus keyword like the content analysis does, but you will still need to keep both in mind.
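If you’re bulk editing without the progress bar, you can sanity-check lengths with a small script. The thresholds below (roughly 60 characters for titles, 120–160 for descriptions) are common guidelines, not the exact values Rank Math’s bar uses:

```python
# Rough SEO snippet length checker. Thresholds are common guidelines,
# not the exact cutoffs Rank Math's progress bar applies.
TITLE_MAX = 60
DESC_MIN, DESC_MAX = 120, 160

def check_snippet(title, description):
    """Return a list of warnings for an SEO title + meta description."""
    warnings = []
    if not title:
        warnings.append("title missing")
    elif len(title) > TITLE_MAX:
        warnings.append("title too long")
    if len(description) < DESC_MIN:
        warnings.append("description too short")
    elif len(description) > DESC_MAX:
        warnings.append("description too long")
    return warnings

print(check_snippet("My Post", "Short description."))
```

Run it over an export of your titles and descriptions to find the pages worth fixing first.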
6. Accelerated Mobile Pages
AMP for WP automatically adds Accelerated Mobile Pages (Google AMP Project) functionality to your WordPress site. AMP makes your website faster for mobile visitors. If you use Rank Math, check out its Rank Math AMP companion plugin.
Adding AMP Pages To WordPress
Install the AMP plugin by Automattic (adds the AMP pages)
Install the Rank Math AMP if using Rank Math (customizes the design)
Add /amp/ to any page on your website to see how it looks and make sure it works
Wait for Google to recrawl your site and add the AMP sign in mobile search results
You can also use the advanced index/noindex options for archive subpages, author archives, and date archives in AMP. The picture below shows the switches that let you select index or noindex for each archive type.
URL Inspection tool
If you set one of these options to noindex, Google will not index those archive pages.
7. Search Analytics
This will make you rethink how you measure SEO if you haven’t used it. You can measure rankings (position), keywords (queries), CTR (click-through rates) and more. There’s a ton of cross referencing you can do, but I listed the 5 examples below which I found the most useful.
Navigate to search analytics and tweak the filters to what you see in each dashboard…
See keywords (queries) you rank for…
Queries for specific products, services, or topics you blog about. Simply adjust the query filter to include all queries containing “SiteGround” (or whatever keyword you want to see)…
Queries used to find content in Google Images…
8. Links To Your Site
From my experience, this is the easiest way to find low-quality and irrelevant sites that link to you so you can remove them. This results in a cleaner link profile, minimizes the risk of a Google penalty, and can even improve your rankings, especially long-term. Go through your links and identify low-quality or irrelevant sites that link to you. This may require a little research and judgement in determining which links are “authentic” and which are not. Moz has a great article on performing a link audit and removal if you want to read up on this.
Links To Your Site
Disavowing Bad Links
Once you have your list of URLs, contact the webmasters and ask them to remove the links (definitely the preferred method). If for some reason you can’t get a link removed, use the disavow tool. Even if you haven’t been hit with a penalty, Matt Cutts recommends disavowing questionable links as a preventative measure. I try to go through my links about once a year to clean them up. Of course, you only need to do this if you have a decent-sized site with a good number of links.
Disavowing Bad Links
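For reference, the file you upload to the disavow tool is plain text: one URL or domain per line, with # marking comments. The domains below are placeholders, not real examples from my link profile:

```text
# Disavow file sketch (domains are placeholders)
# Disavow a single spammy page
http://spam.example.com/bad-links.html
# Disavow an entire domain
domain:shadyseo.example.net
```

Use domain: lines when an entire site is low quality, rather than listing every URL on it.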
9. Internal Links
See which pages you are linking to the most…
10. Manual Actions
A manual action is a type of Google penalty. These are the 2 most common ones…
Unnatural Links To Your Site
If you hired a link builder and they built a bunch of spammy links, this is probably the reason. Stop doing this right now, go through all links to your site (step 8), and try to get them removed by contacting the webmasters or using the disavow tool.
Thin Content With Little Or No Added Value
This means you need to beef up your content. Short, non-useful, and duplicate content are all big no-nos in SEO.
11. International Targeting
Target your website to a specific country. This does NOT completely exclude it from all other countries (it’s just a signal). If you are international, you should leave this option unchecked.
12. Index Status
Shows how many pages are being indexed by Google. Be sure to select the advanced option and enable “blocked by robots” and “removed” to see blocked resources and removed URLs.
What To Look Out For
The graph should show a steady increase, which means you are consistently adding new content to your site that Google can index
Sudden drops should be investigated and can mean your server is down or overloaded (in which case you should upgrade your hosting or reduce CPU consumption)
Sudden spikes can be caused by duplicate content or even hacks
14. Blocked Resources
Tells you whether your robots.txt blocks Google from crawling certain resources, and guides you through the unblocking process. However, you may want to keep some resources blocked if you don’t want Google to crawl them. You can only unblock resources you host, own, or whose robots.txt file you have access to (since it will need to be edited to unblock them). Otherwise, you will need to contact the owner of the resource and see if they will edit the robots.txt for you.
After you have found the blocked resource you can use Fetch and Render to view the page as Google and decide whether this impacts your SEO. If you decide you want to change it, you will need to verify the host then update the robots.txt file to unblock it. Remember, if it’s a third party resource you will need to contact the owner of the resource and have them do it for you.
16. Crawl Errors
Crawl errors are broken pages on your site and can happen if you:
Redesigned your site
Failed to setup redirects
Deleted content from your site
If you just set up Search Console, it can take at least 1 week for all crawl errors to populate. When you see them, you’ll want to go through each tab (desktop, smartphone, server error, soft 404, not found, other). Each one will have different URLs that will need to be fixed with redirects…
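If the report gives you a long list of not-found URLs, you can generate the redirect rules instead of writing them by hand. This sketch assumes Apache-style Redirect 301 lines and a hand-made mapping of broken paths to their replacements (the paths are placeholders; build the mapping from your own report, or feed the output into your redirect plugin instead):

```python
# Generate Apache "Redirect 301" lines from a mapping of broken paths
# (taken from the Crawl Errors report) to their replacements.
def redirect_rules(url_map):
    """Return one 'Redirect 301 old new' line per entry, sorted by old path."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(url_map.items())]

rules = redirect_rules({
    "/old-post/": "/new-post/",   # renamed content
    "/deleted-page/": "/",        # deleted content -> homepage (or a relevant page)
})
print("\n".join(rules))
```

Paste the resulting lines into your .htaccess (or translate them for your server), then recheck the errors after Google recrawls.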
17. Crawl Stats
You ideally want a steady increase as you add content to your site…
18. Fetch As Google
Fetch as Google tests whether Google can access a page, how it renders it, and any resources (eg. images or scripts) that are blocked from Googlebot. It can help debug crawl issues and make sure your page’s URLs are accessible to Googlebot.
19. Robots.txt Tester
Checks whether a URL is being blocked from Google and whether there are errors. The most common error is a crawl-delay rule, which may appear if you limited your crawl rate in your site settings (this should only be done if Googlebot is slowing down your server and causing CPU/bandwidth limits on your hosting account). Googlebot ignores this rule, so no action is needed.
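For context, here’s a sketch of what the tester is reading: a typical WordPress robots.txt with a crawl-delay rule (the paths and domain are placeholders; Googlebot ignores Crawl-delay, though some other bots honor it):

```text
# Example robots.txt (paths and domain are placeholders)
User-agent: *
Crawl-delay: 10
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml
```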
20. Sitemaps
Here’s how to submit your Rank Math XML sitemap to Google…
In Rank Math settings go to Rank Math → Sitemaps Settings
Configure your sitemap to exclude tags, affiliate links, etc (see photo below)
The same sitemap submission process can be used for Bing + Yandex
Here are a few screenshots if you need them…
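Under the hood, the sitemap Rank Math generates is just an XML file listing your URLs. A minimal single-URL sitemap looks roughly like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/sample-post/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```

You never edit this file directly; the exclusion settings above control which URLs Rank Math puts in it.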
21. URL Parameters
The vast majority of you will not have issues with URL parameters and will see this message, but if Google shows a different message, you will need to follow their instructions. Here’s a YouTube tutorial by Google Webmasters that shows you what to do. Just be extra careful, because improper actions can result in pages no longer appearing in Google’s search results.
22. Security Issues
Bottom line… if you have issues here you should contact Sucuri who can help you fix these. But you should do this NOW since security issues can jeopardize your entire website and SEO. Often this means identifying malware that has been added to your site and deleting these files.
Site settings are found under the gear icon in the top right of your Search Console dashboard…
23. Preferred Domain
It’s a preference whether you want to include www in your domain (there’s no ‘right’ way for SEO). Whichever one you choose in the site settings should be the same in WordPress (Settings → General → WordPress Address + Site Address). I would avoid changing this if you already have an established domain, as this changes ALL links on your site.
24. Crawl Rate
The most common use for this is if your website constantly goes down due to capacity, bandwidth, or CPU limitations on your hosting account. This means your hosting plan does not include enough resources to run your site, so you either need to reduce resource usage (e.g. by deleting plugins that consume a lot of resources or enabling WordPress heartbeat control) or upgrade your hosting. Limiting the crawl rate tells Googlebot not to crawl your site so fast, which helps reduce the server resources it consumes. This is the only time you should do this.
25. Change Of Address
If you ever decided to change domain names, this will help maintain rankings…
26. Google Analytics Property
Enabling your Google Analytics property allows you to see Search Console data in your Google Analytics reports. Just select the analytics web property and save changes…
Now login to Google Analytics and go to Acquisition → Search Console…
Google Analytics Property
Google Search Console FAQ
Does Google Search Console help SEO?
Yes. Google Search Console adds extra streams of data (queries, clicks, impressions, index coverage, crawl errors) that you can leverage to perform better in search. Without it, there is no direct line of communication between Google and the site owner, so Search Console is of immense help to SEO.
What is covered in the Google Search Console?
The Index Coverage report, introduced in August 2017, is at the heart of what Search Console covers. It identifies the number of pages that are getting indexed, and since pages with errors will not be indexed, it also identifies which pages are not being indexed due to errors and why.
What is the coverage issue in Search Console?
This happens when Search Console detects that a site is affected by an Index Coverage issue. In practical terms, Google is saying: “you’ve told us there is a URL on your site, but it has a noindex directive, so we are having issues indexing it.” This is a serious issue that can hurt your campaign if it is not attended to immediately, since it negatively affects how your site appears in Google search results. You are encouraged to fix these issues right away.
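The noindex directive the report complains about is usually a meta tag like this in the page’s head (or an equivalent X-Robots-Tag HTTP header):

```html
<meta name="robots" content="noindex, follow">
```

Removing the tag (or removing that URL from your sitemap, if you genuinely want it kept out of the index) resolves the conflict.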
What is the difference between Google Search Console and Google Analytics?
Google Search Console is search-engine oriented: it gives site owners the tools and insights they need to improve their visibility and presence in SERPs. Google Analytics, on the other hand, is user oriented: it provides data about the people who visit and interact with your site. In short, Analytics gives you data points about your site’s performance, while Search Console is the tool you use to improve and optimize the website.
What is crawling and indexing?
Crawling and indexing are two different things, though one is often mistaken for the other. Crawling happens when Googlebot looks at all the content/code on a page and analyzes it. Indexing is when that page becomes eligible to show up in Google’s search results. A page must be crawled before it can be indexed, but being crawled does not guarantee it will be indexed.
What are the benefits of Google search console?
It can be used to check the indexing status of a website and to optimize its visibility. In terms of monitoring, it helps carry out several critical functions: resolving server errors, resolving site load issues, resolving security issues such as hacking and malware, carrying out site maintenance, and making site adjustments related to search performance.
What is Google Tag Manager used for?
Marketing tags are an essential part of marketing that help ensure set targets and goals are achieved, and they need to be deployed and managed effectively to get results. Google Tag Manager (GTM) is a free tool that coordinates your marketing tags without requiring you to modify the code on your web page. You can learn how it works through templates available online, and you can deploy it on your website or your mobile app.
Get Help Fixing Errors In Search Console
JasaSEO.be is a WordPress developer I found on freelancer.com who I’ve been working with for over 5 years. He’s helped me (and clients) fix errors related to virtually everything in Google Search Console (mobile, security, www, sitemaps, crawl errors). If you have a question about the SEO side of things you can leave me a comment and I’ll be glad to help you, but if you need help actually fixing errors, Pronaya is the man for that. He’s $40/hour and his email is firstname.lastname@example.org.