Last spring Google re-branded Google Webmaster Tools to a more user-friendly name: Google Search Console. Most small business owners are familiar with Google Analytics, and setting up Google's popular analytics software is often one of the first things a business owner will do upon launching a new site. They're eager to see who's visiting their site and what those visitors are doing.
What many business owners don't realize is that before your website starts seeing results, you'll need to make sure that Google has crawled, indexed and ranked your website. That's where Google Search Console comes in. It's Google's way of communicating with website owners and helping them optimize their sites in order to garner maximum benefits from the world's largest search engine.
Google Search Console doesn't have the flashiest interface you've ever seen; there is more raw data than colorful charts and graphs. But as a small business owner, a little time spent with the platform can help you improve your website's visibility and placement in the SERPs.
Let’s take a closer look at how you, a small business owner, can use Google Search Console to your benefit.
Why You Should Use Google Search Console in Under 100 Words
Google Search Console is something that Google provides for free to anyone with an account. It essentially opens up a two-way line of communication between the site owner and Google. Search Console will help you monitor, maintain and improve your website presence within Google’s search results.
More specifically, you’ll be able to see which search queries are applicable to your website and where you might be able to improve your ranking. You’ll also be able to manage your sitemaps, index submissions, and determine how well Googlebot is able to crawl your website.
Getting Set Up with GSC
We’re not going to spend a lot of time on this step because Google makes the process pretty darn easy. Once you’ve opened an account with Search Console, it’s as simple as the following 3 steps:
- Click the red “Add A Property” button
- Enter your URL
- Select a verification method and click “Verify”
That’s about all that’s required to get started. Note that there won’t be any data available when your site is first added. It will take some time (days and occasionally weeks) for data to begin populating. However, you can still use some of the features discussed below in the meantime.
Improving Your Website Performance With Search Console
As a small business owner, you want to make sure your website is well indexed and maintaining a competitive position in the SERPs. You probably also realize that any attempt to game the system or to artificially boost your rankings is likely to result in a penalty that is difficult or even impossible to recover from.
That does not mean, however, that the best course of action is inaction. Not too long ago we posted an article right here on Carpetcleaninghaddontownship that made the argument that, contrary to popular belief, SEO isn't dead. Google actually wants you to make sure your website is optimized for search engines, and that's where Search Console comes into play.
From the moment you log into Search Console, it becomes apparent that there is a lot of information and a lot of functionality to sort through. The question is how often should you be reviewing the data and which areas should you be focusing on in order to improve your site visibility?
Depending on the size of your site and how frequently you publish new content, once or twice each month should be adequate. While it’s worth familiarizing yourself with every part of Google Search Console, here’s where you should focus your time and energy:
- Structured Data
- HTML Improvements
- Search Analytics
- Index Status
- Content Keywords
- Crawl Errors
- Fetch as Google
- robots.txt tester
Let’s take a closer look at each one of these areas:
Structured Data

When Google returns search results to users, those results often contain a unique presentation of certain information. This could include information that is specific to a recipe, like average rating and number of reviews. Or, it could include information about a movie title, product or event. For a local business, it could be even simpler, consisting of location and contact information.
Structured data is the standard method of annotating specific details that are relevant so that search engines are able to understand what the information really means or represents.
If your website contains enhanced information that would typically be displayed in the SERPs, it’s a good idea to keep an eye out for any structured data errors that might have a negative impact on how your site is displayed to searchers.
If for some reason your theme does not use the required markup, Google offers a tool in Search Console called "Data Highlighter". This tool allows you to easily highlight data and apply a specific, relevant tag. Once the data on a page is tagged, Google will automatically tag similar pages anytime new content is created.
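If you prefer to add the markup directly rather than use Data Highlighter, structured data is most commonly embedded as JSON-LD. Here is a minimal sketch for a hypothetical local business; the name, address, and phone number are placeholders, not real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://example.com",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "NJ",
    "postalCode": "07081"
  }
}
</script>
```

A snippet like this goes in the page's HTML, and any structured data errors Google finds while parsing it will show up in the Structured Data report.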
HTML Improvements

As Googlebot crawls your website, it will conveniently make a note of any HTML improvements that might help to improve the user experience of your website or its overall performance within the index. Specific improvements that you want to watch out for include:
- Duplicate meta descriptions
- Long or short meta descriptions
- Issues with title tags
- Non-indexable content
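The meta description checks above are easy to run against your own pages before Google flags them. Here is a minimal sketch, assuming your pages are represented as a URL-to-description mapping; the 50–160 character thresholds are a commonly cited guideline, not an official Google limit:

```python
# Flag duplicate, overly long, or overly short meta descriptions,
# similar in spirit to the HTML Improvements report.
from collections import defaultdict

MIN_LEN, MAX_LEN = 50, 160  # common guideline, not an official limit

def audit_meta_descriptions(pages):
    """pages: dict mapping URL -> meta description string.
    Returns a list of (url, issue) tuples."""
    issues = []
    seen = defaultdict(list)
    for url, desc in pages.items():
        seen[desc.strip()].append(url)
        if len(desc) < MIN_LEN:
            issues.append((url, "too short"))
        elif len(desc) > MAX_LEN:
            issues.append((url, "too long"))
    for desc, urls in seen.items():
        if len(urls) > 1:  # same description reused on several pages
            issues.extend((u, "duplicate") for u in urls)
    return issues
```

In practice you would populate the dictionary by crawling your own site; the point is that none of these checks require waiting for Googlebot to notice the problem first.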
Search Analytics

Search Analytics is located under the Search Traffic tab and can provide valuable insights into how well your site is performing for specific queries. It's important to remember that just because your website is displaying for a particular query does not mean that potential traffic generated by that query is relevant. This is where you have to give some thought as to which queries are most relevant to your website and its content.
Google recommends that you begin by sorting your queries based upon clicks rather than impressions. This will give you a more accurate picture as to which queries are actually driving traffic to your site. You can do this by checking all four boxes:
- Clicks – How many times users clicked through to your website from a query
- Impressions – The total number of impressions
- CTR – Click-through rate, or the percentage of impressions that resulted in a click through to your site
- Position – The average position of your site when presented to users
Then, to sort by clicks, click the appropriate column header:
Although you can't see all the data (for privacy reasons), you can see from the above image that the first query for this particular site received 3 clicks. It just so happens that this was a branded term, making it less relevant. The second query in the list shows 1 click, 168 impressions, and an average position of 19. It's actually a highly relevant local term, but as you can see, this site (which is relatively new) is performing poorly for that query (only a 0.6% CTR).
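The CTR figure is simply clicks divided by impressions, expressed as a percentage. A quick sketch using the second query's numbers from above (1 click on 168 impressions):

```python
# Click-through rate: the share of impressions that turned into clicks.
def ctr(clicks: int, impressions: int) -> float:
    """Return CTR as a percentage; 0.0 when there were no impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

print(round(ctr(1, 168), 1))  # → 0.6
```

Keeping a small script like this around makes it easy to compute CTR for query data you export from Search Console, rather than reading it off the report one row at a time.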
Note: I have avoided the use of a country filter in order to make this example clearer. If your business targets a specific country, you might consider the use of an appropriate filter.
As a business owner, I would be looking at the results above and thinking to myself that I need to improve my average position and click-through rate of the second term. Doing so could attract more qualified clicks and probably lots of additional business.
If you click on the actual query and then select pages, you’ll see which pages are ranking for this query. You can then take a closer look at each individual page (only the home page in this case) and examine the on-page SEO for each one. You could also compare the pages of your competitors. What are they doing that is better?
This is an abbreviated version of the process, but it should be enough to get you started with analyzing and making improvements. You can work through all the relevant queries and pages, giving thought to how each one could be improved.
Index Status

Located under Google Index, Index Status shows the total number of pages of your website that Google has crawled and indexed. Although there is not a lot of information here, this simple chart can tell you quite a bit about the quality of your link structure.
If you publish content on a regular basis, you should see your total indexed pages demonstrating a steady uptrend. On the other hand, if you see a rapid drop in the total number of indexed pages, it should be cause for concern. For example, Google may have detected malware on some pages of your site. Alternatively, if you're publishing more pages than Google is indexing, it's possible that your internal linking structure needs some fine-tuning.
You’ll also notice an advanced tab that when clicked will indicate the number of pages that have been blocked as well as removed from the index.
Content Keywords

As Google crawls your website, it will take note of the keywords it finds. The significance of each keyword will depend on how often Google finds it on your site.
With this particular report, you will be able to get a clear idea of how Google is interpreting your site content. What does Googlebot think your website is about? The actual subject matter of your website should be reflected by the report.
Finally, if you notice keywords that look out of place or inappropriate, this can be an indicator that your site may have been hacked. For example, if keywords related to "Levitra" or "Viagra" appear on the list, there is a good chance that your site has fallen victim to a hack.
Crawl Errors

On an intermittent basis, you should examine your crawl error report for any issues and fix the items that are listed. Although something as simple as a 404 error might not incur a penalty from Google, you'll be able to determine where the broken link originates and then fix the problem. The end result is a better user experience.
Fetch as Google
Fetch as Google is kind of like a manual Googlebot. Anytime you add new important content or want to check an existing page, you can use the fetch tool (with optional render). Using this tool will provide the most accurate answer as to how Google will “see” and render a particular page. This tool is a great way to make sure that your page and all the content on the page can be accessed by Google. This increases your chances of performing well in the SERPs.
Once you’ve entered a URL and clicked “Fetch” or “Fetch and Render”, Google will begin crawling the page. After a short wait, it will return a result that indicates the Googlebot type and status. The status could indicate complete, partial, redirected or another specific error.
If you click on the result, you’ll have access to additional information that shows what was fetched along with the download time. The rendering tab will show a comparison of how Google sees the site versus how a visitor sees the site. Any resources that were unreachable will also be listed here. This could include images, scripts or stylesheets.
robots.txt Tester

Your robots.txt is a simple file that provides instructions to robots, or web crawlers (including Googlebot), about how to crawl your site. Actually, the directives are more like requests than instructions, since a robot can choose to ignore them, as is often the case when there is malicious intent.
You can use the robots.txt Tester to determine whether Googlebot is able to crawl a specific URL or whether specific content that you want to be blocked is working correctly.
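You can also run the same kind of check locally. Python's standard library ships a robots.txt parser, so a quick sketch (the rules and URLs below are hypothetical) looks like this:

```python
# Check which URLs a crawler is allowed to fetch under a given robots.txt,
# similar in spirit to the robots.txt Tester in Search Console.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

This only tells you what a well-behaved crawler would do with your rules; as noted above, a malicious bot is free to ignore them entirely.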
Sitemaps

Google makes it clear that submitting a sitemap does not guarantee every item in it will be indexed. They also state that having one, although not required, is still a good idea. You'll never be penalized for doing so.
There are some instances in particular where both having and submitting a sitemap through Google Search Console is highly recommended:
- If your website is new and/or has very few external links, it’s possible that it will take longer for Google to find and crawl the pages of your site. As soon as your site is ready to be viewed, it’s a good idea to create a sitemap and submit the URL to Google.
- If your website has a poor internal linking structure or is very large, it’s easier for Google to miss new content.
Adding a sitemap is a simple process. Once it's been created using a sitemap generator tool, simply paste the URL of your sitemap into Search Console and click "Test Sitemap". Once Google is done, you can view the results, and if everything looks good, it's ready for submission. That means going back to the main screen and again clicking "ADD/TEST SITEMAP", only this time entering the URL and clicking "Submit Sitemap". Maybe in the future Google will fix this double-entry issue, but for now we have to live with a poor user experience.
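For reference, a sitemap is just an XML file following the sitemaps.org protocol. A minimal sketch with two hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; fields like `<lastmod>` are optional hints for crawlers.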
There are some additional guidelines for larger sites, such as separating a large sitemap into several smaller ones and then using a sitemap index file. As your site grows in size, keep this in mind.
At least on a monthly basis it’s a good idea to check your sitemaps for any errors or warnings and if possible, either resolve them or mark them as fixed.
Although there are a few Search Console features that we haven’t covered here, we’ve managed to touch on the ones that have the greatest impact on your overall website visibility. If you’re a small business owner, these are all items that should be on a list of monthly to-do’s.
One additional point worth discussing is the importance of making sure you've enabled email notifications from within the settings tab. You can request that Google email you about either all issues or just critical ones. For example, if they detect malware on your site, you'll be notified so that you can resolve the problem right away instead of whenever you happen to notice it.
If you’ve created a website with lots of great content, it’s important to make sure that Google is able to crawl and index it. Doing so will give your site the best chance of performing well in the SERPs. Google Search Console is not a flashy interface, but it is full of useful information and it is something you should be reviewing at least on a monthly basis.
If you're currently using Google Search Console, please share how it has benefited the visibility of your website.
Article thumbnail image by Bloomua / shutterstock.com