This post is derived from the content of the Meetup hosted by PML’s Main Line Digital Marketing Group, “Improve Your Website Performance with Google Search Console,” held in King of Prussia on Wednesday, September 30th, 2015.
Google Search Console, formerly called Webmaster Tools, is a website diagnostic tool that gives you a quick snapshot of your overall site health and alerts you to any high-priority problems. For example, you can find out whether Google has uncovered problems with pages that no longer exist or are not working properly, see problems with crawling the site, and get alerts about any actions Google may have taken against your website.
Google gives us tools like Search Console so we can help Google while it, in turn, helps us by providing important data about our website activity and performance, along with tips for improvement. Search Console helps website owners identify a variety of common and not-so-common problems and correct them. As you correct site errors, you are in turn improving Google’s search results by enabling it to include better-performing websites.
Hopefully, this article will give you a basic understanding of some of the core capabilities of Google’s Search Console and show you some ways you might be able to translate the information into something actionable that will improve your overall website performance!
Within the Search Appearance section of Search Console you will find four options for monitoring and/or improving your appearance in search results.
Schema is a markup vocabulary for describing page components that is supported by Google and other major search engines. These components can be embedded into the HTML code of a page to specify its key details. Search engines, web crawlers, and browsers can then extract and process this Microdata and use it to provide a richer browsing experience for users.
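To make that concrete, here is a minimal Microdata sketch for a local business page (the business details are illustrative placeholders, not markup from any real site):

```html
<!-- Schema.org Microdata: itemscope/itemtype declare what the entity is,
     and each itemprop labels one of its key details for search engines. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <h1 itemprop="name">Example Business</h1>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">King of Prussia</span>,
    <span itemprop="addressRegion">PA</span>
  </div>
</div>
```

Running markup like this through Google's Structured Data Testing Tool before publishing is a good habit.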
One way Google helps webmasters is by letting us tag a page of choice, or even a series of pages, with Microdata from within Search Console. A click on Structured Data will show you the number of pages crawled by Google that contain structured data. Any errors in the markup are flagged here so you can address the particular page(s) affected.
The Data Highlighter is a pretty smart function. It allows you to tag specific parts of a page and Google will then associate that part with the corresponding structured data markup. This is great because you don’t need coding experience or even access to the server, which is what you would need otherwise.
Enter the URL of a page you want to mark up and then decide whether to mark up a series of related pages or just a single page.
Pick the type of page it is. You will see a drop-down list of the types of pages helped by Schema.org structured data markup: restaurants for reviews or locations, movies and events for shows and start times, book reviews, and more.
You can choose to tag one page or a series. Google inputs the information in the data fields box as you go along.
You don’t need to pay for a website audit to know which of your website pages have problems with meta information like titles and meta descriptions. Google continuously runs scans of your website and when it finds problems with this information, you will find it listed here.
You can’t activate sitelinks or force Google to use certain ones in organic results; however, you can use Search Console to eliminate any sitelinks you do not want to show up. If you notice a page appearing in search results with unwanted sitelinks, you can demote them here.
Search Analytics is the crème de la crème of the Search Console tool, in my opinion. It is where we can get the best sense of organic search drivers like keywords, impressions, clicks, and rankings. This section can really help you get an idea of how your content campaigns are performing, and its many options let you drill down into specific areas of performance. Couple that with a keyword data analysis platform like the PML Experiment, which compares date ranges and lets you create keyword groups to monitor, and you have some powerful information to help drive content development decisions!
Links to Your Site
See your most popular content, the words or phrases the links are made of (the anchor text), and the sources of your backlinks. Spot both positive and negative trends here. Clicking “More” expands whichever section you click.
This gives you a list of the internal pages on your site and how many internal links point to them from within your site. So if I were to click “About,” it would show all the internal pages on the Philly Marketing Labs website that link to it, and which pages contain the links. Monitoring the number of links is helpful because too many links pointing at pages of little importance can imbalance a site.
If Google has taken any manual actions against your site, they will be listed here along with further information on how to fix them. Remember, anything happening here will be pushed to the Messages tab as well.
International Targeting is where you set language and country preferences and see how your site fares in that regard.
If you have more than one language version of a page, this is where you can check that Google knows the variations exist and where they are located, and see whether there are links back to the page from the alternate version. Switching to the Country tab lets you select a target country and see the information segmented by that country.
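The standard way to tell Google those variations exist is with hreflang annotations in each page's head. A minimal sketch with hypothetical URLs:

```html
<!-- Each language version lists itself and its alternates; x-default
     names the page to serve when no language matches the visitor. -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="es" href="http://www.example.com/es/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

The International Targeting report flags pages whose alternate versions are missing these return links.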
Want to know how your site or even specific pages fare in Google’s mobile usability tests? Mobile usability is a very important factor to Google now. I strongly recommend you keep this section free of errors when possible.
Index Status shows you the number of pages of your site that Google has indexed, along with a graph depicting the index status over time so you can pick out any abnormalities. If there are spikes or cliffs, check the dates of the indexing change against activity performed on the site by looking at your Google Analytics annotations. For example, a new site launch or a change to content structure might already be annotated, and you may be able to match them up to solve the mystery.
A click on “Advanced” will reveal the number of pages blocked by robots.txt and whether any removed pages are on the list.
Knowledge is power, and the Content Keywords page shows you how Google rates the significance of particular keywords on your website. If you want to know why, a click on a keyword will show you the pages on which Google found the word to be significant. If there are terms you do not want to be associated with, take a look at the pages causing the connection and edit them accordingly.
This shows resources used by your site that are blocked to the Googlebot crawler. Google shows only resources it thinks are under your control. Make sure that any resources you are blocking absolutely must be blocked!
The Remove URLs page allows you to temporarily remove a page from the Google index. There are a few ways to do this outside of Search Console, such as through your robots.txt file, password-protecting pages, or using robots directives in meta tags; in a pinch, however, you can do it here from Search Console.
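For reference, the meta-tag option mentioned above is a single line in the page's head; a minimal sketch:

```html
<!-- "noindex" asks search engines to drop the page from their index;
     crawlers must still be able to fetch the page to see this directive. -->
<meta name="robots" content="noindex">
```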
Search engine crawling is how the bot identifies your site’s topics and indexes your web pages, so it’s important to stay on top of crawl errors. Keep the site as free of them as is reasonable and don’t allow them to build up to a large level.
The crawl errors report is one of the first places I look if I suspect a site problem. It again shows the timeline graph, allowing us to pinpoint time frames of any increases or decreases in reported issues. Toggle between desktop and smartphone results, and filter by server errors, soft 404 errors, and pages not found.
Crawl stats shows you how active the Google bot has been on your site in the past ninety days. See how many pages were crawled per day on average, the amount of data downloaded and the time spent downloading data. Discrepancies here can point to bigger problems. For example, an increase in time spent crawling along with a decrease in downloaded data per day without an increase in number of pages viewed is an indication of a potential bottleneck or a problem with a site resource that may not be working properly.
Fetch as Google
Enter a specific page to see whether any resources are blocked and how Google renders it on desktop or mobile devices.
Fetch is a quick one and is great for checking or debugging suspected network or site security issues.
Fetch and Render loads all the resources on the page, including images, and is a good way to identify any differences between the way Google sees your page and the way a visitor does.
Next you can test your robots.txt file. Get a view of the actual robots.txt file here as well as notifications of any warnings or errors.
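If you are unsure what the tester is checking, a robots.txt file is just a short list of rules like these (a hypothetical sketch; the paths and domain are placeholders):

```
# Applies to all crawlers: block the /admin/ directory
# and advertise where the XML sitemap lives.
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml
```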
Sitemaps are an important tool for search engines and SEO. If you haven’t submitted your sitemap to Google yet, do it here. It helps Google crawl your site and discover pages it might not otherwise find during crawling. Google also tells you here whether there are any errors in your sitemap and how many pages are submitted and indexed.
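A sitemap itself is a simple XML file; here is a minimal sketch with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Each <url> entry lists a page; <lastmod> helps Google decide
     when to re-crawl it. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-09-30</lastmod>
  </url>
</urlset>
```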
URL parameters is a tricky one…
This is an advanced feature; you need to understand how URL parameters work to use it correctly. Messing around with it without knowing what you are doing could cause catastrophic results, like your site being removed from Google’s index! Used correctly, it can help Google distinguish between similar versions of a page or prefer crawling one version over another.
Lastly, we have Security Issues. This is another page where you do NOT want to see anything show up. Google will alert you here if it believes your site has been hacked or has other security issues.
One More Thing…
A final feature I want to show you is a quick action that will help Google serve the correct version of your website in search results.
- Click the gear in the top right
- Click Site Settings
- Check off the preferred version of your domain
So if you want www. in front of your domain, specify that here and Google will use that version when serving your site in results. You can also set a crawl rate, though in most cases, especially with smaller sites, it is best to let Google decide how to crawl.
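A companion step worth pairing with the preferred-domain setting is redirecting the non-preferred version on the server itself. Here is a sketch for Apache’s .htaccess (an assumption on my part; other servers use different syntax, and example.com is a placeholder):

```
# 301-redirect the bare domain to the www version so visitors and
# crawlers always land on the preferred hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```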
Thanks for reading; I hope you learned some ways to improve your website using Google Search Console!