As an SEO or website owner, you can communicate with Google using Google Search Console, formerly known as Google Webmaster Tools.
Here are the top 7 features I use in Google Search Console:
1. URL Inspection
As soon as you publish a new page or make changes to an existing one, use the URL Inspection tool to check the page and submit it to Google for indexing.
Many times, we create a page and start sharing it on social media or other platforms even before Google has crawled and indexed it. Submit the URL through the URL Inspection tool before sharing it anywhere.
2. Search Analytics (Performance)
Data provided by Google Search Console (GSC) is very useful for optimizing your website. You can check the following metrics in GSC:
- Search queries
- Average position
- Impressions (How many times your website appeared in SERP)
- Clicks (How many people clicked on your website snippet)
- CTR (Clicks ÷ impressions = CTR. For example, if you had 10 clicks and 1000 impressions, then your CTR would be 1%.)
- Devices (Clicks broken down by device)
- Clicks by page, and more
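The CTR arithmetic above can be sketched in a few lines of Python (the function name and sample values are illustrative, not part of GSC):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a percentage: clicks / impressions * 100."""
    if impressions == 0:
        return 0.0  # avoid division by zero when a page has no impressions
    return clicks / impressions * 100

# The article's example: 10 clicks out of 1000 impressions -> 1% CTR
print(ctr(10, 1000))  # 1.0
```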
3. Search Console Insight
Google has introduced a new tool called Search Console Insights (currently in beta). It shows your most trending and most searched queries from the past 28 days.
To access this data, you have to link Google Search Console with Google Analytics.
4. Crawl Errors (Coverage)
Google notifies website owners about the errors its search engine encounters while crawling and indexing the website.
You should review these crawl errors at least once a week and fix them. Once an error is fixed, you can inform Google so it recrawls the affected pages.
You can access crawl errors in the Coverage section of GSC.
5. Page Experience
Page experience is a big part of Google’s ranking algorithm, so you should take this feature seriously.
Google rates your pages based on Core Web Vitals, HTTPS, and mobile experience.
6. Sitemaps
You should have a sitemap on your website that updates in real time, or at least weekly if you have to update it manually.
In your weekly review, you should see if Google is facing any issues with your sitemap.
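For reference, a minimal sitemap follows the sitemaps.org XML format; a sketch with placeholder URL and date values:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/new-page/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Most CMS platforms and SEO plugins generate and update this file for you automatically.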
7. Removals
If Google crawls any unwanted URL, you can remove it from the SERP using the Removals section of GSC.
You need to block the URL using the robots.txt file and then submit a request for URL removal.
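A robots.txt rule for blocking such a URL might look like this (the path is a hypothetical example):

```
User-agent: *
Disallow: /private-page/
```

Place the file at the root of your domain (e.g. example.com/robots.txt) so crawlers can find it.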
Bonus Tip 1:
Links section: Use the Links report to see your internal link distribution.
Bonus Tip 2:
Google emails you weekly summaries, as well as alerts whenever an issue occurs, so you can be notified as soon as a problem arises and fix it before the page is recrawled.
Raise the Bar!