Google’s New Search Console: Improvements You Can Put to Good Use

Published: 20 February 2018
Updated: 05 December 2025

In late 2017, Google released a beta version of a new Search Console. The release was intentionally limited to a small group of users who helped surface issues and gave Google a chance to resolve them. As of January 2018, the new Console is available to everyone.

The rollout is still in progress, so you may not have access just yet. For now, the new version operates alongside the older one, and those who do have access can use a toggle to switch back and forth between the two. Be patient; access should reach everyone over the next couple of months. In the meantime, here is what you can expect in terms of improved performance and additional features.

Data Storage

The private beta released last fall stored 16 months of data. Now that the new version is going public, many webmasters will be happy to know that the data window remains at 16 months. That provides ample history for analyzing past performance and preparing projections for the future.

Search Analytics is Now Search Performance

Users will notice some familiar features under new names. The Search Analytics report is now the Search Performance report. Beyond the name change and a cleaner look for the Console itself, the report includes a wider range of data. Users will also find enhancements to AMP status information, Job Postings reports, and an updated Index Coverage report. The stated goal is to make it easier to identify and resolve current issues, and flags alert users to conditions that could cause problems in the future.

Index Coverage Report

The new version of the Index Coverage Report includes readily accessible data on issue tracking. In fact, the new setup includes alerts that let users know when any issue is detected. You can still run reports to get a recap of recently identified issues, but this more real-time approach makes it easier to resolve the problem before it can have much of an impact on your traffic. Another bonus is the automatic confirmation once the problem is fixed.

If you are not sure how to resolve an issue, don't worry: the new Console provides guidance on how to fix it. If the resolution requires involving multiple people, the report also includes a share feature that generates a direct link you can send to anyone you choose.

AMP and Job Postings Report Features

Who doesn't need AMP versions of their web pages today? How about implementing Job Postings markup? The new Console includes tools that isolate problems with both of these forms of search enhancement, and there are plans to roll out additional tools in the months to come that will make such issues even easier to identify.

As with the restructured Index Coverage report, you will find tips on how to address any issues that turn up in the AMP and Job Postings reports. A confirmation feature lets you know when a problem has been successfully resolved, and the Console generates follow-up reports that show how the fix is affecting your pages.

Recap After Testing Your URLs

The new Search Console makes it easier than ever to test the URLs associated with your pages. Once testing is complete, the Console compiles a validation log that identifies which URLs passed, which required a fix, and which still fail. That makes it much easier to keep track of what you've accomplished and which action items remain.

While the public release rolls out, keep in mind that the new Console is still considered a beta. For now, the classic Console remains active and you can toggle between the two versions. During that period, users can submit feedback on both the new features and the older ones carried over into the beta. Google is encouraging that feedback so the new Console can be refined before it is considered complete and a cut-off date is set for the classic version.

Google Search Console Improvements FAQs

How should I transition my workflows from the classic Search Console to the new one?

List your recurring tasks (e.g., weekly checks, monthly exports) and map each to its equivalent location in the new UI. Rebuild saved filters and comparisons first, because they influence almost every report you’ll use. Run both versions in parallel for a few weeks, noting any gaps or renamed metrics, then retire the classic views once your dashboards and SOPs are updated.

What’s the smartest way to preserve more than 16 months of data?

Schedule monthly exports via the Search Console API (or a Looker Studio connector) into a warehouse or spreadsheets. Standardize field names and date formats so you can stitch reports over years and across site migrations. Keep a “data dictionary” so future teammates understand how historical fields relate to any new report labels.
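For teams that script the export, here is a minimal sketch of a scheduled pull. It assumes the google-api-python-client and google-auth packages, a service account that has been granted access to the property, and placeholder values (the property URL, the key file name) that you would replace with your own.

# Hypothetical monthly export of Search Performance data via the Search Console API.
from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"   # assumed property URL
KEY_FILE = "service-account.json"       # assumed credentials file

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=credentials)

# Export the previous full calendar month so each scheduled run adds one consistent slice.
first_of_this_month = date.today().replace(day=1)
end = first_of_this_month - timedelta(days=1)
start = end.replace(day=1)

rows, start_row = [], 0
while True:
    response = service.searchanalytics().query(
        siteUrl=SITE_URL,
        body={
            "startDate": start.isoformat(),
            "endDate": end.isoformat(),
            "dimensions": ["date", "query", "page"],
            "rowLimit": 25000,
            "startRow": start_row,
        },
    ).execute()
    batch = response.get("rows", [])
    rows.extend(batch)
    if len(batch) < 25000:   # fewer rows than the limit means we have everything
        break
    start_row += 25000

print(f"Exported {len(rows)} rows for {start} to {end}")
# Write `rows` to a warehouse table or CSV here, using standardized column names.

Running this on the first of each month gives you one clean, non-overlapping slice per run, which makes stitching years of history straightforward.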

How do property types (Domain vs. URL-prefix) affect what I see?

A Domain property aggregates all protocols, subdomains, and paths — perfect for getting the full picture. A URL-prefix property is narrower and great for focused monitoring (e.g., a subfolder like /blog/ or a regional subdomain). Many teams use both: Domain for leadership rollups and URL-prefix for hands-on diagnostics.

What user permissions should my team follow?

Grant “Owner” only to a minimal set of admins who handle verification and integrations. Give SEOs and analysts “Full” or “Restricted” access based on need-to-know; remember that shared links can expose sensitive URLs. Review access quarterly, especially after agency changes or staff turnover.

How can I link Search Console insights to GA4 for better decision-making?

Use the landing page and query data to hypothesize causes of GA4 traffic or conversion changes, then validate in GA4’s acquisition and engagement reports. Build a shared sheet/dashboard that pairs impressions/clicks with sessions/conversions for the same pages. This narrows “rank vs. UX” debates and speeds prioritization.
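One way to build that shared view is a simple join on landing-page path. The sketch below assumes you have exported two CSVs yourself; the file names and column names are illustrative, not anything either tool produces by default.

# Pair Search Console page data with GA4 landing-page data on a common path key.
import pandas as pd

sc = pd.read_csv("search_console_pages.csv")    # assumed columns: page, clicks, impressions
ga4 = pd.read_csv("ga4_landing_pages.csv")      # assumed columns: landing_page, sessions, conversions

# Normalize URLs so the join keys match (GA4 landing pages are usually path-only).
sc["path"] = sc["page"].str.replace(r"^https?://[^/]+", "", regex=True)
ga4["path"] = ga4["landing_page"]

paired = sc.merge(ga4, on="path", how="outer", suffixes=("_sc", "_ga4"))
paired["ctr"] = paired["clicks"] / paired["impressions"]
paired["conv_rate"] = paired["conversions"] / paired["sessions"]

# Healthy impressions and clicks but weak conversions point to UX or intent issues;
# sessions without impressions point to ranking or indexing issues.
print(paired.sort_values("impressions", ascending=False).head(20))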

What’s the best practice for international sites and hreflang validation?

Group properties by market (e.g., example.com, fr.example.com) and review coverage/query patterns side by side. Cross-check that each localized URL has a reciprocal hreflang and a self-referencing tag. Track cannibalization by filtering queries where the “wrong” locale earns impressions, then fix internal links and canonicals.
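A reciprocity check can be scripted as well. The rough sketch below assumes hreflang is declared via <link rel="alternate" hreflang="..."> tags in the HTML head (not via sitemaps or HTTP headers), uses the requests and beautifulsoup4 packages, and lists example URLs you would swap for your own.

# Verify self-referencing and reciprocal hreflang annotations across a set of pages.
import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://example.com/pricing/",
    "https://fr.example.com/pricing/",
]

def hreflang_map(url):
    """Return {hreflang: href} declared in the page's link tags."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {
        link.get("hreflang"): link.get("href")
        for link in soup.find_all("link", rel="alternate")
        if link.get("hreflang")
    }

declared = {url: hreflang_map(url) for url in PAGES}

for url, alternates in declared.items():
    # Each page should reference itself...
    if url not in alternates.values():
        print(f"Missing self-referencing hreflang on {url}")
    # ...and every page it points to should point back.
    for lang, target in alternates.items():
        if target in declared and url not in declared[target].values():
            print(f"{url} -> {target} ({lang}) is not reciprocated")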

How do I use query and page filters effectively without biasing results?

Start broad (no filters) to establish a baseline trend, then layer one filter at a time (device, country, query regex). Always compare equivalent periods (e.g., 28-day vs. prior 28-day) to offset weekday/seasonality effects. Save commonly used filter sets so analyses are consistent across teammates.
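If you pull the data programmatically, the same discipline applies: identical window lengths and identical filters on both sides of the comparison. The sketch below assumes the same service-account setup as the export example, an illustrative property URL and query regex, and that the API's regex filter operator is available for your account.

# Compare clicks for one query family over the latest 28 days vs. the prior 28 days.
from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"    # assumed property
QUERY_REGEX = "price|pricing|cost"       # one illustrative "query family"

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
service = build("searchconsole", "v1", credentials=credentials)

def window(end_day, days=28):
    """Return an inclusive (start, end) ISO-date pair covering `days` days ending on end_day."""
    start_day = end_day - timedelta(days=days - 1)
    return start_day.isoformat(), end_day.isoformat()

def clicks_for(start_day, end_day):
    body = {
        "startDate": start_day,
        "endDate": end_day,
        "dimensions": ["query"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "includingRegex",
                "expression": QUERY_REGEX,
            }]
        }],
        "rowLimit": 25000,
    }
    response = service.searchanalytics().query(siteUrl=SITE_URL, body=body).execute()
    return sum(row["clicks"] for row in response.get("rows", []))

latest_end = date.today() - timedelta(days=3)   # leave room for reporting lag
current = clicks_for(*window(latest_end))
previous = clicks_for(*window(latest_end - timedelta(days=28)))
print(f"Last 28 days: {current} clicks vs. prior 28 days: {previous} clicks")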

What should I do when I see sudden drops but no obvious coverage errors?

Segment by device and country to isolate where the fall began, then check query families using regex to spot intent shifts. Compare your page’s snippet (title/meta) against top results for those queries — SERP changes can lower CTR even if rank holds. If rank dropped, inspect the affected URLs for internal link loss, template changes, or competing pages stealing intent.
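To localize where a drop began, a quick segment comparison often beats eyeballing charts. The sketch below assumes you exported daily clicks broken down by device and country into a CSV; the file name, column names, and cut-over date are placeholders.

# Find which device/country segments lost the most average daily clicks after a given date.
import pandas as pd

df = pd.read_csv("daily_clicks.csv", parse_dates=["date"])   # assumed columns: date, device, country, clicks
cutover = pd.Timestamp("2025-11-01")                         # hypothetical date the drop started

before = df[df["date"] < cutover].groupby(["device", "country"])["clicks"].mean()
after = df[df["date"] >= cutover].groupby(["device", "country"])["clicks"].mean()

delta = (after - before).sort_values()
print(delta.head(10))   # the segments that lost the most average daily clicks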

How can I speed up reprocessing after fixes outside of URL testing?

Submit updated sitemaps that include only the corrected URLs to create a tight crawl target. Use the URL Inspection tool for a sample of critical pages to request recrawl, then watch for status flips in batches rather than one-by-one. Keep a change log (date, fix, files touched) so you can correlate validation outcomes with deployments.
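The "fixes-only" sitemap can be generated and registered in a few lines. This sketch assumes an illustrative list of corrected URLs, that you upload the generated file to your server yourself, and the same service-account client as earlier (with write scope this time).

# Build a sitemap containing only corrected URLs, then register it with Search Console.
from datetime import date
from xml.etree import ElementTree as ET

from google.oauth2 import service_account
from googleapiclient.discovery import build

FIXED_URLS = [
    "https://www.example.com/fixed-page-1/",
    "https://www.example.com/fixed-page-2/",
]
SITE_URL = "https://www.example.com/"
SITEMAP_URL = "https://www.example.com/sitemap-fixes.xml"   # must be uploaded to the site first

# Minimal <urlset> with today's lastmod for each corrected URL.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in FIXED_URLS:
    node = ET.SubElement(urlset, "url")
    ET.SubElement(node, "loc").text = url
    ET.SubElement(node, "lastmod").text = date.today().isoformat()
ET.ElementTree(urlset).write("sitemap-fixes.xml", encoding="utf-8", xml_declaration=True)

# After uploading sitemap-fixes.xml to the server, register it with Search Console.
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=["https://www.googleapis.com/auth/webmasters"]
)
service = build("searchconsole", "v1", credentials=credentials)
service.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITEMAP_URL).execute()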

What reporting cadence and KPIs work well for stakeholders?

Weekly: top movers (queries/pages), new errors detected, and any validation in progress. Monthly: click-through rate by query theme, device breakdown, and growth of indexed key pages. Quarterly: market/device share shifts, core topic visibility, and a roadmap linking Search Console findings to upcoming content and technical work.
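The weekly "top movers" table is easy to automate once exports are in place. The sketch below assumes two CSV exports you produce yourself, one per week, each with a query column and a clicks column; the file names are placeholders.

# Rank the queries that gained or lost the most clicks week over week.
import pandas as pd

this_week = pd.read_csv("queries_this_week.csv")   # assumed columns: query, clicks
last_week = pd.read_csv("queries_last_week.csv")   # assumed columns: query, clicks

movers = this_week.merge(last_week, on="query", how="outer",
                         suffixes=("_now", "_prev")).fillna(0)
movers["delta"] = movers["clicks_now"] - movers["clicks_prev"]

print("Top gainers:")
print(movers.nlargest(10, "delta")[["query", "clicks_prev", "clicks_now", "delta"]])
print("Top losers:")
print(movers.nsmallest(10, "delta")[["query", "clicks_prev", "clicks_now", "delta"]])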


Written by Mike Zhmudikov, SEO Director

Mike’s influence is deeply embedded in the success narratives of our projects. His ability to foresee market trends, coupled with his adeptness at blending technical SEO knowledge with managerial acumen, has culminated in a track record of measurable outcomes and satisfied clientele.
