
Master the URL Inspection Tool in Google Search Console!


The URL Inspection Tool in Google Search Console is essential when you want your web pages to appear in Google search results as quickly as possible. By submitting a request through the tool, you can prompt Google to index a page, which is why so many site owners want to master it for SEO.

In this article, we explain how to use the URL Inspection Tool effectively so that your web pages get indexed.


What is the URL Inspection Tool? 

The URL Inspection Tool is one of the features of Google Search Console, a tool provided for analyzing how your site performs in Google Search and for checking whether it has any issues.

It is widely used and indispensable for SEO, because it supports the most basic step of all: getting your site indexed by the search engine.

Whether your site appears at all when users around the world search for related topics on Google depends entirely on indexing. Ranking higher or lower is a separate matter; if a page is not displayed at all, it is as if it does not exist.

To get indexed, you need to encourage the search engine’s crawlers to crawl your site and index your web pages. If you just create your site and wait, it might never be discovered. You need to request the crawlers to visit your site.

Only after Google’s robots have visited your site and judged it to be problem-free will it be indexed. While natural visits via external links can also lead to indexing, that usually takes a long time. With the URL Inspection Tool, your pages can be indexed by Google within a day or two.

The tool was previously known as “Fetch as Google” and was officially replaced in 2018. It notifies Google’s crawler of the existence of your pages so that they can be recognized and indexed; a page that is not indexed is effectively a page that does not exist.

Encouraging the robots to index your pages quickly and reliably is therefore the essential first step in SEO.

Things You Can Do with the URL Inspection Tool

Here’s what you can do with the URL Inspection Tool.

  • Check whether a URL is indexed: find out whether Google has been able to index a page and, if not, why (see the API sketch after this list).
  • Test the live URL: check whether a page on your site is eligible for indexing.
  • Request indexing for a URL: ask Google to crawl the URL, and confirm how it was fetched via a screenshot.
  • Troubleshoot: discover why a page hasn’t been indexed.
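For routine checks across many pages, the same index-status information is also available programmatically through the Search Console URL Inspection API. The sketch below is a minimal example, assuming the API has been enabled for your Google Cloud project, the property is verified, and you have an OAuth 2.0 access token with the Search Console (webmasters) scope; the endpoint and response fields follow the public API documentation and should be confirmed against the current docs. Note that the API only reports status; requesting indexing itself still happens in the Search Console interface.

```python
import requests

# Public endpoint of the Search Console URL Inspection API.
ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def inspect_url(access_token: str, site_url: str, page_url: str) -> dict:
    """Ask Search Console how Google currently sees page_url within site_url."""
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {access_token}"},
        json={"siteUrl": site_url, "inspectionUrl": page_url},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # All values below are hypothetical placeholders.
    result = inspect_url(
        access_token="ya29.your-oauth-token",
        site_url="https://www.example.com/",           # the verified property
        page_url="https://www.example.com/new-page/",  # the page to check
    )
    index_status = result["inspectionResult"]["indexStatusResult"]
    print(index_status.get("verdict"))        # e.g. PASS / NEUTRAL / FAIL
    print(index_status.get("coverageState"))  # human-readable coverage summary
```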

Practical Guide: Using the URL Inspection Tool 

The Search Console is a tool you can use for routine checks on your site. 

Here, we’ll explain how to make an indexing request with the URL Inspection Tool. It’s advisable to check Search Console regularly, not only after creating or editing web pages but also so that you can quickly notice any issues or errors that arise.

First, log into the Search Console.

Prepare the URL of the web page you want to index in text form beforehand. Once logged in, you’ll find a search box at the top.

Enter the URL you want to register and hit Enter. If the page is not yet indexed, you should see “URL is not on Google.”

At the top right of the screen, you’ll see “Test Live URL”; click it. If the result says “URL is available to Google,” you’re set.

Next, look at the menu on the left and click on the URL Inspection Tool.

Enter the URL you want to register in the search box at the top and hit Enter again.

If everything is working correctly, you’ll see a message saying “Retrieving data from Google Index.”

Once the result is displayed, click “Request Indexing.” The check can take a little time, but when the confirmation that indexing has been requested appears, you’re done.
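If you keep the URLs you plan to submit in a plain text file, as suggested above, a small helper can weed out obvious problems (relative paths, stray fragments, duplicates) before you paste them into the search box one by one. This is only a convenience sketch; the file name and the normalization rules are assumptions, not anything Search Console requires.

```python
from typing import Optional
from urllib.parse import urlsplit, urlunsplit

def normalize(url: str) -> Optional[str]:
    """Return a cleaned absolute URL, or None if it is not worth submitting."""
    parts = urlsplit(url.strip())
    if parts.scheme not in ("http", "https") or not parts.netloc:
        return None  # relative or malformed URL
    # Drop the fragment: anything after "#" is irrelevant for indexing.
    return urlunsplit((parts.scheme, parts.netloc, parts.path or "/", parts.query, ""))

if __name__ == "__main__":
    # "urls.txt" is a hypothetical file with one URL per line.
    with open("urls.txt", encoding="utf-8") as handle:
        cleaned = {normalize(line) for line in handle if line.strip()}
    cleaned.discard(None)
    for url in sorted(cleaned):
        print(url)
```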

Advanced Use: Mastering the URL Inspection Tool Features 

In the section above, we used the URL Inspection Tool to request crawling and indexing. However, the status panel displays various pieces of information, and it can be confusing to work out what they all mean.

In this advanced section, we’ll look at two additional features you might want to master besides requests.

Feature 1: Focus on the Status 

The status panel shows several kinds of information. For example, if a URL is indexed but Google considers something about it suboptimal, the status may read “URL is on Google, but has issues.”

This is like a yellow card: the URL is indexed and will appear in search results, which is reassuring, but there are areas that are not optimized and could be improved.

A full error is when the URL is not indexed at all. If the status shows an indexing error or “URL is not on Google,” then as far as search is concerned, the page effectively does not exist.

Don’t just leave it: promptly investigate the cause and request indexing again. Be aware, though, that the display can lag; if the status doesn’t change immediately after you address the issue, it may well have cleared up when you check again a day later.

Feature 2: Test the Live URL

The live URL test, as the name suggests, lets you check the currently published version of a page. When you click it, the tool fetches the data for the specified URL directly, so you can see how the live page appears rather than relying on the indexed copy. It’s common for a page to render differently on the server than it did on your PC, and any unexpected display needs correcting.

Also check that there are no discrepancies between the live content and the information registered in the index. For instance, if you’ve changed a web page after it was indexed, the indexed information may be outdated.

By running a live URL test, you can compare the two versions on screen and see concretely how the page appears to users. Clicking the screenshot tab in the result window shows a rendered screenshot of what the crawler fetched, which is useful when you want to see exactly what information Google has.

Things to Avoid When Using the URL Inspection Tool 

The URL Inspection Tool is essential for SEO, but there are a few things you should avoid. When indexing doesn’t go as planned, it can be tempting to rush, but be careful: the URL Inspection Tool can suddenly become unavailable to you.


Requesting Indexing Multiple Times in One Day

With the old Fetch as Google, there were published limits on how many indexing requests you could make per week and per month. For the URL Inspection Tool no specific number is given, but Google does state that there is a limit on how many indexing requests you can make in a day.

You might feel the urge to request again if a newly created page isn’t reflected after a few minutes. Repeat this too often, though, and you can hit the limit and suddenly find yourself unable to request indexing at all. Since the exact number isn’t specified, it’s best simply to avoid submitting the same request over and over.

Inspecting URLs with a noindex Tag 

Inspecting a page that carries a noindex tag will result in an indexing error. You may also see a coverage message such as “Submitted URL has a noindex tag.” The noindex tag is used to prevent indexing not only by Google but by search engines in general.

For instance, to control indexing, you might deliberately use a noindex tag on duplicate content or content with low information value. Make sure to check that the URL you are requesting does not include this tag.
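Before requesting indexing, you can quickly confirm a URL is not blocked in this way. The sketch below is a minimal check, assuming the robots meta tag and the X-Robots-Tag HTTP header are the mechanisms in play; it does not replace checking robots.txt or the report in Search Console itself.

```python
import requests
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags in the page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def has_noindex(url: str) -> bool:
    """True if the page signals noindex via its meta tag or X-Robots-Tag header."""
    response = requests.get(url, timeout=30)
    if "noindex" in response.headers.get("X-Robots-Tag", "").lower():
        return True
    parser = RobotsMetaParser()
    parser.feed(response.text)
    return any("noindex" in directive for directive in parser.directives)

if __name__ == "__main__":
    # Hypothetical URL for illustration.
    print(has_noindex("https://www.example.com/new-page/"))
```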

Inspecting URLs of Redirect Source Pages 

A redirect is a mechanism that automatically forwards users from one URL to another, used for example to guide visitors smoothly from an old site to a new one. The crawler follows the redirect rather than indexing the source page itself, so requesting indexing for the source URL of a redirect accomplishes nothing; submit the destination URL instead.
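You can catch this case ahead of time by checking whether a URL answers with a redirect before submitting it. A minimal sketch, assuming a single hop is enough to identify the problem:

```python
from typing import Optional
import requests

def redirect_target(url: str) -> Optional[str]:
    """Return the redirect destination if the URL responds with a 3xx, else None."""
    # allow_redirects=False stops requests from following the redirect chain,
    # so we see the immediate response for the submitted URL itself.
    response = requests.head(url, allow_redirects=False, timeout=30)
    if 300 <= response.status_code < 400:
        return response.headers.get("Location")
    return None

if __name__ == "__main__":
    # Hypothetical old URL for illustration.
    target = redirect_target("https://www.example.com/old-page/")
    if target:
        print(f"Redirects to {target}; request indexing for that URL instead.")
    else:
        print("No redirect detected.")
```

A HEAD request is used because only the headers matter here; some servers answer HEAD differently from GET, so fall back to a GET if the result looks odd.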

Inspecting URLs of Images and PDF Files 

The URL Inspection Tool does not work on image or PDF URLs. Image search and PDF search themselves function normally, but entering the URL of such a file in the tool will simply return a not-indexed status. Understand that the tool is intended primarily for web page URLs.
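A quick way to filter such files out is to look at the Content-Type each URL returns before submitting it. A minimal sketch, assuming anything that is not an HTML document should be skipped:

```python
import requests

HTML_TYPES = ("text/html", "application/xhtml+xml")

def is_html_page(url: str) -> bool:
    """True if the URL serves an HTML document rather than an image, PDF, etc."""
    response = requests.head(url, allow_redirects=True, timeout=30)
    content_type = response.headers.get("Content-Type", "")
    return content_type.split(";")[0].strip().lower() in HTML_TYPES

if __name__ == "__main__":
    # Hypothetical URLs for illustration.
    for url in (
        "https://www.example.com/new-page/",
        "https://www.example.com/brochure.pdf",
        "https://www.example.com/images/logo.png",
    ):
        print(url, "->", "submit" if is_html_page(url) else "skip")
```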


URL Inspection and Error Resolution Methods 

It’s not uncommon to panic when an error appears during a URL inspection and to end up making repeated requests until you hit the limit. If an error occurs, calmly review the details, resolve the issue, and only then make an appropriate re-request. The nature of errors varies widely, so it isn’t always obvious what needs improving. Here, we’ll explain the common causes behind the following two situations:

  • Causes of “URL is on Google, but has issues”
  • Causes of indexing errors

Causes of “URL is on Google, but has issues”

This status indicates that although the page’s URL is indexed, it has been evaluated negatively in some respect. A common issue, often flagged under the “Enhancements” section, is “Mobile Usability.”

The notice might say, “This page is not mobile-friendly,” but what does that mean? Simply put, a page that is not mobile-friendly is difficult to view on devices such as smartphones. Google’s position is that pages that are easy to view on mobile devices improve user convenience.

Given that more than half of all users now access sites from smartphones, paying attention to mobile-friendliness is crucial for SEO.

Google recommends creating sites with responsive web design. 

This factor has been part of Google’s evaluation since 2015, and its impact has only grown since, with smartphone usability becoming increasingly critical.

If this status appears, consider specific improvements based on Google’s mobile-friendly criteria, which include:

  • Not using software that is uncommon on mobile devices, such as Flash.
  • Using text that is readable without zooming.
  • Sizing content to match the screen so that users do not need to scroll horizontally or zoom.
  • Placing links far enough apart that the desired link can be tapped easily.

Review and improve the relevant parts of the page accordingly. It’s also worth checking from an actual smartphone to confirm that the layout is easy to read and use.
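As a very rough automated signal, you can check whether a page declares a viewport meta tag, which responsive designs normally include. This is only a heuristic sketch under that assumption; it is no substitute for Google’s Mobile Usability report or for checking on a real device.

```python
import requests
from html.parser import HTMLParser

class ViewportParser(HTMLParser):
    """Record whether a <meta name="viewport"> tag is present."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "viewport":
            self.has_viewport = True

def declares_viewport(url: str) -> bool:
    """True if the page contains a viewport meta tag (a basic responsive signal)."""
    parser = ViewportParser()
    parser.feed(requests.get(url, timeout=30).text)
    return parser.has_viewport

if __name__ == "__main__":
    # Hypothetical URL for illustration.
    if declares_viewport("https://www.example.com/new-page/"):
        print("Viewport meta tag found.")
    else:
        print("No viewport meta tag; the page may not be mobile-friendly.")
```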

Causes of Indexing Errors 

This type of error requires swift action, but the causes vary, so check from several angles. For a brand-new request it may simply be that crawling hasn’t finished yet, in which case wait a few days.

Google typically alerts you by email when an index coverage error occurs, so if the page has been registered at least once, you should notice it. First, use Search Console to pinpoint when the error occurred and what kind it is.

Click on “Coverage” in the menu, then click on the error to check the date it occurred. After that, review the details of the error.

Types of Coverage Errors 

There are nine types of coverage errors:

  • Server error (5xx)
  • Redirect error
  • Submitted URL blocked by robots.txt
  • Submitted URL has a noindex tag
  • Submitted URL looks like a soft 404 error
  • Submitted URL returned an unauthorized request (401)
  • Submitted URL not found (404)
  • Submitted URL returned a 403 error
  • Submitted URL was blocked due to other 4xx issues

In every case, the page will not be indexed and so cannot be reached from Google Search, making urgent action necessary.
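Several of these errors correspond directly to the HTTP status code the URL returns, so the first diagnostic step can be scripted. A minimal sketch, assuming a plain GET roughly reflects what the crawler receives (it does not cover robots.txt blocks, noindex, or soft 404s, which need separate checks):

```python
import requests

def diagnose(url: str) -> str:
    """Map the URL's HTTP response to the coverage-error family it suggests."""
    try:
        response = requests.get(url, allow_redirects=False, timeout=30)
    except requests.RequestException as exc:
        return f"Request failed entirely: {exc}"
    code = response.status_code
    if 500 <= code < 600:
        return f"{code}: server error (5xx)"
    if 300 <= code < 400:
        return f"{code}: redirect - check the destination URL instead"
    if code == 401:
        return "401: unauthorized request"
    if code == 403:
        return "403: access forbidden"
    if code == 404:
        return "404: not found"
    if 400 <= code < 500:
        return f"{code}: blocked due to another 4xx issue"
    return f"{code}: reachable - look into robots.txt, noindex, or soft 404 causes"

if __name__ == "__main__":
    # Hypothetical URL for illustration.
    print(diagnose("https://www.example.com/broken-page/"))
```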

Summary 

Google’s URL Inspection Tool isn’t difficult to use, and used properly it makes indexing smoother and strengthens your SEO. However, you need to know how to use its features correctly and how to respond when things don’t go as planned. Remember that until Google’s crawler has indexed a page, that page effectively does not exist, and pages the crawler struggles to recognize, or whose evaluation is repeatedly delayed, may end up judged as low quality. To avoid such situations, check Search Console regularly and make full use of the URL Inspection Tool as needed. Above all, keep providing high-quality content and keep working to improve user satisfaction.


Author Profile

SEO Consultant

Mr. Takeshi Amano, CEO of Admano Co., Ltd.

Takeshi Amano is a graduate of the Faculty of Law at Nihon University. After 12 years in the advertising agency industry, he encountered SEO in its early days and began researching it, teaching himself through experiments and verification on more than 100 websites. Drawing on that expertise, he founded Admano Co., Ltd., now in its 11th year of operation. Mr. Amano handles sales, SEO consulting, web analytics (he holds the Google Analytics Individual Qualification), coding, and website development, and the company has managed SEO for more than 2,000 websites to date.
