In this Google Search Console tutorial, we will cover how to use external reports and tools in Search Console correctly.


Test your robots.txt with the robots.txt Tester


The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search. Open robots.txt Tester.


You can submit a URL to the robots.txt Tester tool. The tool operates as Googlebot would to check your robots.txt file and verifies that your URL has been blocked properly.


How to test a robots.txt file


  1. Open the tester tool for your site, and scroll through the robots.txt code to locate the highlighted syntax warnings and logic errors. The number of syntax warnings and logic errors is shown immediately below the editor.
  2. Type in the URL of a page on your site in the text box at the bottom of the page.
  3. Select the user-agent you want to simulate in the dropdown list to the right of the text box.
  4. Click the TEST button to test access.
  5. Check whether the TEST button now reads ACCEPTED or BLOCKED to find out if the URL you entered is blocked from Google web crawlers.
  6. Edit the file on the page and retest as necessary. Note that changes made in the page are not saved to your site! See the next step.
  7. Copy your changes to the robots.txt file on your site. This tool does not make changes to the actual file on your site; it only tests against the copy hosted in the tool.
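The same kind of check can be scripted offline with Python's standard-library robots.txt parser. This is a minimal sketch, using a hypothetical robots.txt that blocks the Googlebot-Image crawler from one directory; note that Python's parser only approximates Google's matching rules, so the Tester tool remains the authoritative check.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks Googlebot-Image from a private
# images directory while leaving all other crawlers unrestricted.
robots_txt = """\
User-agent: Googlebot-Image
Disallow: /images/private/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot-Image is blocked from the private images directory...
print(parser.can_fetch("Googlebot-Image", "/images/private/photo.jpg"))  # False
# ...but the general Googlebot crawler is not.
print(parser.can_fetch("Googlebot", "/images/private/photo.jpg"))  # True
```

As in the Tester tool, edits to the string here change nothing on your server; you still have to copy any working rules into the live robots.txt file.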


Limitations of the robots.txt Tester tool:


  • Changes you make in the tool editor are not automatically saved to your web server. You need to copy and paste the content from the editor into the robots.txt file stored on your server.
  • The robots.txt Tester tool only tests your robots.txt against Google user-agents and web crawlers, such as Googlebot. Google cannot predict how other web crawlers interpret your robots.txt file.


Access Search Console data in Google Analytics


If you associate a Google Analytics property with a site in your Search Console account, you’ll be able to see Search Console data in your Google Analytics reports. Note that you can only associate a website; you cannot associate an app. Associate Properties


You must be an owner of the Google Analytics property to be able to associate it with a website in Search Console. You can open the Google Analytics association page from the property settings dropdown in Search Console.


When you associate a site in your Search Console account with a Google Analytics property, by default Search Console data is enabled for all profiles associated with that property. As a result, anybody with access to that Google Analytics property may be able to see Search Console data for that site. For example, if a Google Analytics admin adds a user to a profile, that user may be able to see Search Console data in Search Optimization reports.


A site can be associated with only one property, and vice versa. Creating a new association removes the previously existing association.


Every Google Analytics property can have a number of views. When you associate a site with a property, clicking a link to Google Analytics from Search Console will take you to that property’s default view. (If you previously associated your site with a different view, clicking a link will now take you to the default view instead. If you want to see a different view, you can switch views from within Google Analytics.)


If your site is already associated with a Google Analytics property, it could be for a couple of reasons. Either you already used Google Analytics to associate this property with the site, or another site owner has made the association.


If your site is associated with an Analytics property you don’t recognize (and don’t want), it may be because another site owner associated the site with an Analytics property you don’t own. In this case, you can delete the association and create a new one.


If your site used to be associated with a property, but no longer is, it may be that the property was later associated with a different site. (Remember, a site can be associated with only one property. Creating a new association will remove the previously existing association.)


You can also create an association using the Analytics admin page if you’re an account administrator for the Google Analytics property. Google Analytics account administrators can move their Analytics property from one Analytics account to another. Any associated Search Console properties will maintain their association as part of the move. After the move, any users of the new Analytics account will be able to see data from the associated Search Console property without notification in Search Console. Learn more.


Removing Search Console data from Google Analytics


To remove Search Console data from a Google Analytics property, unlink the association using Search Console’s association page, or manage the association on the Analytics admin page (if you’re an administrator for the Google Analytics property).


Why doesn’t Search Console data match Google Analytics data?


Search Console data may differ from the data displayed in other tools, such as Google Analytics. Possible reasons for this include:


  • Search Console does some additional data processing—for example, to handle duplicate content and visits from robots—that may cause your stats to differ from stats listed in other sources.
  • Some tools, such as Google Analytics, track traffic only from users who have enabled JavaScript in their browser.
  • Google Analytics tracks visits only to pages that include the correctly configured Analytics JavaScript code. If pages on the site don’t have the code, Analytics will not track visits to those pages. Visits to pages without the Analytics tracking code will, however, be tracked in Search Console if users reach them via search results or if Google crawls or otherwise discovers them.
  • Some tools define “keywords” differently. For example:
    • The Keywords page in Search Console displays the most significant words Google found on your site.
    • The Keywords tool in Google Ads displays the total number of user queries for that keyword across the web.
    • Analytics uses the term “keywords” to describe both search engine queries and Google Ads paid keywords.
    • The Search Console Search Analytics page shows the total number of keyword search queries in which your page’s listing was seen in search results; this is a smaller number. Also, Search Console rounds search query data to one or two significant digits.


Block crawling of parameterized duplicate content


When and how to use the URL Parameters tool


URL parameters and duplicate content


If your site uses URL parameters for insignificant page variations (for instance, color=red vs. color=green), or if your site uses parameters that can show essentially the same content using different URLs, Google might be crawling your site inefficiently.


Here is an example of URLs that lead to essentially duplicate content, distinguished only by different parameters:


  • A static, non-parameterized page URL.
  • A URL that uses the parameters category and color to deliver the same content as the non-parameterized page.
  • A URL that includes parameters to limit the number of results, plus a session ID for the user, while showing the same content.


If you have many such URL parameters on your site, you might benefit from using the URL Parameters tool to reduce crawling of duplicate URLs. Important: If your site serves duplicate content at different URLs without using parameters, you should define a canonical page rather than block crawling, as described on this page.
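One way to see how parameterized duplicates collapse onto a single page is to normalize URLs by stripping the parameters that don't change content. This is a hypothetical sketch: the parameter names and URLs are illustrative, not a real site's configuration.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical set of parameters that do not change page content
# (session and tracking parameters) on an example site.
INSIGNIFICANT_PARAMS = {"sessionid", "ref", "utm_source"}

def canonical_form(url: str) -> str:
    """Strip insignificant query parameters and sort the rest, so
    duplicate URLs collapse to one canonical form."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in INSIGNIFICANT_PARAMS]
    return urlunparse(parts._replace(query=urlencode(sorted(kept))))

a = "https://example.com/shirts?color=red&sessionid=123"
b = "https://example.com/shirts?sessionid=999&color=red"
print(canonical_form(a) == canonical_form(b))  # True: same page content
```

Two URLs that normalize to the same string are the kind of duplicates the URL Parameters tool is meant to keep Googlebot from crawling repeatedly.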


Block crawling of URLs containing specific parameters


You can prevent Google from crawling URLs that contain specific parameters, or parameters with specific values, in order to avoid crawling duplicate pages.




You should use the URL Parameters tool only if your site fulfills ALL of the following requirements.


  • Your site has more than 1,000 pages, AND
  • In your logs, you see a significant number of duplicate pages being indexed by Googlebot, in which all duplicate pages vary only by URL parameters.


Incorrect usage warning: You should use the URL Parameters tool only if your site fulfills the requirements above and you are an experienced SEO. Using the URL Parameters tool incorrectly can cause Google to ignore important pages on your site, with no warning or reporting about the ignored pages. If this sounds a little dire, it’s because many people misuse the tool or use it unnecessarily. If you are unsure whether you are using this tool correctly, it’s better not to use it.




You can specify Google’s behavior when crawling your site with specific parameters. Parameter behavior applies to the entire property; you cannot limit crawling behavior for a given parameter to a specific URL or branch of your site.


To use the URL Parameters tool:


  1. Verify that your site meets the requirements listed previously.
  2. Open the URL Parameters tool.
  3. Either Edit an existing parameter or click Add parameter to create a new one. Note that this tool is case-sensitive, so type your parameter name exactly as it appears in your URL.
  4. Specify whether your URL parameter affects page content:
     • No: Doesn’t affect page content: Your parameter doesn’t affect how page content is presented. This type of parameter might be used to track visits and referrers, but has no effect on the actual content of the page (for example, sessionID or userName). If Google finds many URLs that differ only in this parameter value, it will crawl one of them. Google tries to detect these types of parameters, but if your logs indicate that such a parameter is not being identified correctly, you can specify it here.
     • Yes: Changes, reorders, or narrows page content: Your parameter can change page content. Examples might be brand, gender, country, or sortorder. Choose the purpose of the parameter:
       • Sorts (for example, sort=price_ascending): Changes the order in which content is presented.
       • Narrows (for example, t-shirt_size=XS): Filters the content on the page.
       • Specifies (for example, store=women): Determines the general class of content displayed on a page. If this specifies an exact item, and it is the only way to reach this content, you should select “Every URL” for the behavior.
       • Translates (for example, lang=fr): Displays a translated version of the content. If you use a parameter to show different languages, you probably do want Google to crawl the translated versions; use hreflang to indicate language variants of your page rather than blocking content with this tool.
       • Paginates (for example, page=2): Displays a specific page of a long listing or article.
  5. Which URLs with this parameter should Googlebot crawl? Choose an option to indicate Google’s behavior when encountering URLs that contain this parameter:
     • Let Googlebot decide: This setting is the default for already-known parameters. Select it if you’re unsure of a parameter’s behavior, or if the parameter behavior changes for different parts of the site. Googlebot can analyze your site to determine how best to handle the parameter.
     • Every URL: Tells Google to crawl every URL with this parameter; URLs with unique values of this parameter do not contain duplicate content. For example, after you choose this setting for URLs containing a productid parameter, Google considers URLs with different productid values to be entirely different pages.
     • Only URLs with value: Tells Google to crawl only URLs where your URL parameter is set to a specified value. URLs with a different parameter value won’t be crawled. This is particularly useful if your site uses the parameter value to change the order in which otherwise identical content is displayed. For example, you could use this setting to tell Googlebot to crawl only URLs where sort=price_low, to avoid crawling duplicate orderings of the same content.
     • No URLs: Tells Google not to crawl any URLs containing the parameter you entered. For example, you can tell Google not to crawl URLs with parameters such as pricefrom and priceto, to prevent unnecessary crawling of duplicate content already available without those filters.
  6. If your site uses multiple parameters in a URL, see Managing URLs with multiple parameters.
  7. Note that your rules might be inherited by other properties (see Inheritance of parameter rules).


Inheritance of parameter rules


If you have separate properties for http and https, or separate parent and child properties, your parameter settings might be inherited between properties, according to these rules:


  • http/https: If only one of your http or https properties has rules, the rules are applied to both. If both your http and https properties have their own rules defined, each applies only its own rules.
  • Parent/child: If a parent property has parameter rules, any child property without parameter rules inherits those rules; any child property with its own parameter rules uses only its own rules. Note that subdomains count as children of their parent domains.
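The inheritance rules above can be sketched as a tiny resolution function. This is an illustrative sketch, not part of Search Console; the rule names are hypothetical.

```python
# Sketch of the inheritance rules above: a property uses its own
# parameter rules if it has any; otherwise it inherits its parent's
# (or sibling http/https property's) rules.
def effective_rules(own_rules, inherited_rules):
    """A property's own rules always win; a property with no rules
    of its own inherits."""
    return own_rules if own_rules else inherited_rules

# Child property with no rules inherits from the parent domain.
print(effective_rules({}, {"sessionid": "no_urls"}))
# Child with its own rules ignores the parent's entirely.
print(effective_rules({"sort": "every"}, {"sessionid": "no_urls"}))
```

Note that inheritance is all-or-nothing: a child with any rules of its own does not merge them with the parent's rules.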


Managing URLs with multiple parameters


A single URL can contain many parameters; you can specify crawling settings for each one separately. If a single URL contains multiple managed parameters, Google will obey the following rule when deciding whether to crawl the URL:


The more restrictive parameter settings override the less restrictive parameter settings.


For example, below are three URL parameters and their respective Google crawling settings:


Parameter Parameter crawl settings
shopping-category Crawl all URLs with this parameter
sort-by Crawl only URLs with value = production-year
sort-order Crawl only URLs with value = asc


Example 1


Google won’t crawl a URL where the sort-by parameter is not set to production-year, even though the URL contains an allowed sort-order value (asc).


Example 2


Google can crawl a URL where the sort-by and sort-order values both match the allowed settings.


Example 3


Google can crawl URLs that don’t contain any of the managed parameters.
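The "most restrictive wins" rule illustrated by the examples above can be sketched in a few lines. The settings dictionary mirrors the table earlier in this section; the code itself is a hypothetical illustration, not how Googlebot is actually implemented.

```python
from urllib.parse import urlparse, parse_qsl

# Crawl settings mirroring the table above: each managed parameter maps
# to either "every" (crawl all values) or "only" with a single allowed value.
SETTINGS = {
    "shopping-category": ("every", None),
    "sort-by": ("only", "production-year"),
    "sort-order": ("only", "asc"),
}

def should_crawl(url: str) -> bool:
    """Most restrictive setting wins: if any managed parameter in the
    URL carries a disallowed value, the whole URL is skipped."""
    for name, value in parse_qsl(urlparse(url).query):
        rule = SETTINGS.get(name)
        if rule is None:
            continue  # unmanaged parameter: imposes no restriction
        mode, allowed = rule
        if mode == "only" and value != allowed:
            return False
    return True

# sort-by=price is not the allowed value, so the URL is skipped (Example 1).
print(should_crawl("https://example.com/?sort-by=price&sort-order=asc"))  # False
# Both managed parameters match their allowed values (Example 2).
print(should_crawl("https://example.com/?sort-by=production-year&sort-order=asc"))  # True
```

A URL with no managed parameters at all passes every check and remains crawlable, matching Example 3.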


International Targeting report


Use the International Targeting report to monitor your hreflang errors, or to choose a country that should be prioritized for your search results. This report has the following sections:


  • The Language section: Monitor the usage and errors of hreflang tags on your site.
  • The Country section: Set a site-wide country target for your entire site, if desired.


Open the International Targeting Report. To learn about different mechanisms for targeting your site at specific regions or languages, read the Multi-regional and multilingual sites topics.


Language tab


The Language section of the International Targeting page shows the following hreflang errors in your site. Up to 1,000 errors can be shown.


No return tags: An hreflang tag has no matching return tag on the target page. The table aggregates missing return tags by implementation and locale for the following implementations:
  • Page-level: The total number of hreflang errors by language in the <head> section of your pages. The details page shows a maximum of 1,000 URLs on your site, paired with the alternate-language page that’s missing a return tag to its mate.
  • Sitemap: The total number of hreflang errors found in your sitemap. The details page lists the sitemap that indicates the URL pairing and the alternate URL that has no return link.
  • HTTP headers: The total number of hreflang errors for alternate files provided in your HTTP header configuration. The details page lists the configuration and alternate URLs with no return link.
Unknown language code: For unknown language (and optional country) codes that you have indicated on your site, the table displays the locale followed by “unknown language code.” As with the no-return-tags error, you can drill down to see URL-level details and total counts of unknown language codes for that specific locale.


Click on a row to inspect error details, with a maximum of 1,000 rows.
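The "no return tags" check boils down to reciprocity: every hreflang link from page A to page B must be mirrored by a link from B back to A. A minimal sketch of that check, over a hypothetical map of pages to their declared alternates (the URLs and locales are illustrative only):

```python
# Hypothetical hreflang data: each page maps locale codes to the
# alternate-language URL it declares.
hreflang_links = {
    "https://example.com/en/": {"fr": "https://example.com/fr/"},
    "https://example.com/fr/": {},  # missing the return tag to /en/
}

def missing_return_tags(links):
    """Return (source, target) pairs where the target page declares no
    hreflang tag pointing back at the source."""
    errors = []
    for source, alternates in links.items():
        for target in alternates.values():
            back_links = links.get(target, {}).values()
            if source not in back_links:
                errors.append((source, target))
    return errors

print(missing_return_tags(hreflang_links))
# [('https://example.com/en/', 'https://example.com/fr/')]
```

Each pair this sketch reports corresponds to one row of the kind the Language tab surfaces: a page whose alternate never links back.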


Country tab


Google Search returns the most relevant and useful sites for a user. Because of this, search results can differ between a user in Ireland and a user in France.


If your site has a generic top-level domain, such as .com or .org, you can help Google determine which countries are most important to you by specifying a target country in the International Targeting report. If your site has a country-code top-level domain (such as .ie or .fr), it is already associated with a geographic region (in this example, Ireland or France), so you won’t be able to specify a geographic location.


Setting a country target


  1. On the International Targeting report, click the Country tab.
  2. Check the Geographic target checkbox and choose your country target. If you want to ensure that your site is not associated with any country or region, select Unlisted in the drop-down list.


This setting is only for geographic data. If you’re targeting users in different locations—for example, if you have a site in French that you want users in France, Canada, and Mali to read—don’t use this tool to set France as a geographic target. A good example of where it would be useful is for a restaurant website: if the restaurant is in Canada, it’s probably not of interest to folks in France. But if your content is in French and is of interest to people in multiple countries/regions, it’s probably better not to restrict it.
