- author: Google Search Central
Troubleshooting Google Search Issues
Introduction
When it comes to website visibility on Google, it's essential to address any issues that might prevent your website from appearing in search results. In this article, we will explore common questions and concerns raised by webmasters and provide practical solutions to help improve their website's presence on Google.
Issue 1: Website not Appearing on Google Search
Shimmy approached us with a concern about their website not appearing in Google Search. Upon inspecting the site, we discovered that it was returning a 403 HTTP status code when crawled by Googlebot with a smartphone user agent. A 403 means the server is refusing access, so Googlebot cannot fetch the page and there is nothing for Google to index. To verify this issue, Shimmy can use the URL Inspection tool in Google Search Console, which shows in detail how Google crawls and indexes a page. Resolving the 403 error and ensuring that the website is accessible to Googlebot will help improve its visibility in Google Search.
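For a quick local check before reaching for the URL Inspection tool, a script along these lines can show the status code a page returns to a smartphone-style user agent. This is a minimal sketch using the Python requests library; the URL is a placeholder and the user-agent string is only an approximation of the one Googlebot uses, so a local check is not a substitute for how Googlebot actually fetches the page.

```python
import requests

# Hypothetical example URL; replace with the page you want to check.
URL = "https://www.example.com/"

# Illustrative smartphone user-agent string, similar in shape to Googlebot's.
SMARTPHONE_UA = (
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Mobile Safari/537.36 "
    "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

response = requests.get(URL, headers={"User-Agent": SMARTPHONE_UA}, timeout=10)

# A 403 here means the server refuses access, so there is nothing Google can index.
print(response.status_code)
```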
Issue 2: Managing 404 Errors and 301 Redirects
Gary from the Search team was approached by a user who wanted to know whether having millions of 404 error pages or millions of 301 redirects would be more detrimental to their website. Gary explained that both 404 errors and 301 redirects are harmless from an indexing perspective. The user needs to evaluate their specific scenario and choose the approach that best suits their needs. Whether it's redirecting expired product pages to parent listing pages or showing custom 404 pages, the user should focus on what aligns with their website's objectives.
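As an illustration of the two options, here is a minimal Flask sketch with hypothetical routes and product data: it 301-redirects an expired product page to its parent listing page and serves a custom 404 page for URLs that never existed. Which option fits depends on the scenario, as Gary notes.

```python
from flask import Flask, redirect, render_template_string

app = Flask(__name__)

# Hypothetical data: expired products mapped to their parent category pages.
EXPIRED_PRODUCTS = {"blue-widget": "/category/widgets"}

@app.route("/product/<slug>")
def product(slug):
    if slug in EXPIRED_PRODUCTS:
        # Option 1: permanently redirect an expired product to its parent listing.
        return redirect(EXPIRED_PRODUCTS[slug], code=301)
    # Option 2: anything unknown gets a custom 404 page with the correct status code.
    return render_template_string("<h1>Product not found</h1>"), 404

if __name__ == "__main__":
    app.run()
```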
Issue 3: URL Format and Declared Canonical for E-commerce Sites
Martin answered a question from a user named Senior, who wanted to know the optimal URL format and declared canonical for foreign e-commerce site category pages. Martin emphasized that canonicalization depends on the site owner's recommendation through canonical tags and on how well a URL is linked from other sources. While shorter URLs are generally preferred, choosing what to canonicalize is at the owner's discretion. Martin also reassured the user that the choice between client-side and server-side rendered applications does not affect indexing or canonicalization.
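As a concrete example of the owner declaring a preference, a category page can point to its preferred URL with a canonical link element in its head. The sketch below simply assembles that element in Python; the URLs are hypothetical.

```python
# Hypothetical category page URL the site owner prefers Google to index.
PREFERRED_URL = "https://shop.example.com/category/shoes"

def canonical_tag(preferred_url: str) -> str:
    """Return the <link> element a category page would include in its <head>."""
    return f'<link rel="canonical" href="{preferred_url}">'

# Every variant of the category page (sorted, filtered, parameterized, etc.)
# can include this same element to recommend the preferred URL.
print(canonical_tag(PREFERRED_URL))
```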
Issue 4: Displaying Multiple Currencies in Search Results
John encountered an issue where only the prices in US dollars were appearing in his search results, despite his site offering other currencies. After examining John's website, we found that it dynamically adjusts the currency and price based on the user's location. As Google primarily crawls from the US, it only sees the prices in US dollars. To display multiple currencies in search results, John should establish separate URLs for each currency and ensure that the corresponding price is shown to users based on their region.
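One way to follow that advice is to give each currency its own stable URL rather than swapping prices based on the visitor's IP address. The Flask sketch below is a hypothetical illustration: the route and prices are made up, and each URL always shows the same currency regardless of where the visitor comes from, so Googlebot crawling from the US can still discover the non-USD versions.

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical price list per currency for a single product.
PRICES = {"usd": "$99", "eur": "€93", "gbp": "£81"}

# Each currency gets its own URL, e.g. /usd/product/widget and /eur/product/widget.
@app.route("/<currency>/product/widget")
def widget(currency):
    if currency not in PRICES:
        abort(404)
    return f"<h1>Widget</h1><p>Price: {PRICES[currency]}</p>"

if __name__ == "__main__":
    app.run()
```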
Issue 5: Resolving Domain Name and Hosting Concerns
A user named Vandolin faced a dilemma with their purchased domain name returning a 404 error. Gary explained that for a domain to appear in Google Search, it needs to be connected to hosting that actually serves content at that domain. Popular hosting platforms such as WordPress, Wix, or Blogger can be used for this purpose. Once the domain is serving content, publishing high-quality content and promoting it will eventually lead to its discovery by Google.
Issue 6: Correct Canonical URL Selection for Different Domains
Luisa raised concerns about Google selecting incorrect canonical URLs from a different country domain, despite correctly assigned hreflang attributes. Martin clarified that canonicalization and hreflang are related but separate concepts. Hreflang tells Google that pages are equivalent versions of the same content for different languages or regions, while canonicalization determines which URL is stored as the primary one in Google's index. In certain cases, country-specific pages with very similar content may appear in search results even if a different domain is designated as canonical. Allowing users to choose their preferred region can help address any selection discrepancies.
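To make the distinction concrete, each country version can declare its own canonical and, separately, list its alternates with hreflang. The snippet below only assembles those head elements in Python; the domains and language codes are hypothetical.

```python
# Hypothetical country versions of the same page, keyed by hreflang code.
ALTERNATES = {
    "en-us": "https://example.com/page",
    "en-gb": "https://example.co.uk/page",
    "de-de": "https://example.de/page",
}

def head_elements(own_lang: str) -> str:
    """Build the canonical and hreflang elements for one country version."""
    # Canonical: this page's own preferred URL.
    lines = [f'<link rel="canonical" href="{ALTERNATES[own_lang]}">']
    for lang, url in ALTERNATES.items():
        # hreflang: tells Google these are equivalent pages for other languages/regions.
        lines.append(f'<link rel="alternate" hreflang="{lang}" href="{url}">')
    return "\n".join(lines)

print(head_elements("en-gb"))
```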
Issue 7: Dealing with Unwanted Links in Non-native Languages
Mohammad discovered that his website had unwanted links written in Chinese, which he wanted to permanently remove. We explained that if these links are publicly accessible on Chinese-language websites and are crawled and indexed, Google may include them in search results. The Disavow Links tool can be used to indicate that these links should not be considered for ranking purposes, but it does not remove them from the web or from Google's systems. As long as the links are non-spammy and do not violate guidelines, they can safely be ignored.
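If Mohammad still decides to use the Disavow Links tool, it accepts a plain text file listing either whole domains or individual URLs, one per line. The snippet below writes such a file; the domains and URLs are placeholders.

```python
# Placeholder entries; replace with the actual linking domains or pages.
ENTRIES = [
    "# Links we do not want considered for ranking",
    "domain:spammy-links.example",                       # disavow an entire domain
    "https://another.example/page-with-unwanted-link",   # or a single URL
]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(ENTRIES) + "\n")
# The file is then uploaded through the Disavow Links tool in Search Console.
```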
Issue 8: Changing Website Address from Old to New Domain
A user sought advice on how to change their website address from an old domain to a new one. We directed them to consult the numerous online guides available for site migration, including the comprehensive guide on developers.google.com/search. These resources provide step-by-step instructions for safely transitioning to a new domain without compromising website visibility on Google.
Issue 9: Creating Unique Content for Identical Product Pages
A user questioned how to make the content on product pages unique when multiple sellers offer the same product. Martin advised providing a distinctive perspective to differentiate the content. This could include genuine reviews, personal tests conducted on the product, or additional information that goes beyond the manufacturer's default product descriptions and specifications. By offering valuable insights and unique perspectives, the content can stand out in search results.
Issue 10: Segregating Google Search Console Reports Based on Subdomains
John received a query about how to separate Google Search Console reports by subdomain. He explained that once the main domain is verified, Search Console lets users add subdomains as their own properties without additional verification steps. While subdomain-specific data might take a few days to populate in the reports, users can then monitor and analyze each subdomain's performance on its own.
Issue 11: Impact of 404 Errors on Normal Status Pages
A user was concerned whether a large number of 404 error pages in Google Search Console would negatively affect the rankings of normal status (200) pages. This worry was unfounded as Google treats 404 errors as harmless from a ranking perspective. Normal status pages are not impacted by the presence of 404 errors. Hence, webmasters need not worry about a negative impact on their website's rankings due to 404 errors.
Issue 12: Returning to Google Discover
Users who lost visibility in Google Discover were eager to find ways to regain their website's presence there. However, Martin noted that Discover is an organic feature driven by user demand and habits. While webmasters can ensure their content is indexed and adheres to Google's content guidelines, the flow of Discover traffic is unpredictable. Consequently, webmasters should not rely on Discover traffic and should focus on delivering excellent content that aligns with user interests.
Issue 13: Schema Markup for Affiliate Sites
A user with an affiliate site inquired about the appropriate Schema markup type to indicate their affiliation with another website. Responding to this inquiry, John clarified that there is no specific Schema markup for affiliate sites. Instead, webmasters should prioritize providing helpful and user-centric content while clearly labeling affiliate links using the "rel=nofollow" or "rel=sponsored" attributes. This allows search engines to understand the purpose of the links and ensures compliance with guidelines.
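For example, an affiliate link can be labeled directly in its anchor tag. The helper below is a hypothetical sketch that wraps an outbound affiliate URL with the sponsored and nofollow attributes mentioned above.

```python
def affiliate_link(url: str, text: str) -> str:
    """Return an anchor tag that labels the link as sponsored for search engines."""
    # rel="sponsored nofollow" tells crawlers this is a paid/affiliate link.
    return f'<a href="{url}" rel="sponsored nofollow">{text}</a>'

# Hypothetical affiliate URL.
print(affiliate_link("https://partner.example/product?ref=mysite", "Buy the widget"))
```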
Issue 14: Creating an Effective Robots.txt File
A user asked for advice on creating a robust robots.txt file. Gary emphasized that there is no universally good or generic robots.txt file; what it should contain depends entirely on the site.
How to Create a Good robots.txt
When it comes to creating a robots.txt file, there is no one-size-fits-all solution. A good robots.txt file should be tailored to meet your specific needs and requirements. It can be whatever you need it to be.
For example, you may want to restrict the crawling of your internal search results. This is considered a good practice, as it helps prevent search engines from indexing irrelevant pages. On the other hand, you may want to allow crawling of a specific JavaScript file to ensure proper rendering and functionality.
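As a concrete sketch of those two examples, the snippet below writes a robots.txt that blocks internal search result URLs while explicitly allowing a JavaScript file needed for rendering. The paths are hypothetical; your own rules depend on how your site is structured.

```python
# Hypothetical rules: block internal search results, allow a JS file used for rendering.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
Allow: /assets/app.js
"""

with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(ROBOTS_TXT)
# Serve this file at the root of the host, e.g. https://www.example.com/robots.txt
```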
The key is to understand your website and how it is set up. This will help you determine what rules to include in your robots.txt file. If you need guidance on creating robots.txt files, you can refer to the documentation available on developers.google.com/search.
How Google Uses Quality Raters to Review and Rate Sites
Google employs quality raters to review and rate websites, although their feedback does not directly impact search rankings. These raters provide valuable insights that help Google evaluate changes and improve search results. Quality raters assess individual websites according to specific guidelines.
If you are interested in learning more about the quality raters and their evaluation process, you can find more information at goo.gle/quality-raters.
Dealing with Redirects After a Site Migration
If you recently migrated your website and still see a large number of pages with redirects in Google Search Console, there is usually no cause for concern. Google's systems sometimes retain old URLs for an extended period of time. As long as these older URLs are properly redirecting to the new pages, it is not necessary to remove them from Google's index. Over time, these old URLs will naturally drop out of the search results.
It is important to note that any recent changes in search performance are likely unrelated to the site migration.
Addressing Website Visibility Issues
If your website domain is not appearing in Google search results, there are a few steps you can take to address the issue. Firstly, verify your domain in Google Search Console and look for any potential issues indicated there. Often, this process can help identify and resolve any problems affecting your website's visibility.
If your domain is used solely for email accounts and you do not want it to appear in search results, using a robots.txt file to block crawling can be a suitable option. However, blocking crawling does not guarantee that your pages won't show up in search results. To prevent indexing altogether, use a robots meta tag with a "noindex" value or send the X-Robots-Tag HTTP header with "noindex".
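If the domain does serve pages, a minimal sketch of the two noindex options looks like this: a robots meta tag in the HTML, or the equivalent X-Robots-Tag response header. The Flask route below is illustrative only. Note that for either signal to be seen, the page must not be blocked from crawling in robots.txt, since Google has to fetch the page to read the noindex.

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/")
def home():
    # Option 1: a meta robots tag inside the page itself.
    html = ('<html><head><meta name="robots" content="noindex"></head>'
            '<body>Private</body></html>')
    response = make_response(html)
    # Option 2: the equivalent X-Robots-Tag header (also works for non-HTML files).
    response.headers["X-Robots-Tag"] = "noindex"
    return response

if __name__ == "__main__":
    app.run()
```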
Mobile First Indexing and Website Migration
If you are wondering whether your website has transitioned to mobile-first indexing, you can check the crawler types listed in the crawl stats section of Google Search Console. If the smartphone Googlebot is the most frequently encountered crawler, your website has already moved to mobile-first indexing. In general, Google now crawls and indexes websites primarily from their mobile versions, with a few exceptions for sites that do not work on mobile devices.
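Outside of Search Console, a rough way to see which Googlebot fetches your site most often is to count user agents in your server access logs. The sketch below assumes a plain text log file where the user agent appears somewhere on each line; the path, log format, and the smartphone-detection heuristic are all assumptions you would adapt to your own setup.

```python
from collections import Counter

counts = Counter()
# Hypothetical access log; adjust the path and parsing to your server's log format.
with open("access.log", encoding="utf-8", errors="ignore") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        # Smartphone Googlebot identifies itself with a mobile browser string.
        if "Android" in line and "Mobile" in line:
            counts["googlebot-smartphone"] += 1
        else:
            counts["googlebot-other"] += 1

print(counts)
```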
The Impact of Query Strings in hreflang Annotations
Using query strings in hreflang annotations should not cause significant issues. However, it is important to be consistent to avoid any potential canonicalization surprises. While it is unlikely to have a major impact on ranking, keeping a consistent approach is always recommended.
The Influence of Website Form Design and Position on SEO Ranking
Forms do not have a direct impact on SEO rankings. They are treated as regular content by search engines. The design and position of forms are not factors that search engines consider when determining page rankings. Instead, focus on optimizing other aspects of your website, such as content quality, relevance, user experience, and technical performance, to improve your SEO rankings.
Design Placement and Indexing Considerations for Websites
When it comes to the design of a website, the placement of content can play a role in how search engines perceive its importance. Additionally, certain design choices, such as using overlays for forms, can impact user experience on the site. The following sections cover factors to consider when designing a website for optimal crawlability and indexability.
The Impact of Design Placement on Google
Google takes into account the placement of content on a website when determining its relevance and importance. While the visual design itself may not hold much significance, the positioning of content can signal to Google how important that content is to the site. It's important to strategically place important content where search engines, like Google, can easily recognize its significance.
Overlays and User Experience
If your website utilizes overlays for forms, it's important to be aware of their impact on user experience. While overlays can be useful for capturing information from users, they can also affect the overall user experience on your site. It's essential to strike a balance between capturing data and ensuring a seamless browsing experience for your visitors.
Crawling and Indexing Large Websites
In response to a question from John about de-indexing and aggregating millions of used products into a smaller number of unique, indexable pages, it's important to note that simply reducing the number of indexable pages does not necessarily improve a website's quality in Search. Google recommends focusing on enhancing the overall quality of the website rather than solely reducing the number of pages.
Improving crawlability and indexability for large sites depends on various factors, including how well the website can handle increased crawling. It's recommended to refer to Google's documentation on the crawl budget for large sites to understand the best practices for balancing crawlability and indexability.
Using the .ai Domain as a Generic Top-Level Domain (gTLD)
Gary responds to Danielle's question about whether a global company can use the .ai domain as a gTLD. As of early June 2023, Google treats .ai as a generic top-level domain in its search results, so global companies can confidently use a .ai domain for their online presence.
Handling 404 Errors in Search Console
John addresses a query from Gan Shayam regarding clearing out old 404 errors in Search Console. Unfortunately, there is no setting to manually clear out these errors. Search Console automatically collects and updates them over time as your website is crawled. It's important to note that 404 errors are expected and not necessarily problematic: returning a 404 for a non-existent page is correct behavior.
Requesting Indexing on Google Scholar
A user asked how to request indexing of a recent publication in a top journal on Google Scholar. The user does not have a personal website, but refers to a profile page where the journal links their work. In response, John advises searching for specific text from the article rather than relying solely on the URL to locate the indexed version. He also notes that ensuring the page is accessible to users who are not logged in is crucial.
Redirect Best Practices for Various File Types
Gary answers Philip's question about whether best practices for redirects apply universally to web pages, PDFs, videos, and images. He confirms that the same redirect techniques used for web pages can be applied to other file types, such as PDFs, videos, and images. File type does not impact the way redirects are implemented.
Implementing Hreflang Tags on Sections of a Website
John addresses Toby's query about implementing hreflang tags on specific sections of a website, such as European folders, and whether it would still be effective as an international targeting signal. John clarifies that hreflang is done on a per-page basis, and it is acceptable to use them selectively on different parts of a website. However, he advises verifying the need for hreflang implementation before investing significant time into the process. Implementing hreflang annotations through a sitemap file is also an option if editing individual pages is not feasible.
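If editing individual pages is not feasible, the same annotations can be expressed in a sitemap, as John mentions. The snippet below generates a minimal sitemap with xhtml:link alternates for a hypothetical pair of European folders; the URLs and language codes are placeholders.

```python
# Hypothetical alternates for one piece of content, keyed by hreflang code.
ALTERNATES = {
    "de-de": "https://example.com/de/page",
    "fr-fr": "https://example.com/fr/page",
}

def sitemap_url_entry(loc: str) -> str:
    """Build one <url> entry that lists every alternate, including the page itself."""
    links = "\n".join(
        f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{url}"/>'
        for lang, url in ALTERNATES.items()
    )
    return f"  <url>\n    <loc>{loc}</loc>\n{links}\n  </url>"

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
    '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
    + "\n".join(sitemap_url_entry(url) for url in ALTERNATES.values())
    + "\n</urlset>"
)
print(sitemap)
```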
In conclusion, the design placement and indexability of content play significant roles in how search engines perceive a website's value. By strategically positioning content, considering user experience, and following best practices for crawlability and indexing, website owners can optimize their online presence for improved search visibility.