- author: Google Search Central
Are you expanding your website into the EU and wondering about the subdirectory and hreflang annotations? Or perhaps you have questions about failed robots.txt files or deleting old websites from Google Search? Our experts have got you covered with their answers to these and other common SEO queries.
Using Subdirectories and Hreflang Annotations for Multilingual Websites
- You can use a subdirectory such as website.com/eu for expanding your website into the EU.
- Hreflang annotations are per page, and you can apply multiple annotations to the same page.
- For each English page, specify the list of countries it targets, or use the language code en on its own and mark the page as a generic English version.
- Back up hreflang annotations with a dynamic banner that displays for users from the wrong country to guide them to the best possible experience.
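The per-page hreflang annotations described above can be sketched as generated `<link rel="alternate">` elements. This is a hypothetical helper; the function name, URLs, and country codes are illustrative, not from the article:

```python
# Hypothetical sketch: generating hreflang link elements for a page that has
# a generic English version plus country-specific English versions served
# from subdirectories such as /eu. All names and URLs here are made up.

def hreflang_links(page_path, variants):
    """Build one <link rel="alternate"> tag per (hreflang code, base URL) pair."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{base}{page_path}" />'
        for code, base in variants
    )

links = hreflang_links(
    "/products",
    [
        ("en", "https://website.com"),        # generic English version
        ("en-GB", "https://website.com/uk"),  # English for the United Kingdom
        ("en-IE", "https://website.com/eu"),  # English served from the EU subdirectory
    ],
)
```

Each listed page variant would carry the same full set of annotations, since hreflang annotations are applied per page and reference one another.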
Checking the Quality of Indexed Posts
- Google doesn't index every single URL on the internet, as it would not be feasible.
- Use Google Search Console's inspection tool to check if the URLs are accessible to Googlebot.
- URLs that Google tends to index are those with high-quality content.
- Check out Google's documentation on content quality on developers.google.com/search.
Using Product-Specific Reviews and Ratings for Rich Results
- For products, it is okay to use product-specific third-party review and rating data for Rich Results as long as the reviews are visible somewhere on the page.
- Ensure that people can easily find and read reviews about that product and that the reviews are relevant to that product page.
- Product reviews should be about an individual product and not a category of items.
- Google's documentation on review snippets has more information on this.
Handling 404 Errors for Deleted Pages
- It may not be necessary to redirect deleted pages to the homepage.
- Consider whether it would make sense for a user to land on the new homepage instead or whether it would be confusing.
- Sometimes, a 404 is the best course of action if there's not a good replacement for the old landing page.
- If you're not sure, check out Google's documentation on content quality.
Redirecting Old Sites to New Domains
- When moving to a new domain, it's best to redirect the old site to the new one rather than deleting it to preserve the valuable signals it has collected over time.
- It may take a few weeks or months to see all your old URLs replaced by the new ones in search.
- Deleting your old site entirely can cause problems with ranking and quality assessment.
Dealing with Accidental Creation of Irrelevant URLs
- Our systems try to recognize and handle incidents where a ton of irrelevant URLs are created accidentally.
- This may temporarily increase crawling, resulting in a higher load on the server until things quiet down again.
- Having a lot of unindexed URLs is fine and won't cause any problems for search.
Setting Up Serving Conditions for SEO-Irrelevant URLs
- Cloaking is a bad idea when you have multiple serving conditions as something will eventually go wrong, causing your site to fall out of search results.
- Instead, remove URLs from search by adding a noindex robots meta tag to the specific pages.
- It's much easier and safer than setting up weird serving conditions.
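The noindex approach recommended above can take two equivalent forms, both served to all visitors with no special conditions. The values below are standard robots directives; the variable names are just for illustration:

```python
# Two equivalent ways to keep a page out of Google's index without
# cloaking or special serving conditions. The page returns the same
# 200 response to every visitor, including Googlebot.

# 1) A robots meta tag in the HTML <head>:
noindex_meta = '<meta name="robots" content="noindex" />'

# 2) The same directive as an HTTP response header, which also works
#    for non-HTML resources such as PDFs:
noindex_header = ("X-Robots-Tag", "noindex")
```

Either form tells Google to drop the page from search while leaving it accessible to users, which is why it is safer than conditional serving.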
Using Status Codes for Redirects
- Consider using 301 or 308 status codes for redirects instead of other serving conditions.
- Both codes are strong signals that the redirect should be canonical.
- Googlebot treats 308 as equivalent to 301.
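The choice between the two status codes can be sketched as a small helper. This is an illustrative function, not part of any framework; the distinction it encodes is that 308 additionally guarantees the request method is preserved on the redirected request, while Google treats both as the same strong canonical signal:

```python
# Hypothetical sketch: choosing a permanent-redirect status code.
# 301 and 308 are both permanent redirects; 308 guarantees that the
# request method (e.g. POST) is preserved, whereas clients may rewrite
# a 301'd POST into a GET.

def permanent_redirect(new_url, preserve_method=False):
    """Return (status_code, headers) for a permanent redirect response."""
    status = 308 if preserve_method else 301
    return status, [("Location", new_url)]

# Redirecting an old URL to its replacement on a new domain:
status, headers = permanent_redirect("https://new-domain.example/page")
```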
Diverse Reasons for Slow Indexing
- How much of your website Google indexes and how fast depends on how much of your site Googlebot can access and the quality of the content on your pages.
- The higher the quality, the more of your site Google might index.
- Slow indexing may also be due to issues with inaccessible or low-quality content.
- Check out Google's documentation on indexing and content quality for more information.
Troubleshooting Websites Not Appearing for Google Searches
- If your website is not appearing for Google searches, use Google Search Console to troubleshoot.
- Check if Googlebot can access your site and whether your content is of high quality.
- If you're not sure what to do, check out Google's documentation or enlist the help of SEO professionals.
Expert Advice on Google Search: Quality Content and Structured Data
As website owners strive to improve their search engine rankings, they often turn to Google for advice and guidance. In this article, we'll review some expert advice from Google's own John Mueller, Gary Illyes, and Lizzi Sassman, covering topics such as quality content, structured data, and dealing with issues like disappeared rankings and hacked sites.
Quality Content and Indexing
One of the key factors in getting your site indexed by Google is having high-quality content. According to Mueller, the better the quality of your content, the more likely it is that Google will index your site. He suggests checking out Google's documentation for more information on quality content.
Mueller also advises website owners like Melissa who are not seeing their site appear in Google searches to encourage others to mention their site and to use Google's Search Console to submit sitemap files and individual pages for indexing.
Structured Data and Rich Results
Sassman addresses a question about missing structured data snippets for "ratingValue" and other properties on pages with zero reviews. She explains that this is normal for new product pages and that having zero reviews will not hurt your SEO. She further notes that if there is no rating, Google won't show any star information.
Illyes weighs in on optimizing recipe structured data for Rich Snippets. He stresses the importance of adding required properties such as the recipe's image and name. Optional properties can be left blank, but omitting them might limit Google's ability to show additional enhancements. Illyes also notes that it's OK not to have a name and image for every step, depending on the complexity of the recipe.
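The point about required versus optional recipe properties can be illustrated with a minimal JSON-LD sketch. The schema.org type and property names (`Recipe`, `name`, `image`, `recipeInstructions`, `HowToStep`) are real; the recipe itself is invented:

```python
import json

# Hypothetical minimal Recipe structured-data sketch. "name" and "image"
# are required for the recipe rich result; per-step names and images are
# optional and are simply omitted here, as Illyes describes.
recipe_ld = {
    "@context": "https://schema.org",
    "@type": "Recipe",
    "name": "Simple Pancakes",                       # required property
    "image": ["https://example.com/pancakes.jpg"],   # required property
    "recipeInstructions": [
        # Each step carries only the optional-free "text" field.
        {"@type": "HowToStep", "text": "Mix the dry ingredients."},
        {"@type": "HowToStep", "text": "Add milk and eggs, then whisk."},
    ],
}

recipe_json = json.dumps(recipe_ld)
```

Leaving out optional properties keeps the markup valid, though richer properties could unlock additional search enhancements.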
Ranking Issues and Hacked Sites
Illyes offers guidance for website owners like Camila who experience disappeared rankings for a particular keyword. He suggests checking with remote friends to see if the issue is global, reviewing recent changes to the site, and considering whether actions like changing link structure and page layout, acquiring more links, or using the disavow tool might have impacted rankings.
Mueller addresses a webmaster who submitted a request to update a page but was denied. He explains that the cache update tool is for removing specific text, and if the text in question is no longer indexed, the tool won't work.
Finally, Illyes cautions website owners who see spam in their site's description that their site might have been hacked. He advises seeking help from resources like web.dev to clean up and secure the site.
By following this expert advice from Google, website owners can improve their content, optimize for structured data and Rich Results, and address issues that might impact their rankings or site security.