There is a technical problem that appears quite often and that causes a lot of confusion: the sitemap is well built, the server is responding correctly, Google downloads it, and yet Search Console shows an error. The pages listed in that sitemap are never crawled.
That's exactly what a Reddit user reported a few days ago. Google's John Mueller answered, and his answer says a lot more than it might seem at first glance.
A sitemap is a file that tells Google what pages exist on your site. It's basically an index: a list of URLs that you want the search engine to know about and consider for indexing.
Having one isn't mandatory, but it's useful, especially on large sites or on new sites where Google hasn't yet discovered all the pages through internal links. The sitemap doesn't guarantee that Google will index the pages listed, but it does make it easier for Google to find them.
Technically, a sitemap is an XML file that lives on your server. When properly configured, Google can download it, read the URLs, and use them as a starting point for crawling the site.
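For context, a minimal sitemap file looks roughly like this; the URLs and dates here are placeholders, not taken from the case discussed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-11-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/some-post/</loc>
    <lastmod>2025-10-20</lastmod>
  </url>
</urlset>
```

The `lastmod` field is optional, but it's the main hint Google has about whether anything in the list has actually changed since the last visit.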
The situation was as follows: the sitemap met all the technical requirements. It returned a 200 status code, it had a valid XML structure, and indexing was allowed. The server logs confirmed that Googlebot had successfully downloaded the file.
Even so, Search Console had been showing the “Couldn't fetch” error since December 2025, several months in a row. Pages submitted manually were crawled; the ones in the sitemap were not.
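For reference, this is roughly the kind of check the site owner described having already passed. A minimal sketch in Python, assuming the `requests` library is available; the sitemap URL is hypothetical:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical URL

# Fetch the sitemap the way any client would.
resp = requests.get(SITEMAP_URL, timeout=10)
print("HTTP status:", resp.status_code)  # the case discussed returned 200

# Confirm the body is well-formed XML and count the listed URLs.
root = ET.fromstring(resp.content)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print("URLs listed:", len(urls))
```

Checks like these rule out the usual technical causes, which is exactly why the persistent “Couldn't fetch” status was so confusing.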
Mueller responded with something worth reading carefully:
“One part of sitemaps is that Google has to be keen on indexing more content from the site. If Google's not convinced that there's new & important content to index, it won't use the sitemap.”
In other words: Google downloaded the sitemap, but decided not to use it. And the reason was not technical.
Mueller's answer touches on something that many teams aren't completely clear about: Google doesn't crawl and index everything it finds. It has limited crawling resources and distributes them based on its own assessment of which sites and which content deserve attention.
When Mueller says that Google needs to be “convinced that there is new and important content”, he is describing a quality filter. If Google doesn't see signs that the site's content is relevant or valuable, it won't invest resources in crawling more pages, even if the sitemap is technically perfect.
There are two variables that Mueller specifically mentions: that the content is new, and that it is important. They are different things.
Google favors sites that publish with some regularity. A site that hasn't updated its content for months, or that publishes very little, gives Googlebot less incentive to come back often. If the sitemap hasn't changed significantly in months, Google may simply have no reason to treat it as a priority.
This is the broadest part of Mueller's answer, and it's deliberately vague. “Important” can mean several things, and not all of them imply that the content is bad.
Sometimes the content is technically correct but thin: few words, little depth, little to offer the user beyond what already exists on thousands of similar sites. Google calls this “thin content” and treats it as a negative signal.
Other times the content isn't bad so much as incomplete for what the user needs. It lacks a step-by-step guide, an explanatory image, a comparison, or a concrete example. The text exists, but it doesn't really answer the user's question.
It can also be a problem of uniqueness: if the content is very similar to what already exists on other sites, or if there are duplicate pages within the same site, Google may not consider that it is worth indexing more of the same.
The sitemap in this case is a symptom, not the real problem. The real problem is that Google doesn't trust the site enough to want to index more content from it.
That has implications that go beyond the XML file. A site in that situation probably also has broader problems getting its content crawled and indexed.
The sitemap is where the symptom becomes visible, but the cause is in how Google evaluates the overall quality of the site.
The natural impulse when there are indexing problems is to submit more URLs, rework the sitemap, and force recrawls. That doesn't solve the underlying problem.
What does help is to do an honest audit of the content that is already indexed. The relevant questions are: do these pages actually solve what the user is looking for? Are they deep enough? Are there pages so similar to each other that they could be consolidated?
Mueller says it explicitly: the way to identify what to improve is to think like a visitor to the site. What does this page lack to be truly useful? Sometimes it's more text, sometimes an image, sometimes a concrete example, sometimes simplifying what's already there.
Technical SEO matters, but Google is evaluating whether content deserves to be shown to real users. Optimizing for that question is more effective than optimizing for the crawler.
If the site has many pages with little content, or pages that cover very similar topics without differentiating each other, consolidating that content usually improves how Google evaluates the site in general. Fewer higher-quality pages tend to perform better than many mediocre pages.
If the site hasn't been updated for a while, resuming a regular publishing schedule gives Google reasons to return. You don't need to publish every day, but you do need to publish often enough that the site doesn't look abandoned.
Mueller's response confirms something that experienced SEO teams already know but that isn't always obvious to larger marketing teams: Google is not a neutral system that indexes everything it finds. It's a system that makes active decisions about what deserves attention and what doesn't.
Those decisions aren't just made page by page. They are also made at the site level. A site with a good track record of useful and relevant content will find it easier to get new pages crawled and indexed. A site where Google hasn't seen clear signs of quality is going to have to earn that trust first.
The sitemap is a communication tool with Google. But like any communication tool, it works best when what you have to say is worth listening to.