Sitemap Generator FAQ
Why does the Sitemap Generator not index any URLs of my site?
Noindex set for all pages
The Sitemap Generator is aware of the robots meta element and does not list pages that are marked with the noindex attribute. I have seen websites in the wild that set the noindex attribute on every single page. Please make sure this is not the case for your website: neither the Sitemap Generator nor a search engine will index your site if the noindex attribute is set globally.
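For reference, a page excluded this way carries a robots meta element like the following in its head section (a minimal example):

```
<head>
  <!-- This page will not appear in the sitemap and will not be indexed by search engines -->
  <meta name="robots" content="noindex">
</head>
```

If this element appears on every page, remove it from the pages you want indexed.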
Site crawler blocked
Another reason for a sitemap with no URLs could be that the crawler of the Sitemap Generator is blocked by your hosting provider. I have observed this issue especially with free and very cheap hosting providers. Some block crawlers (and even regular visitors) after as few as five fast sequential requests. The issue could be fixed by whitelisting the IP address of the crawler; however, this option is usually not available on the affected hosting services. Alternatively, you can use the crawl-delay directive in your robots.txt file to set the delay between two requests.
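For example, the following robots.txt asks crawlers to wait two seconds between requests (the value is illustrative; pick one your host tolerates):

```
User-agent: *
Crawl-delay: 2
```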
Is it possible to filter the URLs which are listed in the sitemap?
The Sitemap Generator recognizes the noindex attribute if it is set on a page and respects your robots.txt file. It is thus possible to filter the results with these two mechanisms. A dedicated filter function in the plugin is not available, because in my opinion it makes no sense: if a page is not listed in an XML sitemap file, that does not mean a search engine will not find it. Sooner or later the search engine will discover and index the page. The noindex attribute and robots.txt are clean solutions that are also respected by all serious search engines.
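For example, to keep an entire directory out of the sitemap, a robots.txt rule like this is enough (the path is a placeholder):

```
User-agent: *
Disallow: /private/
```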
Which user-agent should I use in the robots.txt file?
The Sitemap Generator uses a custom user-agent group named MB-SitemapGenerator. This gives you fine-grained control over which pages are parsed and added to the sitemap. If you do not define a group for this custom user-agent, the defaults set in the * group apply.
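For example, the following robots.txt excludes a directory from the sitemap only, while all other crawlers may still visit it (the path is a placeholder):

```
# Rules that only the Sitemap Generator's crawler follows
User-agent: MB-SitemapGenerator
Disallow: /drafts/

# Defaults for all other crawlers
User-agent: *
Disallow:
```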
How are images which are not embedded in a page handled?
Images which are only linked to directly and not embedded in an HTML page are listed in the image sitemap, not as normal URLs. Sadly, there is no specification for how to handle such images, but because images need some context to be evaluated correctly in this day and age, I think the image sitemap is the best place to put them.
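For illustration, an entry in an image sitemap uses Google's image sitemap extension and looks roughly like this (the URLs are placeholders):

```
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://example.com/gallery/</loc>
    <image:image>
      <image:loc>https://example.com/photos/sunset.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```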
How are embedded images from external domains handled?
If you embed images from external domains on your website, they are listed in the image sitemap. So it is no problem if you deliver your images, for example, through a CDN service that is available under another domain. Please note that this only applies to embedded images, not to direct links to images on other domains.
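To make the distinction concrete (the domains are placeholders):

```
<!-- Embedded external image: listed in the image sitemap -->
<img src="https://cdn.example.com/images/photo.jpg" alt="Photo">

<!-- Plain link to an image on another domain: not listed -->
<a href="https://other-domain.example/images/photo.jpg">Photo</a>
```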
Does the Sitemap Generator work in my local development environment?
No. The Sitemap Generator needs to crawl your website, and the generator has no access to your local network.
The Sitemap Generator is very slow. What can I do?
In most cases this is because you have set a large value for the crawl-delay directive in your robots.txt file. Some hosting providers also add the crawl-delay directive to your robots.txt file automatically. The crawl-delay defines the time in seconds between two requests of the crawler, so lowering or removing the directive speeds up the crawl.
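For example, with the first group below the crawler waits ten seconds between requests. If you cannot remove a global crawl-delay, you can define a smaller delay just for the Sitemap Generator's user-agent, which then applies instead of the * group (the values are illustrative):

```
# Slow: ten seconds between any two requests
User-agent: *
Crawl-delay: 10

# Faster, only for the Sitemap Generator's crawler
User-agent: MB-SitemapGenerator
Crawl-delay: 1
```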