Should I Submit More Than 1 Sitemap to Google Webmaster Tools?

Hi all, I’d like to know whether it is advisable to submit multiple sitemaps to Google Webmaster Tools. And is it also advisable to submit the attachment-sitemap.xml sitemap? Thank you very much.

Hi Kojo1987, please refer to the following URL. I hope it helps: https://support.google.com/webmasters/answer/183668?hl=en

Why would you even have multiple sitemaps? A sitemap is a sort of “single source of truth”, if you will. I’ve never heard of having more than one sitemap. What is your scenario?

As far as I know, it’s something you only need to do for a very large site.

Yes, sitemaps are limited to 50,000 URLs or a file size of 10MB. If your sitemap is bigger than that, or likely to get bigger, use multiple sitemaps.
If you do use multiple sitemaps, then also make a ‘Sitemap Index File’. That’s like a sitemap for your sitemaps: it lists all your sitemaps in one file.
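For reference, a sitemap index file uses the same XML conventions as a regular sitemap, but with `<sitemapindex>` and `<sitemap>` elements instead of `<urlset>` and `<url>`. A minimal sketch, with made-up filenames and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <sitemap> entry per child sitemap file -->
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2014-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2014-06-01</lastmod>
  </sitemap>
</sitemapindex>
```

You then only need to submit the index file in Webmaster Tools; Google discovers the child sitemaps from it.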

Yes. If you have already uploaded a sitemap from your server, it shows a message asking whether to overwrite it. Overwrite it and submit the new sitemap, and Google will automatically use the new one.

I asked some SEO communities before about adding a second or updated sitemap and was told that it was unnecessary once Google has initially indexed the site.

joseph

Google gives guidelines for how to submit an updated sitemap, but I imagine this is only useful to do if you’ve made major changes to the site. Most simple additions should be picked up when Google next crawls the site (unless there are problems with your navigation).

Updating sitemaps is a separate issue: you don’t submit an updated sitemap in addition to an existing one, you replace the existing one.
I think keeping sitemaps up to date is probably a good idea if you’re using ‘lastmod’ tags, so Google knows whether a page has changed before crawling it and can prioritise crawling updated pages. I don’t know this for certain, I’m just assuming that’s how it works.
Yes, if the site is indexed, Google will already know your pages exist, but if the content has changed significantly you want it re-crawled to keep it within relevant search results. Obviously if you add pages, then update the sitemap.
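To illustrate the ‘lastmod’ idea above: each `<url>` entry in a sitemap can carry a `<lastmod>` date, which is what a crawler could use to decide whether a page is worth re-fetching. A made-up example entry (URL and date are placeholders):

```xml
<url>
  <loc>https://www.example.com/blog/some-post/</loc>
  <!-- date the page content last changed, in W3C YYYY-MM-DD format -->
  <lastmod>2014-05-20</lastmod>
</url>
```

So if you update a page’s content, updating its `<lastmod>` in the sitemap is the signal that it has changed.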

If you want Google to crawl updated pages quickly, you can use “Fetch as Googlebot” to achieve that.