Fabrice Canel, the Principal Program Manager of the Bing Index Generation team, posted their Sitemaps best practices guide for large web sites.
Bing says it can support up to 125 trillion links through multiple XML sitemap files. A single sitemap file can list up to 50,000 URLs, and a sitemap index file can reference up to 50,000 sitemap files, which brings you to 2,500,000,000 links (2.5 billion). If you need more URLs, Bing allows a second level of sitemap index files (an index of sitemap index files), which is how you get to the 125 trillion number.
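For reference, a sitemap index file is just a small XML file pointing at the individual sitemap files. The sketch below follows the standard sitemaps.org index format; the file names and URLs are illustrative placeholders, not taken from Bing's post:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative sitemap index: each <sitemap> entry points to one sitemap
     file of up to 50,000 URLs; an index can list up to 50,000 sitemaps. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products-001.xml</loc>
    <lastmod>2021-11-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-products-002.xml</loc>
    <lastmod>2021-11-01</lastmod>
  </sitemap>
</sitemapindex>
```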
Bing, however, recommends against listing that many URLs. It will rarely index them all, so list only the URLs that are important to the site.
For really large sites, the total size of the sitemap XML files can exceed 100GB, so Bing recommends taking things slow:
To mitigate these issues, a best practice to help ensure that search engines discover all the links of your very large web site is that you manage two sets of sitemaps files: update sitemap set A on day one, update sitemap set B on day two, and continue iterating between A and B. Use a sitemap index file to link to Sitemaps A and Sitemaps B or have 2 sitemap index files one for A and one for B. This method will give enough time (24 hours) for search engines to download a set of sitemaps not modified and so will help ensure that search engines have discovered all your site's URLs in the past 24 to 48 hours.
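As a rough sketch of the two-index option (the file names and URLs here are hypothetical, not from Bing's post), the site keeps two sitemap index files and only regenerates one set on any given day:

```xml
<!-- Hypothetical layout: regenerate set A (and its index) on day one,
     set B on day two, and keep alternating. The set not being rewritten
     stays stable for 24 hours so search engines can finish fetching it. -->

<!-- https://www.example.com/sitemap-index-a.xml (updated on odd days) -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/a/sitemap-001.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/a/sitemap-002.xml</loc></sitemap>
</sitemapindex>

<!-- https://www.example.com/sitemap-index-b.xml (updated on even days) -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/b/sitemap-001.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/b/sitemap-002.xml</loc></sitemap>
</sitemapindex>
```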
Check out the post for all the details, but keep in mind that Google's best practices may differ on some of these points, and it isn't always recommended to create search-engine-specific sitemap files.
Forum discussion at WebmasterWorld.