Bot-intended sitemaps are files created to tell search engines, like Google and Bing, about the pages on your website and how they are organized. Web crawlers, like Googlebot, read these files to crawl your site more thoroughly. These sitemaps can also contain valuable metadata associated with your site’s pages.
Metadata lets the bot know things like when a page was last updated, how often it changes, and how important it is relative to other pages on your site. Metadata can also describe specific types of content on your pages, including mobile views, images, and video.
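A minimal sitemap in the sitemaps.org XML format shows how this metadata is attached to each page. The domain, dates, and values below are placeholders, not real recommendations:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required tag; the rest are the optional metadata -->
    <loc>https://www.example.com/</loc>
    <lastmod>2023-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2022-11-03</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Here `lastmod` is the last-updated date, `changefreq` hints how often the page changes, and `priority` (0.0 to 1.0) ranks the page relative to other pages on the same site.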
These sitemaps are intended to be submitted to the search engines through their webmaster tools or your analytics account.
These sitemaps are limited to 10MB and 50,000 URLs. If your site is larger than this, you can break your sitemap up into smaller files and include a sitemap index file when you submit the sitemap to search engines.
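A sitemap index file is itself a small XML file that simply lists the locations of your individual sitemap files. A sketch, again with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <!-- each entry points at one of the smaller sitemap files -->
    <loc>https://www.example.com/sitemap-pages.xml</loc>
    <lastmod>2023-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit the index file, and the crawler discovers each child sitemap from it.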
Creating XML sitemap files that follow the sitemaps.org protocol is the easiest way to build and submit your sitemap. Within WordPress, plugins like Yoast SEO will automatically create these files for you.
RSS, mRSS, and Atom 1.0
If you run a blog with an Atom or RSS (Really Simple Syndication) feed, you can submit your feed’s URL in place of a sitemap. You can also submit an mRSS (media RSS) feed for video content on your site.
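A feed works as a sitemap substitute because each entry carries a page URL and publish date the crawler can read. A bare-bones RSS 2.0 feed, with hypothetical titles and URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://www.example.com/</link>
    <description>A hypothetical blog feed</description>
    <item>
      <!-- the crawler picks up the link and pubDate from each item -->
      <title>Example Post</title>
      <link>https://www.example.com/example-post/</link>
      <pubDate>Sun, 15 Jan 2023 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```

Note that a feed typically lists only recent posts, so it is a lighter-weight option than a full sitemap.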
If you aren’t sure how to create XML files or RSS feeds, Google also accepts a plain text file that lists every URL on your site, one URL per line, with nothing else in the file.
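The text file option is as simple as it sounds. With placeholder URLs, the entire file is just:

```text
https://www.example.com/
https://www.example.com/about/
https://www.example.com/contact/
```

No headers, comments, or metadata — only one full URL per line.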
Find out more about building a sitemap from Google.