An XML sitemap is a file that lists all the important URLs on your website in a structured format, helping search engines discover and prioritise your pages for crawling. Most websites benefit from having one — especially sites with more than 50 pages, new sites with few inbound links, or any site where fresh content needs to be indexed quickly.
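The structured format is a small XML document. A minimal sketch, using the standard sitemaps.org namespace and a hypothetical example.com site with two pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/first-post</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry needs only a `<loc>`; the other tags are optional.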
Include your most important, publicly accessible, indexable pages: homepage, service pages, product pages, blog posts, location pages and category pages. Exclude: pages blocked by robots.txt, pages with noindex tags, duplicate pages, login/account pages, 404 error pages and redirect URLs. Quality over quantity — a leaner sitemap with strong pages is better than a large one with thin content.
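If you generate your sitemap programmatically, the exclusions above can be applied as a simple filter. A minimal sketch in Python, assuming a hypothetical crawl result where each page's HTTP status, noindex flag, and canonical URL are already known:

```python
# Sketch: filter a candidate URL list down to sitemap-worthy pages.
# The page dicts below are hypothetical crawl output, not a real crawler's API.

def sitemap_urls(pages):
    """Keep only pages that return 200, are indexable, and are canonical."""
    keep = []
    for page in pages:
        if page["status"] != 200:             # drop 404s and 301/302 redirects
            continue
        if page["noindex"]:                   # drop pages with a noindex tag
            continue
        if page["canonical"] != page["url"]:  # drop duplicates pointing elsewhere
            continue
        keep.append(page["url"])
    return keep

pages = [
    {"url": "https://example.com/", "status": 200, "noindex": False,
     "canonical": "https://example.com/"},
    {"url": "https://example.com/login", "status": 200, "noindex": True,
     "canonical": "https://example.com/login"},
    {"url": "https://example.com/old-page", "status": 301, "noindex": False,
     "canonical": "https://example.com/new-page"},
]
print(sitemap_urls(pages))  # only the homepage survives
```

Pages blocked by robots.txt would need a separate check against your robots rules, since a crawler may never fetch them at all.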
Changefreq hints to crawlers how often content changes. Priority (0.0–1.0) signals a page's relative importance within your site. Important caveat: Google largely ignores both of these attributes. Lastmod (the last modified date) is the attribute Google pays most attention to — keep it accurate and update it only when content genuinely changes.
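A single `<url>` entry carrying all three attributes looks like this (the URL and values are illustrative):

```xml
<url>
  <loc>https://example.com/blog/first-post</loc>
  <lastmod>2024-01-15</lastmod>
  <changefreq>monthly</changefreq>
  <priority>0.8</priority>
</url>
```

Given Google's treatment of these fields, an entry with just `<loc>` and an accurate `<lastmod>` is usually enough.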
Submit via Google Search Console under Sitemaps — paste your sitemap URL (e.g. https://example.com/sitemap.xml) and click Submit. You can also reference your sitemap in robots.txt by adding 'Sitemap: https://example.com/sitemap.xml' — Google discovers this automatically.
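The robots.txt reference is a single line, placed anywhere in the file. A typical file for the hypothetical example.com might read:

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The Sitemap directive must use the full absolute URL, and you can list more than one if your site splits its sitemap into several files.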