HTTP response headers are pieces of metadata sent by a web server alongside the content of a page. Together with the status line, they tell browsers and bots critical information: the status code, content type, caching rules, security policies and more. For SEO, the most important signals are X-Robots-Tag (can override on-page directives for indexing), Cache-Control (affects how browsers and CDNs cache pages) and the status code itself (200, 301, 404, etc.).
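As a sketch of what an audit looks at, the helper below pulls the SEO-relevant fields out of an already-fetched response. `inspect_seo_headers` is a hypothetical name for illustration, not a function from any real tool; header names are normalised to lowercase because HTTP header names are case-insensitive.

```python
from typing import Optional

def inspect_seo_headers(status: int, headers: dict[str, str]) -> dict[str, Optional[str]]:
    """Return the response fields SEO audits care about most.

    Header names are case-insensitive, so normalise before lookup.
    """
    lower = {k.lower(): v for k, v in headers.items()}
    return {
        "status": str(status),                      # 200, 301, 404, ...
        "content-type": lower.get("content-type"),
        "x-robots-tag": lower.get("x-robots-tag"),  # can block indexing
        "cache-control": lower.get("cache-control"),
    }

report = inspect_seo_headers(200, {
    "Content-Type": "text/html; charset=utf-8",
    "X-Robots-Tag": "noindex",
    "Cache-Control": "max-age=3600",
})
# report["x-robots-tag"] is "noindex" — this page would be kept out of the index.
```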
A score of 70–100 means your site has most critical security headers in place. Strict-Transport-Security (HSTS) and Content-Security-Policy (CSP) are the most important — HSTS prevents protocol downgrade attacks and CSP prevents cross-site scripting (XSS). Missing security headers don't directly hurt SEO rankings, but they affect user trust and security compliance.
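A score like this is just a weighted presence check. The sketch below shows the idea; the weights here are invented for the example and do not reflect any particular tool's rubric.

```python
# Illustrative weights only — not any real tool's scoring rubric.
SECURITY_HEADER_WEIGHTS = {
    "strict-transport-security": 30,  # HSTS: blocks protocol downgrade attacks
    "content-security-policy": 30,    # CSP: mitigates cross-site scripting
    "x-content-type-options": 15,     # "nosniff": stops MIME-type sniffing
    "x-frame-options": 15,            # clickjacking protection
    "referrer-policy": 10,
}

def security_score(headers: dict[str, str]) -> int:
    """Sum the weights of every recognised security header that is present."""
    present = {k.lower() for k in headers}
    return sum(w for name, w in SECURITY_HEADER_WEIGHTS.items() if name in present)

score = security_score({
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "Content-Security-Policy": "default-src 'self'",
    "X-Content-Type-Options": "nosniff",
})
# score == 75 with these example weights: HSTS and CSP alone carry 60 points.
```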
The X-Robots-Tag header works like the meta robots tag but applies to any file type — including PDFs, images and other non-HTML files. It can carry directives like noindex, nofollow, noarchive and nosnippet. When a page sends both an X-Robots-Tag header and a meta robots tag, Google applies the most restrictive combination of the two. This header is particularly useful for controlling indexation of PDFs and other document types that can't include HTML meta tags.
Cache-control headers tell browsers and CDNs how long to store a resource before requesting a fresh copy. For SEO, well-configured caching improves page speed — a confirmed Google ranking factor. Setting appropriate max-age values for static assets reduces load times for returning visitors. Misconfigured caching can cause slow load times and increased server load, both of which harm user experience metrics.