Robots.txt Check for redcacti.com

This report verifies whether robots.txt is accessible, whether it includes sitemap directives, and whether any crawl rules could block important sections from routine discovery.

Checked 4/2/2026

What this robots.txt report means

This robots.txt report for redcacti.com shows whether the file is reachable, whether it declares a sitemap, and whether any crawl rules look likely to reduce search discovery. If the file blocks important sections or omits key crawl signals, search engines may have a harder time finding and revisiting the right pages.

The most important metrics here are warning count, sitemap directives, user-agent groups, and the actual Allow or Disallow rules being served. Those tell you whether the current file is acting as a clean crawl-control layer or introducing unnecessary crawl friction.
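As a sketch of how these headline counts can be derived, the following Python parses a robots.txt body directly. The `robots_metrics` helper and its simplifications (comments stripped, consecutive `User-agent` lines folded into one group) are illustrative assumptions, not the tool's actual implementation.

```python
def robots_metrics(text: str) -> dict:
    """Count user-agent groups, sitemap directives, and non-empty
    Allow/Disallow rules in a robots.txt body.
    Simplified: '#' comments are stripped, and consecutive
    User-agent lines are treated as one group."""
    counts = {"user_agent_groups": 0, "sitemaps": 0, "allow": 0, "disallow": 0}
    previous_was_agent = False
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if ":" not in line:
            previous_was_agent = False
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not previous_was_agent:
                counts["user_agent_groups"] += 1
            previous_was_agent = True
        else:
            previous_was_agent = False
            if field == "sitemap":
                counts["sitemaps"] += 1
            elif field == "allow" and value:
                counts["allow"] += 1
            elif field == "disallow" and value:
                counts["disallow"] += 1
    return counts
```

Running this over the rules shown later in this report yields 2 user-agent groups, 1 sitemap directive, 2 Allow rules, and 5 Disallow rules, matching the metrics above.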

  • Warnings: 0
  • User-agent Groups: 2
  • Sitemap Directives: 1
  • Disallow Rules: 5

Summary

  • Robots URL: https://redcacti.com/robots.txt
  • HTTP status: 200
  • Allow rules: 2
  • Disallow rules: 5

Warnings

No high-risk issues were flagged in the robots.txt file.

Rules and Sitemaps

Sitemap directives

  • https://redcacti.com/sitemap-index.xml
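A sitemap declaration is just a `Sitemap:` line, valid anywhere in the file and not tied to any user-agent group. A minimal extraction sketch (the helper name is illustrative):

```python
def sitemap_directives(text: str) -> list:
    """Collect URLs declared by Sitemap: lines (case-insensitive,
    '#' comments stripped). Sitemap lines apply file-wide."""
    urls = []
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if line.lower().startswith("sitemap:"):
            # Split only at the first colon so the URL's own "://" survives.
            urls.append(line.split(":", 1)[1].strip())
    return urls
```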

Sample rules

Directive   User-agent    Value
Allow       *             /
Disallow    *             /app/
Disallow    *             /auth/
Disallow    *             /dashboard/
Disallow    *             /api/
Disallow    *             /_assets/
Allow       redCactiBot   /
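How these rules resolve for a given path can be checked with a longest-match evaluator in the spirit of RFC 9309: the most specific matching pattern wins, and on a tie Allow wins. This sketch omits wildcard patterns (`*`, `$`), and the example paths are hypothetical:

```python
def is_allowed(rules, path: str) -> bool:
    """Evaluate (directive, pattern) pairs for one user-agent group.
    Longest matching pattern wins; ties favor Allow; no match means
    allowed. Wildcards (* and $) are not handled in this sketch."""
    best_len, allowed = -1, True
    for directive, pattern in rules:
        if not path.startswith(pattern):
            continue
        if len(pattern) > best_len:
            best_len, allowed = len(pattern), directive == "allow"
        elif len(pattern) == best_len and directive == "allow":
            allowed = True
    return allowed

# The User-agent: * group from the table above:
star_group = [
    ("allow", "/"),
    ("disallow", "/app/"),
    ("disallow", "/auth/"),
    ("disallow", "/dashboard/"),
    ("disallow", "/api/"),
    ("disallow", "/_assets/"),
]

# A public page (hypothetical path) stays crawlable; /app/ does not:
# is_allowed(star_group, "/pricing")       -> True
# is_allowed(star_group, "/app/settings")  -> False
```

Note that `Allow: /` followed by more specific `Disallow` rules, as served here, is the standard pattern: the broad Allow only wins where no longer Disallow pattern matches.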

What to fix first

1. Remove high-risk crawl blocks

If the User-agent: * group disallows the whole site (Disallow: /) or important sections, fix that before anything else.
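A quick way to spot such blocks is to scan the `User-agent: *` group for Disallow patterns that cover the root or prefixes you care about. The `high_risk_blocks` name and the `important` prefixes are illustrative, and the sketch ignores the rule that a blank line or shared `User-agent` lines can reshape groups:

```python
def high_risk_blocks(text: str, important=("/",)) -> list:
    """Return Disallow patterns in the User-agent: * group that are a
    prefix of any 'important' path. Simplified group tracking."""
    risky, in_star = [], False
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            in_star = (value == "*")
        elif field == "disallow" and in_star and value:
            if any(p.startswith(value) for p in important):
                risky.append(value)
    return risky
```

For this site's file, with important sections like a hypothetical `/products/`, the check comes back empty, which matches the zero warnings above.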

2. Add or correct the Sitemap directive

If no sitemap is declared, add the canonical sitemap URL to improve crawl-path clarity.
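Adding the directive can be as simple as appending one line when none exists; a sketch (the function name is illustrative, and the URL passed in should be the site's canonical, absolute sitemap URL):

```python
def ensure_sitemap(text: str, sitemap_url: str) -> str:
    """Append a Sitemap directive to a robots.txt body if none is
    present; return the body unchanged if one already exists."""
    has_sitemap = any(
        line.split("#", 1)[0].strip().lower().startswith("sitemap:")
        for line in text.splitlines()
    )
    if has_sitemap:
        return text
    return text.rstrip("\n") + f"\n\nSitemap: {sitemap_url}\n"
```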

3. Re-run a broader audit

After fixing robots.txt, run a sitemap audit to confirm the sitemap's URLs are actually crawlable and that earlier blocking has not left broken crawl paths or missing-metadata issues elsewhere.
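One such cross-check flags URLs that the sitemap advertises but robots.txt blocks, since that is a contradictory crawl signal. A sketch using the standard-library XML parser; the sample URLs in the test are hypothetical, and the same loop works for both `<urlset>` and `<sitemapindex>` documents because it matches any namespaced `<loc>` element:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def blocked_sitemap_urls(sitemap_xml: str, disallow_prefixes) -> list:
    """Return sitemap URLs whose path falls under any Disallow prefix.
    Prefix matching only; wildcards are not handled in this sketch."""
    blocked = []
    root = ET.fromstring(sitemap_xml)
    for loc in root.iter(SITEMAP_NS + "loc"):
        url = loc.text.strip()
        path = urlparse(url).path or "/"
        if any(path.startswith(p) for p in disallow_prefixes):
            blocked.append(url)
    return blocked
```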