What this robots.txt report means
This robots.txt report for redcacti.com shows whether the file is reachable, whether it declares a sitemap, and whether any crawl rules look likely to reduce search discovery. If the file blocks important sections or omits key crawl signals, search engines may have a harder time finding and revisiting the right pages.
The most important metrics here are warning count, sitemap directives, user-agent groups, and the actual Allow or Disallow rules being served. Those tell you whether the current file is acting as a clean crawl-control layer or introducing unnecessary crawl friction.
Summary
Warnings
No high-risk issues were flagged in the robots.txt file.
Rules and Sitemaps
Sitemap directives
- https://redcacti.com/sitemap-index.xml
Sample rules
| Directive | User-agent | Value |
|---|---|---|
| Allow | * | / |
| Disallow | * | /app/ |
| Disallow | * | /auth/ |
| Disallow | * | /dashboard/ |
| Disallow | * | /api/ |
| Disallow | * | /_assets/ |
| Allow | redCactiBot | / |
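One subtlety in the sampled rules: the * group pairs a blanket Allow: / with more specific Disallows. Under the longest-match semantics of RFC 9309 (which major crawlers follow), the most specific matching pattern wins, so /app/ and the other listed paths stay blocked despite Allow: /. A minimal Python sketch of that resolution, covering only the plain path prefixes used here (no * or $ wildcards):

```python
# The * group sampled above, in (directive, path-prefix) form.
RULES = [
    ("Allow", "/"),
    ("Disallow", "/app/"),
    ("Disallow", "/auth/"),
    ("Disallow", "/dashboard/"),
    ("Disallow", "/api/"),
    ("Disallow", "/_assets/"),
]

def longest_match_allowed(path: str, rules=RULES) -> bool:
    """Resolve Allow/Disallow by longest matching prefix (RFC 9309).

    Ties go to Allow; a path matched by no rule is allowed.
    Wildcards (* and $) are not handled -- the sampled rules
    use plain prefixes only.
    """
    best_len, allowed = -1, True
    for directive, prefix in rules:
        if path.startswith(prefix) and len(prefix) >= best_len:
            is_allow = directive == "Allow"
            if len(prefix) > best_len or is_allow:
                best_len, allowed = len(prefix), is_allow
    return allowed

print(longest_match_allowed("/"))           # True
print(longest_match_allowed("/app/login"))  # False
```

Note that Python's built-in urllib.robotparser applies rules in file order (first match wins), so on a file like this one it can disagree with longest-match crawlers; keep that in mind if you script checks with it.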
What to fix first
1. Remove high-risk crawl blocks
If the User-agent: * group is disallowed from the whole site (Disallow: /) or from important public sections, fix that before anything else.
2. Add or correct the Sitemap directive
If no sitemap is declared, add the canonical sitemap URL so crawlers can discover the full set of indexable pages directly from robots.txt.
3. Re-run a broader audit
After fixing robots.txt, run a sitemap audit to check whether blocked discovery has already caused broken crawl paths or missing metadata elsewhere.
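Step 2 is easy to verify by script. A minimal sketch that extracts Sitemap directives from a robots.txt body with a plain line scan (not a full RFC 9309 parser), so you can confirm the canonical sitemap is declared before re-running the audit:

```python
def sitemap_directives(robots_txt: str) -> list[str]:
    """Return the URLs declared by Sitemap: lines in a robots.txt body."""
    urls = []
    for raw in robots_txt.splitlines():
        line = raw.strip()
        # Sitemap lines are case-insensitive and may appear in any group.
        if line.lower().startswith("sitemap:"):
            urls.append(line.split(":", 1)[1].strip())
    return urls

# Example against the directives sampled in this report:
body = """\
User-agent: *
Allow: /
Sitemap: https://redcacti.com/sitemap-index.xml
"""
print(sitemap_directives(body))  # ['https://redcacti.com/sitemap-index.xml']
```

An empty result from the live file means the Sitemap directive is missing and should be added.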