# robots.txt – Website Reactor
# https://websitereactor.com/robots.txt
# Last updated: 2026-02-23

# ── General crawlers ─────────────────────────────────────
User-agent: *
Allow: /
Disallow: /private/
Disallow: /logs/
Disallow: /assets/js/
Disallow: /contact/?*
Crawl-delay: 2

# ── Search engines (no crawl delay needed) ───────────────
User-agent: Googlebot
Allow: /
Disallow: /private/
Disallow: /logs/

User-agent: Bingbot
Allow: /
Disallow: /private/
Disallow: /logs/

# ── AI / LLM crawlers (welcome – content is intentionally AI-readable) ──
User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Claude-Web
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: CCBot
Allow: /

# ── Known scrapers / spam bots ───────────────────────────
User-agent: AhrefsBot
Disallow: /

User-agent: SemrushBot
Disallow: /

User-agent: DotBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: BLEXBot
Disallow: /

# ── Sitemaps ─────────────────────────────────────────────
Sitemap: https://websitereactor.com/sitemap.xml

# ── Additional machine-readable context ──────────────────
# See /llms.txt for a structured summary intended for large language models.
# See /sitemap.xml for a full list of crawlable URLs.