# Default: allow all crawlers
User-agent: *
Allow: /

# Crawl delay for specific bots
User-agent: Claude
User-agent: Anthropic-AI
User-agent: AnthropicBot
Crawl-delay: 1

User-agent: ChatGPT
User-agent: GPTBot
User-agent: OpenAIBot
Crawl-delay: 1

User-agent: BingBot
User-agent: Bingbot
User-agent: msnbot
Crawl-delay: 1

# Block SEO monitoring tools
User-agent: AhrefsBot
User-agent: MJ12bot
User-agent: SemrushBot
User-agent: SemrushBot-SA
User-agent: SEOkicks
User-agent: SEOkicks-Robot
User-agent: Sistrix
User-agent: SistrixCrawler
User-agent: cognitiveSEO
User-agent: Screaming Frog SEO Spider
User-agent: rogerbot
User-agent: dotbot
User-agent: spbot
User-agent: MauiBot
User-agent: SEOlyticsCrawler
User-agent: Seekport
User-agent: SearchmetricsBot
Disallow: /

# Block specific bots
User-agent: TurnitinBot
User-agent: TurnitinBot/3.0
User-agent: Heritrix
User-agent: pimonster
User-agent: Pimonster
User-agent: ECCP/1.0 (search@eniro.com)
User-agent: Yandex
User-agent: Baiduspider
User-agent: Baiduspider-video
User-agent: Baiduspider-image
User-agent: Sogou Spider
User-agent: YoudaoBot
User-agent: Ezooms Robot
User-agent: Perl LWP
User-agent: BLEXBot
User-agent: netEstate NE Crawler (+http://www.website-datenbank.de/)
User-agent: WiseGuys Robot
User-agent: MegaIndex.ru
User-agent: gsa-crawler (Enterprise; T4-KNHH62CDKC2W3; gsa_manage@nikon-sys.co.jp)
Disallow: /

# Block specific paths for all crawlers
User-agent: *
Disallow: /api/
Disallow: /private/
Disallow: /temp/
Disallow: /drafts/

Sitemap: https://dadalo.pl/sitemap.xml
Sitemap: https://dadalo.pl/index_pl.xml