I usually open a site's robots.txt directly when I want to make fetch requests against that origin from the devtools console without running into CORS or CSP issues. It's a plain static file, typically served without a CSP header and with no client-side code running, so nothing interferes with same-origin requests. If you're hunting for vulnerabilities it's also worth probing (especially with crawler user agents, since some servers vary the response by UA): Disallow rules can leak hidden endpoints or framework-specific paths that devs didn't expect anyone to notice. Sketches of both tricks follow below.
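For instance, a minimal sketch of the console trick, run while the tab is sitting on the robots.txt page (the host and endpoint here are placeholders, not anything specific):

```ts
// Run from the devtools console while a tab is showing
// https://example.com/robots.txt (host and path are hypothetical).
// The request is same-origin, so there's no CORS preflight, and the
// static robots.txt response usually has no CSP restricting connect-src.
const res = await fetch("/api/internal/health", { credentials: "include" });
console.log(res.status, await res.text());
```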
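And a rough sketch of the probing idea, assuming Node 18+ for the global `fetch` (the target URL is a placeholder; the UA string is Googlebot's published one). It pulls robots.txt with a crawler User-Agent and collects the Disallow paths as a starting list for manual inspection:

```ts
// Rough sketch: fetch robots.txt with a crawler-style User-Agent
// (some servers serve different content to bots) and collect the
// Disallow paths worth a closer manual look.
// The target URL below is a placeholder, not a real target.
const res = await fetch("https://example.com/robots.txt", {
  headers: {
    "User-Agent":
      "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
  },
});
const body = await res.text();
const disallowed = body
  .split("\n")
  .map((line) => /^\s*Disallow:\s*(\S+)/i.exec(line)?.[1])
  .filter((p): p is string => p !== undefined);
console.log(disallowed);
```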