Robots.txt Basics for New Sites
A short introduction to crawl rules, disallow paths, and sitemap references for small sites.
Start simple
For most new sites, the best robots.txt file is a small, understandable one. You usually need a broad user-agent rule, a few targeted disallow paths, and a sitemap reference.
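A minimal file covering those three pieces might look like this (the paths and sitemap URL are placeholders, not recommendations for any particular site):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

The `User-agent: *` group applies to all crawlers, each `Disallow` line blocks one path prefix, and everything not disallowed is crawlable by default.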
Do not overcomplicate crawler control early
A generator is most useful when it helps you create a clean starting file, not when it tries to simulate every possible crawler policy edge case.
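Before publishing a generated file, it helps to sanity-check that it allows and blocks what you expect. One way is Python's standard-library `urllib.robotparser`; the file contents and URLs below are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check which URLs a generic crawler ("*") may fetch
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/users"))  # False

# List sitemap references found in the file (Python 3.8+)
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```

A check like this catches the most common generator mistakes, such as a disallow path that accidentally blocks the whole site.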