machina-policy supports the following basic elements of robots.txt files:
- Allow: lines
- Disallow: lines
- URL globbing (Googlebot-style: * is a wildcard and $ is a terminating anchor; see the sketch after this list)
- Crawl-delay (actually obeying crawl-delay is up to you)
- Defaulting to User-agent: * if the specified user-agent is not found
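
For reference, a small robots.txt exercising all of these elements might look like the following (the user-agent name and paths are made up for illustration):

```
User-agent: somebot
Disallow: /private*
Allow: /private/public-report.html
Crawl-delay: 10

User-agent: *
Disallow: /*.cgi$
```

The globbing rules are the same ones Googlebot documents: * matches any run of characters, and a trailing $ anchors the pattern to the end of the path. machina-policy itself is a Common Lisp library, so the snippet below is not its API; it is only a minimal sketch of how those two metacharacters behave, with hypothetical helper names (glob_to_regex, path_matches):

```python
import re

def glob_to_regex(pattern: str) -> str:
    """Translate a robots.txt glob (Googlebot-style) into a regex.

    '*' matches any run of characters; a trailing '$' anchors the
    pattern to the end of the path.  Everything else is literal.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn each '*' back into '.*'
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return "^" + body + ("$" if anchored else "")

def path_matches(pattern: str, path: str) -> bool:
    """True if PATH is covered by the Allow:/Disallow: PATTERN."""
    return re.match(glob_to_regex(pattern), path) is not None

# The globbing rules from the list above:
assert path_matches("/private*", "/private/index.html")    # '*' wildcard
assert path_matches("/*.gif$", "/images/cat.gif")           # terminating anchor
assert not path_matches("/*.gif$", "/images/cat.gif?x=1")   # '$' forbids a suffix
```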
Dependencies: URI. License: see MIT-LICENSE.