Download the server's robots.txt, and return true if we are allowed to access the URL, false otherwise
Return the value of the Crawl-Delay directive, or nil if none
Sleep for the amount of time necessary to obey the Crawl-Delay specified by the server
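The three behaviours above (permission check, Crawl-Delay lookup, and delay enforcement) fit naturally into one small helper. Below is a minimal Ruby sketch under assumed names (`RobotsPolicy`, `allowed?`, `crawl_delay`, `obey_crawl_delay`); these are illustrative, not the library's actual API, and the robots.txt parsing is deliberately naive (only `User-agent`, `Disallow`, and `Crawl-delay` lines are read).

```ruby
require "net/http"
require "uri"

class RobotsPolicy
  def initialize(base_url, user_agent: "*")
    @uri = URI(base_url)
    @user_agent = user_agent
    @last_request_at = nil
    @rules = fetch_rules # { disallow: [...], crawl_delay: Float or nil }
  end

  # True if robots.txt does not disallow the given path for our user agent.
  def allowed?(path)
    @rules[:disallow].none? { |prefix| path.start_with?(prefix) }
  end

  # Value of the Crawl-Delay directive in seconds, or nil if none was given.
  def crawl_delay
    @rules[:crawl_delay]
  end

  # Sleep just long enough since the last request to honour Crawl-Delay, if any.
  def obey_crawl_delay
    delay = crawl_delay
    return unless delay
    elapsed = Time.now - (@last_request_at || Time.at(0))
    sleep(delay - elapsed) if elapsed < delay
    @last_request_at = Time.now
  end

  private

  # Naive parse: keep only the record that applies to our user agent (or "*").
  def fetch_rules
    body = Net::HTTP.get(URI("#{@uri.scheme}://#{@uri.host}/robots.txt"))
    rules = { disallow: [], crawl_delay: nil }
    applies = false
    body.each_line do |line|
      key, _, value = line.strip.partition(":")
      value = value.strip
      case key.downcase
      when "user-agent"
        applies = (value == "*" || value == @user_agent)
      when "disallow"
        rules[:disallow] << value if applies && !value.empty?
      when "crawl-delay"
        rules[:crawl_delay] = value.to_f if applies
      end
    end
    rules
  rescue StandardError
    { disallow: [], crawl_delay: nil } # if robots.txt cannot be fetched, allow everything
  end
end
```

A typical call sequence would check permission first, then wait before fetching:

```ruby
policy = RobotsPolicy.new("https://example.com", user_agent: "MyCrawler")
if policy.allowed?("/some/page")
  policy.obey_crawl_delay
  # ... perform the request ...
end
```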