Lots of variables. But I had it throttled down to what I thought was a reasonable number of likely page requests within a given period. For whatever reason, it was apparently too limiting in some cases. To help you understand why: there are people and groups that crawl the site to steal content, and others that mount DoS (denial-of-service) attacks that either slow things to a crawl or overload the server until the site becomes inaccessible. And then of course there are malicious actors probing for every weakness to get at the file structure or the databases. The worst case I've had to deal with was an apparent breach of the actual host server, which allowed malware to be injected. That's the one that kept me up two days straight; I also took the site offline to protect everyone. Anyway, TMI, but good to be aware of. It takes a lot to properly run a web site.
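For the curious, the kind of throttling described above might be sketched like this — a simple fixed-window rate limiter that caps requests per client IP. The cap and window size here are made-up illustration numbers, not the site's actual settings, and a real site would typically do this in the web server or a plugin rather than hand-rolled code:

```python
import time
from collections import defaultdict

class RateLimiter:
    """Fixed-window rate limiter: allow at most max_requests
    per client IP within each window_seconds window."""

    def __init__(self, max_requests=30, window_seconds=60):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        # ip -> [request_count, window_start_time]
        self.counts = defaultdict(lambda: [0, 0.0])

    def allow(self, ip, now=None):
        now = time.time() if now is None else now
        count, start = self.counts[ip]
        if now - start >= self.window_seconds:
            # Window expired: start a fresh one for this IP.
            self.counts[ip] = [1, now]
            return True
        if count < self.max_requests:
            self.counts[ip][0] += 1
            return True
        # Over the limit: the server would respond 429 Too Many Requests.
        return False
```

The trade-off is exactly the one mentioned: set `max_requests` too low and legitimate visitors with bursty browsing get blocked, too high and scrapers and DoS traffic slip through.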