I use a WP SEO plugin sitewide, and it can handle robots.txt on a per-site basis.

What I need is a global addition, because I have asshat spiders crawling the network and hitting me with ~90,000 GET requests an hour on content that's not cached.

I found an article by James Farmer (sp) about setting up an MU plugin to serve robots.txt globally:

I'm wondering if anyone has a better solution, or whether this (very old) post is still the right way to go to disallow bots, set crawl frequency, etc.

https://premium.wpmudev.org/blog/wpmu-robotstxt-globally/
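For what it's worth, WordPress core builds its virtual robots.txt through the `robots_txt` filter (applied in `do_robots()`), so a tiny MU plugin can append rules network-wide without any per-site setup. A minimal sketch — the bot name and crawl-delay value here are just placeholders, swap in whatever is actually hammering your logs:

```php
<?php
/**
 * Plugin Name: Global robots.txt rules
 *
 * Drop this file into wp-content/mu-plugins/ and it loads on every
 * site in the network automatically, no activation needed.
 */
add_filter( 'robots_txt', function ( $output, $public ) {
	// Example: block one abusive crawler outright (placeholder name).
	$output .= "\nUser-agent: MJ12bot\n";
	$output .= "Disallow: /\n";

	// Example: ask everyone else to slow down. Note Crawl-delay is
	// honored by some bots (Bing, Yandex) but ignored by Googlebot.
	$output .= "\nUser-agent: *\n";
	$output .= "Crawl-delay: 10\n";

	return $output;
}, 20, 2 );
```

This only works if no physical robots.txt file exists in the web root (a real file bypasses WordPress entirely), and of course bad bots are free to ignore robots.txt — for the truly abusive ones you may still need to block by user agent or IP at the server level.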

Please advise (not a canned response, discuss it, like days of yore!) :slight_smile: