Any way to have a domain-specific equivalent of robots.txt?

Hi,

I have a site on which I want to block indexing by Google Images:

http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35308

The recommended way is via robots.txt, but in a multisite setup can that be applied to just one domain?

Is there an equivalent of a 'virtual' robots.txt, or any other method?
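For context, per the Google help page above, the robots.txt itself would just need `User-agent: Googlebot-Image` / `Disallow: /`. What I'm missing is how to serve that for one domain only. Something like the Apache mod_rewrite sketch below is the kind of thing I had in mind (untested; `example.com` and `robots-example.txt` are placeholder names):

```apache
# Hypothetical sketch: serve a per-domain robots.txt in a multisite install.
# robots-example.txt would contain the Googlebot-Image Disallow rules.
RewriteEngine On
# Only match requests whose Host header is the one domain to restrict
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
# Internally rewrite /robots.txt to that domain's own file
RewriteRule ^robots\.txt$ /robots-example.txt [L]
```

No idea if that's the idiomatic way to do it in a multisite, though, hence the question.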

Your help appreciated, as always.