Can a robots.txt or other bot instruction be created for the Status Plugin?

Google recently rejected my site's submission as a news site. Although there is no way to be certain of the reason, after reading the background materials, it seems the rejection might have been because I supplement original content on the site with excerpts of and links to other articles using the Status Plugin, i.e., "aggregated content." The following paragraph in Google's news site submission guidelines jumped out at me:
"If your site publishes aggregated content, you will need to separate it from your original work, or restrict our access to those aggregated articles via your robots.txt file."

Is there a way to do this with the Status Plugin's posts? Better yet, could this be added as a feature of the Status Plugin?
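For what it's worth, if the plugin's status posts end up under a distinct URL path, a robots.txt rule could block crawlers from just those URLs. This is only a sketch: the `/status/` path below is a placeholder, and the actual prefix would depend on the site's permalink settings and how the plugin stores its posts.

```
# Hypothetical robots.txt entry, assuming status posts
# live under a /status/ URL prefix (adjust to your permalinks)
User-agent: Googlebot-News
Disallow: /status/
```

Targeting `Googlebot-News` rather than `*` would keep the aggregated posts out of Google News while leaving them visible to regular search crawlers.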