[SMART CRAWL] Adding SSL creates an issue with robots.txt

We added an SSL certificate to our site last week, and ever since we have had intermittent issues.
Browsershots.org reports that it cannot access our robots.txt.
1. Could this be caused by a redirect to the insecure (http) version of the domain? If so, what are the correct settings to fix it?
2. What impact will SSL have on robots.txt, and do I need to add any special rules to the robots.txt file to account for the redirect that is in place? Currently the file is just blank.
3. The site is hosted on GoDaddy Managed WordPress Deluxe hosting.
4. The plugin Really Simple SSL is installed.
5. The plugin SmartCrawl is installed.
6. The Google Webmaster Tools robots.txt tester says there are no issues with robots.txt, but is that result reliable given the problem above?
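For reference on question 1, this is the kind of force-HTTPS rewrite I understand plugins like Really Simple SSL typically add to .htaccess. This is an assumed, generic example, not copied from our actual server:

```apacheconf
# Redirect every plain-http request to the https version of the same URL.
# Hypothetical/typical rule for Apache hosting; our real .htaccess may differ.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
</IfModule>
```

If our setup redirects the other way (https back to http), I assume that would explain crawlers failing to fetch https://…/robots.txt.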
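And for question 2, my understanding is that a minimal robots.txt that allows all crawling would look like the sketch below; the same file is served whether the site is reached over http or https, so I assume no SSL-specific rules are needed inside it (example.com is a placeholder for our domain):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```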
Thanks for your help!