I have the strangest problem I have ever had with a robots.txt file.
First, some background:
I built the site on a development server with the "Discourage search engines from indexing this site" option enabled.
Once I moved the site live, I unchecked that option, but Google still sees the robots.txt file as blocking everything.
Next, I manually created a robots.txt file with the proper allow rules, and Google still reports the site as blocked.
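For reference, an allow-all robots.txt is essentially just this (which is what I added):

```
User-agent: *
Disallow:
```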
I even installed a robots.txt WordPress plugin and set it to allow everything, and it still doesn't work.
My questions: how does the robots.txt file work in WordPress? Where do I edit it, at the code or database level? And how can I check whether my file is actually working? Please help, this is killing our site!
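Here's how I've been sanity-checking the rules locally, in case my method is wrong (a quick Python sketch using the standard-library urllib.robotparser; example.com stands in for my real domain, and the rules string is whatever the live /robots.txt currently serves):

```python
from urllib import robotparser

# Paste in the contents of the live robots.txt file
# (fetched from https://example.com/robots.txt).
rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether Googlebot may fetch the homepage under these rules.
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False: still blocked
```

With the allow-all rules (`Disallow:` with no path), `can_fetch` returns True instead. You can also point `rp.set_url(...)` at the live file and call `rp.read()` to test what's actually being served.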