Good morning from sunny Tucson!
Broken links are being worked on as we speak... lots of them. However, my robots.txt file is simply this: User-agent: *
BUT on March 1, 787 links were reported as restricted by robots.txt, and I have since changed the robots.txt file. That file now indicates there should be NO restricted files, right? So why doesn't Google update the result by removing those errors?
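For reference, this is a minimal robots.txt that explicitly allows everything (the blank Disallow line means nothing is blocked; this is just an illustration, not necessarily what was on the server back in March):

    User-agent: *
    Disallow:

A file containing only the User-agent line and no Disallow rules should also permit all crawling, but the explicit form above leaves no room for ambiguity.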
Am I to ASSUME that if Google found a crawl error a week ago and that error still shows the same date, then the error is now fixed? Is there some way to ask Googlebot to recrawl all the files that had errors? I guess I'm asking for a current GOOD or BAD answer, but instead the answer is... well, it WAS bad six days ago. Very frustrating, to say the least.
Comments and help please?