URL crawler 200 error code

The URL crawler from SmartCrawl is reporting issues with 200 URLs and mentions error code 200. What does this mean and how can it be fixed?

The URLs mentioned in the report work properly.

  • Konstantinos Xenos
    • Rubber Duck Debugger

    Hi Mark!

    Your robots.txt had a "Disallow: /" rule in it. This means that crawlers wouldn't be able to index anything on the site.

    If you check your files you'll find a _robots.txt (this is the old one) and a fresh robots.txt that is properly set up for WordPress.

    I've also installed the SmartCrawl Pro version and run a new crawl, so you'll see that the errors are gone now.

    Tell me if you need any further help!

    Regards,
    Konstantinos
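
For anyone who hits the same report, the fix described above comes down to the contents of robots.txt. A minimal before/after sketch is shown here; the exact file contents are an assumption based on the reply, not quoted from the site:

    # Old file (kept as _robots.txt) - blocks every crawler from the entire site
    User-agent: *
    Disallow: /

    # Replacement robots.txt - standard WordPress-friendly rules
    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

After replacing the file, running a fresh crawl from SmartCrawl should clear the flagged URLs, as noted in the reply.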
