[SMARTCRAWL] robots.txt

I see this message in SmartCrawl:


Good news! We located the robots.txt file and search engine crawlers have access to do their job. However, we haven't been able to find any sitemap information in it.
More Info

The robots.txt file is used to tell search engine bots and crawlers what they are allowed to crawl on your site.

During the chat we added this line to the robots.txt file:
Sitemap: https://*****.com/sitemap.xml
but the issue still exists.
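For reference, the check SmartCrawl performs can be approximated by scanning robots.txt for a `Sitemap:` directive. Below is a minimal sketch of that idea (the function name `find_sitemaps` and the `example.com` URL are placeholders, not SmartCrawl's actual code):

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return all sitemap URLs declared in a robots.txt body."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # Split on the first colon only, so the "https://" in the
        # URL value is left intact; the directive name is
        # case-insensitive per the robots.txt convention.
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap" and value.strip():
            sitemaps.append(value.strip())
    return sitemaps

sample = """User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
"""
print(find_sitemaps(sample))  # ['https://example.com/sitemap.xml']
```

If a helper like this finds the directive in the file you edited but SmartCrawl still reports it missing, the scanner is likely being served a stale copy of the file, which matches the caching diagnosis below.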

  • Predrag Dubajic

    Hi Jerrad,

    Hope you're doing well.

    I was doing some tests on your site and noticed that when I edit the robots.txt file I'm unable to see the changes in the browser, even when I try a different browser or incognito mode. The only way I could see the change was to enable a VPN and visit your site from a "different" location.

    So I'm guessing that SmartCrawl is still seeing the old file and that's why it shows the report.

    I don't see any caching enabled on the site, but it does seem to be the cause. Can you get in touch with your hosting provider and check whether any other server-side caching is running, and whether they can disable or clear it? Then give it another scan in SmartCrawl.

    Best regards,
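One way to check for a server-side cache is to fetch robots.txt (e.g. with `curl -I`) and look for cache-related response headers. The sketch below parses raw header text for common caching indicators; the list of header names is an illustrative assumption, and `caching_headers` is a hypothetical helper, not part of SmartCrawl:

```python
# Header names that commonly indicate a caching layer (CDN, Varnish, etc.).
# This list is illustrative, not exhaustive.
CACHE_HINTS = ("age", "x-cache", "cf-cache-status", "x-varnish", "cache-control")

def caching_headers(raw_headers: str) -> dict[str, str]:
    """Return cache-related headers found in a raw HTTP header block."""
    found = {}
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() in CACHE_HINTS:
            found[name.strip().lower()] = value.strip()
    return found

sample = """HTTP/2 200
content-type: text/plain
age: 14400
x-cache: HIT
cache-control: max-age=86400"""
print(caching_headers(sample))
```

A non-zero `age` or an `x-cache: HIT` on the robots.txt response would suggest a cached copy is being served, which the host would need to purge before SmartCrawl can see the updated file.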
