[SmartCrawl Pro] SmartCrawl – Scan timed out and was unable to complete.

I am getting a timeout message when running SmartCrawl – "Scan timed out and was unable to complete."

I have added a max execution time of 3000 to both the wp-config.php and .htaccess files.
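For reference, this is roughly what raising the PHP execution limit looks like in wp-config.php (a sketch only; the value 3000 matches the figure mentioned above, and whether the override takes effect at all depends on the host's PHP configuration):

    <?php
    // wp-config.php — add above the "That's all, stop editing!" line.
    // Attempts to raise the PHP execution limit for WordPress requests.
    // Hosts that lock this setting will silently ignore it.
    ini_set( 'max_execution_time', 3000 );

The .htaccess equivalent is `php_value max_execution_time 3000`, but that directive only works when PHP runs as an Apache module; on PHP-FPM or CGI setups it is ignored or can even trigger a 500 error, which is worth ruling out while debugging.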

  • Nithin
    • Support Wizard

    Hi Geoff Trebilco,

Sorry to hear the Scan isn't working as expected. I checked the header response of your website and it looks fine. We would like to troubleshoot further via your website dashboard to see what's causing this.

Could you please grant support staff access so that we can take a closer look? You can grant access from WPMU DEV > Support > Support Access > Grant Access, or check this manual: https://premium.wpmudev.org/docs/getting-started/getting-support/#chapter-5

Please let us know once you have enabled access so that we can get this sorted. Have a nice day.

    Kind Regards,


  • Kris
    • Support

    Hi Geoff Trebilco

Thank you for granting support access to the site. I was able to replicate this issue: the scan gets stuck at 99%.

Could you please send the data below through our secure contact form at https://premium.wpmudev.org/contact/#i-have-a-different-question so we can debug this further and check for any errors? Make sure that the subject is "I have a different question" and include:

    – Mark to my attention: ATTN: Kris Tomczyk

    – Site access (login url / username / password)

    – FTP access (host / username / password / port)

    – Link back to this thread

Please don't share any sensitive information (i.e. credentials) in the Support Forum, as it is publicly visible and everyone will have access to it.

Please confirm here in the thread once you have sent that message.

Also, could you whitelist the WPMU DEV IPs below on your host's side, if possible:

    Kind Regards,


  • Kris
    • Support

    Hi Geoff Trebilco

Thank you for all the data. I did some additional review and testing, which generated some errors in the SmartCrawl logs. I have passed that info to our SLS team and escalated this issue to them. We will post an update here as soon as more information is available.

    Please keep in mind that our SLS Team deals with more complicated issues, so it may take a little longer for them to reply here. Thank you for your patience while we look into this further.

    Kind Regards,


  • Panos
    • SLS

    Hi again!

Still working on it. One issue seems to be that "Gutenberg Blocks – Ultimate Addons for Gutenberg" outputs an invalid id, leading to a markup problem. However, the crawl is still not completing after fixing that, so I'm waiting for a fresh check of the logs from the API devs.

    In case you are interested, the file I changed for the above issue is :
    where I replaced :

    			<script type="text/javascript" id="
    			<?php echo $key; ?>">

    with:

    			<script type="text/javascript" id="<?php echo $key; ?>">

    You can contact that plugin's support regarding that file so they can double-check it.

    Will keep you posted once I have news.

    Kind regards!

  • Panos
    • SLS

    The one mentioned above was the issue after all. It didn't get fixed immediately after that change, as the crawler was still using cached pages. Good to know it is working on your side too! I would just like to repeat that you can mention the above case to the plugin's developer/support so they can have a look!

    Kind regards!

  • Geoff Trebilco
    • Flash Drive

    I did pass it on to Brainstorm Force, and their response was:

    "Thank you for reaching out to us. We appreciate your efforts in reporting this to us.
    Our team has fixed this issue in our development version. Very soon we will release an update with this.

    Vrunda Kansara"

    Which is good, as I was concerned that an update to the plugin would overwrite your fix.

    Should I now remove the plugin Smartcrawl API Tester?

    Thanks for your support.
