SEO sitemap and checkup issues

Neither the SmartCrawl SEO sitemap for the subsites nor the SEO Checkup scan for the main site is working.

1) SEO Checkup for the main site isn't working. The scan doesn't get past 0-1%, and I don't see any specific errors in the debug logs.

2) SmartCrawl sitemaps are giving blank pages when accessed. On looking further, disabling the "Include Stylesheet With Sitemap" option in SmartCrawl seems to resolve the issue on one of the subsites. However, I noticed anomalies in another subsite's sitemap.

Another subsite's sitemap shows the following error:
No sitemap found. Maybe it's being generated. Please try again later.

  • Kris Tomczyk

    Hi richard

    Hope you are doing well today.

    1) SEO Checkup for the main site isn't working. The scan doesn't get past 0-1%, and there are no specific errors in the debug logs.
    As you mentioned during the chat session, your host will not let you increase max_execution_time, and memory_limit is already at 512M.
    From what I can see on your site, memory_limit is still 256M and max_execution_time is very low (60 seconds). These are very low values, not only for SmartCrawl itself but also for a multisite.
    Please consider asking your host about this again (there is a rough config sketch further down in this reply). With such low values the crawl will keep having trouble finishing the job.

    2) "...No sitemap found. Maybe it's being generated. Please try again later."
    I checked all subsites via FTP and I can see that all the files are generated correctly. For example, the
    /wp-content/uploads/sites/33/ folder includes a sitemap.xml file, but for some reason it cannot be displayed in the browser. One of the subsites does serve its sitemap correctly.
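
    For reference, if your host does allow overrides, the usual place to raise these limits is wp-config.php (or php.ini). The snippet below is only a rough sketch; on managed hosts these values are often locked down server-side, so it may have no effect without their help:

    ```php
    <?php
    // Fragment intended for wp-config.php, above the "stop editing" line.
    // Sketch only; a managed host may still override these values server-side.

    // Raise WordPress' own memory ceilings (front end and admin pages).
    define( 'WP_MEMORY_LIMIT', '512M' );
    define( 'WP_MAX_MEMORY_LIMIT', '512M' );

    // Try to raise the PHP limits directly; ini_set() can fail silently
    // when the host disallows runtime changes.
    @ini_set( 'memory_limit', '512M' );
    @ini_set( 'max_execution_time', '300' ); // seconds; the current value is 60
    ```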

    I'm flagging this issue for our SLS Team (code experts) so that they can dig into this issue further for you. We will post an update here as soon as more information is available.

    Please keep in mind that our SLS Team deals with more complicated issues, thus it may take a little longer for them to reply here. Thank you for your patience while we look into this further.

    Kind Regards,
    Kris

  • richard

    I have requested that the execution time and memory limits be increased again. I will let you know what happens.

    The '/wp-content/uploads/sites/33/' sitemap location could have been generated by my previous SEO plugin. The main site generates its sitemap okay and it displays in the browser just fine. TherapyRoom.Rent generates its sitemap okay, but I am only able to view it in the browser if the wpmudev formatting option is not selected. Not sure about comfortshieldstherapy.com, as I am not able to view it in the browser at all.

    I get timeouts on both the Sitemap Crawl and the SEO Checkup.

  • richard

    This is the reply I got from Pagely. They really do not want to do this. Can this be fixed without increasing the limits?

    I assume you are referring to this SEO checkup process.
    https://therapyroomfinder.boxtreeclinic.com/wp-admin/admin.php?page=wds_checkup

    Just to reiterate Brian's important point before we make any changes, should you insist...

    The memory limit is 512 by default. I would not suggest increasing the max execution time for something like that. It could allow the server to become unstable, and nothing should take more than 60 seconds, which is the default timeout on a production server. You can't increase the timeout for just your plugin; it would apply to all requests being executed.

    Is this to be a one-off scan of the site or a continual, regular process?

    I am unaware of any customers requiring a similar setting. Given that this is a requirement for a single plugin, have you considered alternative SEO plugins? Yoast seems to be the most popular.

    I would strongly recommend considering an alternative plugin rather than changing the memory_limit and timeout settings.

    Please share your thoughts and we can progress accordingly.
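
    For what it's worth, a quick way to confirm which values PHP is actually applying from inside WordPress (Kris saw 256M, Pagely says 512 is the default) is a tiny must-use plugin like the sketch below. The file name and hook choice are arbitrary; it just reports the effective values in the admin area:

    ```php
    <?php
    /**
     * Plugin Name: PHP limits notice (diagnostic sketch)
     *
     * Hypothetical helper, not part of SmartCrawl: drop it into
     * wp-content/mu-plugins/ and it prints the effective limits in wp-admin.
     */
    add_action( 'admin_notices', function () {
        printf(
            '<div class="notice notice-info"><p>memory_limit: %s | max_execution_time: %s seconds</p></div>',
            esc_html( ini_get( 'memory_limit' ) ),
            esc_html( ini_get( 'max_execution_time' ) )
        );
    } );
    ```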

  • richard

    Here is another reply from Pagely. They are suggesting the use of a cron job - their reasoning seems right to me. Richard

    "Just a follow up. Running the SEO checkup took about 20 minutes to complete. This is not something adjusting the max execution time to 150 would solve considering this would require a setting of over 1200. Also the memory limit is most likely not showing correctly but we can adjust that to avoid warnings.

    The process that the SEO plugin is doing should really be adjusted so that it does not require execution that long. If something over 60 seconds is required it would be wise to consider a cron that is running in the background depending on the process they use. Please consider that requests made to a production server should be quick. There are only so many workers processes that can operate while consuming one for a long scan does not seem bad it would allow poor performing requests to lock up the remaining workers. There is no way in our platform to change the max_execution_timeout for just the one request that they use to handle the scan.

    There are many methods that could be used that don't hit timeouts but it would be hard for me to suggest which one you would want to use as I am not a developer.

    Please let us know if you want to consider running a cron for this task."
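
    If the cron route is taken, the WordPress-side scheduling could look roughly like the sketch below. This is only an illustration: the 'wds_run_checkup' hook name is a placeholder, not a real SmartCrawl hook, and the plugin author would still need to supply the actual batched scan callback.

    ```php
    <?php
    /**
     * Sketch of the background-cron idea Pagely describes; not SmartCrawl code.
     * 'wds_run_checkup' is a placeholder hook name.
     */
    add_action( 'init', function () {
        if ( ! wp_next_scheduled( 'wds_run_checkup' ) ) {
            wp_schedule_event( time(), 'daily', 'wds_run_checkup' );
        }
    } );

    add_action( 'wds_run_checkup', function () {
        // Placeholder: start (or resume) the checkup in small batches here,
        // so no single request comes anywhere near the 60-second web timeout.
    } );
    ```

    For true background execution the host would also point a real system cron at wp-cron.php (and set DISABLE_WP_CRON), so the event does not depend on visitor traffic.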

  • Panos

    Hi there richard,

    Regarding the Sitemap, you need to deactivate the Jetpack sitemap option from
    site.com/wp-admin/admin.php?page=jetpack#/traffic
    in the Sitemaps section,
    and also deactivate the Include Stylesheet With Sitemap option in SmartCrawl:
    site.com/wp-admin/admin.php?page=wds_sitemap
    in the Advanced tab.
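
    If you prefer to lock the Jetpack side down in code rather than rely on the dashboard toggle, a small must-use plugin along these lines should work. It is a sketch only and assumes Jetpack's sitemap module uses the 'sitemaps' slug:

    ```php
    <?php
    /**
     * Sketch: keep Jetpack's sitemap module off so it cannot clash with
     * SmartCrawl's sitemap. Assumes the module slug is 'sitemaps'.
     * Drop into wp-content/mu-plugins/ (the file name is arbitrary).
     */
    add_filter( 'jetpack_active_modules', function ( $modules ) {
        return array_values( array_diff( (array) $modules, array( 'sitemaps' ) ) );
    } );
    ```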

    As for the Crawl timing out, we're still looking into that. I have consulted the dev about this and we'll keep you updated.

    Thanks for all your feedback :)

    Kind regards!

  • richard

    Hello - do you have an update on this ticket please? Here is the latest input from my host - Pagely. Richard

    "The cron option would require the plugin author to define something that we can run in cron. We also need to find a reasonable length of time to wait before running it again. It appears to currently take 20 minutes. Does this need to be run often? Is there some scripting that we can use in a cron to do the scan? If not we might be able to mimic the request in the dashboard but that is not as accurate if the plugin updates."
