Closed (won't fix)
Project:
XML sitemap
Version:
7.x-2.0-beta2
Component:
xmlsitemap.module
Priority:
Normal
Category:
Bug report
Assigned:
Unassigned
Reporter:
Created:
20 Apr 2011 at 07:23 UTC
Updated:
29 Nov 2011 at 00:26 UTC
Comments
Comment #1
Ravi.J commented
Although this issue is a month old, here is what I found, for the benefit of others with similar problems.
Strongarm sets the variable cache in hook_init, but XML Sitemap somehow reads the variables before hook_init is called, and as a result always gets NULL.
There is a patch in the Strongarm issue queue that solves this for me: #1062452: strongarm_set_conf() needs to be called sooner.
Comment #2
egarias commented
I am not using Strongarm and I have the same issue:
HTTP 403 access denied
The sitemap stops at page 32. I am unable to update my sitemap, and I don't know where to look.
If there is some way I can help with debugging, I'd be very happy to; I have spent a lot of hours on this and it is a very important issue for me. My site is http://empresa.artic-group.net
Many thanks
Comment #3
Anonymous (not verified) commented
@egarias: I just viewed page=74 on your site. I don't understand what is wrong.
Comment #4
egarias commented
OK.
I ended up restarting again and again via the admin interface to complete all the pages.
You saw page=74 because I modified the code: I keep a known-good copy of the sitemap at a separate path (I call it "xmlsitemapreal") so Google will always find a valid sitemap while I am rebuilding or when there is an error.
I think this manoeuvre could be improved and integrated into the module; what do you think?
Another idea I have in mind is the possibility of regenerating in several batches, so that each cron run processes one batch; when the last cron run finishes everything, the new sitemap is copied over to something like the "sitemapreal" path I just invented.
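The staged-build-then-swap idea above could be sketched roughly like this (a minimal illustration only: the paths and the trivial "build" step are placeholders I invented, not anything the xmlsitemap module actually does):

```shell
#!/bin/sh
# Build the new sitemap in a staging location, and only replace the
# live file once the build has finished, so crawlers never see a
# partial or broken sitemap. Paths here are hypothetical placeholders.
STAGING=/tmp/sitemap-staging.xml
LIVE=/tmp/sitemap-live.xml

# Stand-in for the real regeneration step (e.g. the last cron batch).
printf '<?xml version="1.0"?>\n<urlset></urlset>\n' > "$STAGING"

# Swap only if the staged file is non-empty, i.e. the build completed.
if [ -s "$STAGING" ]; then
  mv "$STAGING" "$LIVE"   # rename on the same filesystem is atomic
fi
```

The point of the swap is that crawlers hitting the live path either get the old complete sitemap or the new complete one, never an in-progress rebuild.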
Thanks
Comment #5
egarias commented
After some manual rebuilds, with lots of restarting and time spent, I built my own sitemap process.
It was easy because my highest priority was an up-to-date sitemap that includes new nodes and is always available to crawlers, so I wrote a small script.
Comment #6
dave reid