I've exported a few XML sitemap settings via Features, and after enabling the module and the features on the production site, I'm getting access denied on node#overlay=admin/config/search/xmlsitemap/rebuild.

Any ideas?

The following variables were exported through features:

xmlsitemap_base_url
xmlsitemap_frontpage_changefreq
xmlsitemap_frontpage_priority
xmlsitemap_settings_node_page
xmlsitemap_settings_node_webform

Comments

Ravi.J’s picture

Although this is a month old, for the benefit of others with similar problems, here is what I found:
Strongarm sets the variable cache in hook_init(), but XML sitemap somehow manages to read the variables before hook_init() is called, and as a result they always return NULL.

There is a patch logged in the Strongarm issue queue that solves this issue for me: #1062452: strongarm_set_conf() needs to be called sooner.

egarias’s picture

I am not using Strongarm and I have the same issue: HTTP 403 Access denied.
The sitemap stops at page 32, I am unable to update my sitemap, and I don't know where to look.
If there's some way I could help with debugging I will be very happy; I am spending a lot of hours on this and it is a very important issue for me. My site is http://empresa.artic-group.net

many thanks

Anonymous’s picture

@egarias: I just viewed page=74 on your site. I don't understand what is wrong.

egarias’s picture

OK.
I ended up restarting the rebuild via the admin interface over and over to complete all the pages.
You saw page=74 because I modified the code

in function xmlsitemap_get_current_chunk(stdClass $sitemap) in xmlsitemap.pages.inc:

   $file = xmlsitemap_sitemap_get_file($sitemap, $chunk);
 + $file = str_replace('xmlsitemap', 'xmlsitemapreal', $file);

so that a known-good sitemap lives in xmlsitemapreal and Google will always find it while I am rebuilding or hitting an error.

I think this maneuver could be improved and integrated; what do you think?

Another idea in my mind is the possibility of regenerating in several batches, so that each cron run regenerates one batch and, at the end, the last cron run finishes everything and copies the new sitemap over to the "sitemapreal" location I just invented.
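The "build in a staging area, then swap into the real location" idea above can be sketched outside of Drupal. Here is a minimal Python illustration; the directory names xmlsitemap and xmlsitemapreal follow the comment above, and everything else (function name, file layout) is a hypothetical placeholder, not actual XML sitemap module code:

```python
import os
import shutil
import tempfile

def publish_sitemap(staging_dir, live_dir):
    """Replace the live sitemap directory with a fully built staging copy.

    Crawlers reading from live_dir never see a half-built sitemap: the
    old copy stays in place until the new one is complete, and the final
    switch is just a pair of renames.
    """
    old_dir = live_dir + '.old'
    if os.path.isdir(old_dir):
        shutil.rmtree(old_dir)        # drop any leftover backup
    if os.path.isdir(live_dir):
        os.rename(live_dir, old_dir)  # keep the previous sitemap as a backup
    os.rename(staging_dir, live_dir)  # the new sitemap goes live

# Example: cron batches write chunk files into staging; the last batch publishes.
work = tempfile.mkdtemp()
staging = os.path.join(work, 'xmlsitemap')      # where the rebuild writes
live = os.path.join(work, 'xmlsitemapreal')     # what crawlers are served
os.makedirs(staging)
with open(os.path.join(staging, '1.xml'), 'w') as f:
    f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>')
publish_sitemap(staging, live)
print(os.path.exists(os.path.join(live, '1.xml')))  # True
```

The point of the design is that a failed or interrupted rebuild only ever damages the staging copy; the served sitemap is replaced atomically at the end.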

Thanks

egarias’s picture

After some manual rebuilds, with lots of restarting and time spent, I built my own sitemap process.
It was easy, because my highest priority was having a sitemap that is up to date with new nodes and always available to crawlers. So I wrote a small script.
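The script itself is not posted in the thread. A minimal sketch of such a standalone generator might look like the following Python; the URL list, changefreq, and priority values are hypothetical placeholders (in practice the URLs would come from a query for published nodes):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls, changefreq='daily', priority='0.5'):
    """Return a minimal sitemap XML document for the given URLs."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append('  <url>')
        lines.append('    <loc>%s</loc>' % escape(url))
        lines.append('    <changefreq>%s</changefreq>' % changefreq)
        lines.append('    <priority>%s</priority>' % priority)
        lines.append('  </url>')
    lines.append('</urlset>')
    return '\n'.join(lines)

# Placeholder URLs; a real script would pull these from the node table.
xml = build_sitemap(['http://example.com/', 'http://example.com/node/1'])
print(xml.count('<url>'))  # 2
```

Writing the file on every cron run (and only overwriting the served copy once the build succeeds) achieves the "always available, always fresh" goal described above without the module's rebuild step.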

dave reid’s picture

Status: Active » Closed (won't fix)