Googlebot IPs have been pounding my website since I upgraded to 7.x-1.0-rc5 (and now rc6), filling my watchdog with failed-URL entries like this:
Type: php
Date: Thursday, July 18, 2013 - 20:39
User: Anonymous (not verified)
Location: http://mysite/biblioalias?f%5Bauthor%5D=101&f%5Btg%5D=T&f%5Bag%5D=B&s=ye...
Referrer:
Message: Notice: Undefined index: s in biblio_build_query() (line 433 of /home/myhome/public_html/d7/sites/all/modules/biblio/includes/biblio.pages.inc).
Severity: notice
Hostname: 66.249.73.53
I expect I could prevent Google from crawling this way with an entry in robots.txt, but what the heck is going on? It's like the Google crawl has just gone crazy. I estimate several million queries like this.
I'm wondering if anyone has seen something similar?
Thanks.
-- Cronin
Comments
Comment #1
rjerome commented:
That was a small error that slipped through with the new code yesterday; it's fixed now.
http://drupalcode.org/project/biblio.git/commit/16cb2b6
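For anyone curious what that notice means before upgrading: it is the classic Drupal 7 pattern of reading a query parameter that a crawler-generated URL may omit. A minimal, hypothetical sketch of the bug and its usual guard (not the actual commit, which is authoritative; the `'author'` default below is an assumption for illustration):

```php
<?php
// Buggy pattern: $_GET['s'] is read unconditionally, so any URL
// without an "s" (sort) parameter raises "Notice: Undefined index: s".
// The usual fix is to test for the key and fall back to a default:
$sort = isset($_GET['s']) ? $_GET['s'] : 'author'; // 'author' is an assumed default
```

Every Googlebot hit on a facet URL missing `s=` would otherwise log one of these notices, which is why the watchdog filled up so fast.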
You should also add the following to your robots.txt file, since the Google bots are just wasting time and resources crawling through all the permutations and combinations of sorting and filtering.
This is in the README.TXT file...
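The README.TXT is the authoritative source for the exact rules. As a rough, hypothetical illustration of their shape (using the /biblioalias path from the log above; a stock install would use the module's default listing path instead):

```
# Hypothetical example only -- see the Biblio module's README.TXT
# for the rules it actually ships. The idea is to keep crawlers out
# of the faceted sort/filter query permutations under the listing path.
User-agent: *
Disallow: /biblioalias?
```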
Ron.
Comment #2
rjerome commented:

Comment #3
cvining commented:
rjerome!
Yes indeed, fixed and fixed!
Like magic! And I notice the fix is in the latest (rc7) release.
Nothing to see here, folks. Please move on...
Sincere thanks!
-- Cronin