I was watching my search hits and noticed that Google was indexing my send pages. That seemed like a clear mistake, so I drew on what was done for print.module (simplified, though) and applied it here:

--- send.module 2007-02-12 14:32:11.000000000 -0500
+++ send.module.new 2007-04-13 12:38:49.000000000 -0400
@@ -213,6 +213,10 @@
return;

case 'body': // callback that modifies appearance of rendered body
+ global $robots_string; if (!isset($robots_string)) {
+ drupal_set_html_head('<meta name="robots" content="noindex" />');
+ $robots_string = true;
+ }
return;

case 'querystring': // callback that will return name=>value array for return links

I don't know whether the options available in the Print friendly pages module are really necessary (here or there). I'm not sure how many people want search engines to index this redundant content; it seems to me that would be a mistake.

Mike

Comments

Allie Micka’s picture

Do you mean it's indexing all of the send-to-friend forms?

Is there any reason we couldn't just add

drupal_set_html_head('<meta name="robots" content="noindex" />')

somewhere in the _send_form() code?
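
For what it's worth, a minimal sketch of what that could look like, assuming the form builder is roughly _send_form() as in send.module (the $nid parameter and the rest of the form structure here are only illustrative, not the module's actual code):

<?php
// Sketch only: assumes a form builder named _send_form(); the parameter
// and the remaining form elements are placeholders.
function _send_form($nid) {
  // Ask crawlers not to index the send-to-friend form itself.
  drupal_set_html_head('<meta name="robots" content="noindex" />');

  $form = array();
  // ... existing form elements (recipient, message, etc.) would follow ...
  return $form;
}
?>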

Allie Micka’s picture

Status: Needs review » Fixed

Should be fixed in HEAD and available in the next release:

* Send links include rel=nofollow
* The send page itself includes an HTTP header that reads X-Robots-Tag: noindex

This should keep all the well-meaning crawlers away!
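
In case anyone is curious about the shape of it, here is a rough sketch of the two measures against the Drupal 5/6-era APIs. The placement, link key, path, and link text below are assumptions for illustration, not the committed send.module code:

<?php
// 1. On the send page callback: an HTTP header equivalent of the robots
//    meta tag, which crawlers honor even without parsing the markup.
drupal_set_header('X-Robots-Tag: noindex');

// 2. In hook_link(): mark the send link rel=nofollow so crawlers don't
//    follow it to the form in the first place.
$links['send_sendtofriend'] = array(
  'title' => t('Send to a friend'),
  'href' => 'node/' . $node->nid . '/send',
  'attributes' => array('rel' => 'nofollow'),
);
?>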

mgifford’s picture

Thanks!

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.