I'm fairly certain that Google will re-index the same page, trying every single value from a drop-down list if you have one as an exposed filter. I'm suggesting that
# URL Variables
disallow: /*list
disallow: /*of
disallow: /*drop
disallow: /*down
disallow: /*url
disallow: /*variables
be added to robots.txt, or at least that a warning be documented somewhere so web admins know how to handle the issue. Spiders might try other values in other URL variables as well. I'm guessing at the robots.txt syntax above — is it correct?
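For what it's worth, a robots.txt group needs a `User-agent` line before its rules, and exposed-filter values normally arrive as query-string parameters, so matching on `?param=` is more targeted than matching bare words in the path (Google supports the `*` wildcard; the directive name is case-insensitive). A sketch of what that could look like — the parameter names below are made-up examples, not anything from this issue:

```
# Block crawling of exposed-filter query variables
# (field_color and sort_by are hypothetical parameter names)
User-agent: *
Disallow: /*?field_color=
Disallow: /*&field_color=
Disallow: /*?sort_by=
Disallow: /*&sort_by=
```

The `?` and `&` variants cover the parameter appearing first or later in the query string.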
Similar yet different issue