This really needed fixing and attention! The stock robots.txt that ships with Drupal shields certain paths from search engines, such as /admin and so on. BUT if you install a multilingual (i18n) setup in your Drupal CMS, every path starts with a language prefix like /en, /es or /fr! This lets robots enter those forbidden paths, and even directories such as files, since with language prefixes the file path is also reachable as /fr/files/images/ etc.
The translation module maintainers should add automatic adaptation of the robots.txt file... For one of my sites it is already too late :( all those paths are indexed by Google now... grr!
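To see why the stock rules fail, remember that robots.txt Disallow rules are plain prefix matches against the URL. A quick sketch of the problem (the /fr prefix here is just an example language):

# Stock rule: blocks /admin/... only
Disallow: /admin/
# The language-prefixed copy of the same section lives at /fr/admin/,
# which the rule above does NOT match, so crawlers may happily fetch it.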
How to fix it manually:
Copy every section of the stock robots.txt except the files section (cron.php and friends) and duplicate it once per added language. The file entries stay as they are, because language-prefixed paths like domain/en/cron.php do not work and do not exist; that is why.
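For reference, the file entries that stay unprefixed look roughly like this (quoted from the stock Drupal robots.txt from memory, so check your own file for the exact set):

# Files - keep these once, with no language prefix
Disallow: /xmlrpc.php
Disallow: /cron.php
Disallow: /update.php
Disallow: /install.php
Disallow: /CHANGELOG.txt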
Write /ca/node with NO "/" at the end. Disallow rules are prefix matches, so /ca/node blocks the bare /ca/node listing as well as /ca/node/2 and the other node paths, whereas /ca/node/ would miss the listing page itself. (If you only want to block the bare listing, Googlebot also understands an end anchor: Disallow: /ca/node$.)
CA stands for Catalan. The list below assumes clean URLs; if your site runs without clean URLs, use the /?q=ca/ form instead (see the sketch after the list).
# Directories CA
Disallow: /ca/database/
Disallow: /ca/includes/
Disallow: /ca/misc/
Disallow: /ca/modules/
Disallow: /ca/sites/
Disallow: /ca/themes/
Disallow: /ca/scripts/
Disallow: /ca/updates/
Disallow: /ca/profiles/
Disallow: /ca/files/
Disallow: /ca/node
# Paths (clean URLs) CA
Disallow: /ca/admin/
Disallow: /ca/aggregator/
Disallow: /ca/comment/reply/
Disallow: /ca/contact/
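Without clean URLs, mirror the same rules in the /?q=ca/ form, following the stock "# Paths (no clean URLs)" section. A sketch for the four paths above (extend it the same way for the rest of your rules):

# Paths (no clean URLs) CA
Disallow: /?q=ca/admin/
Disallow: /?q=ca/aggregator/
Disallow: /?q=ca/comment/reply/
Disallow: /?q=ca/contact/

Google matches these against the path plus the query string; other crawlers may treat rules containing "?" differently, so test the result with a robots.txt checker.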