I have installed the brand-new amazons3 7.x-2.0 release, published last week.

I finally managed to get it working, and I have a CCK file field with a typical remote #upload_location URI: s3://[bucketname]/[subdirectory].

On the node form, the first file I upload works fine. I monitored the file_managed_file_save_upload($element) function, which is called with the proper argument (remote URI): $element['#upload_location'] = 's3://[bucketname]/[subdirectory]'.

But when I click "Add a new file" and select another file, file_managed_file_save_upload() is called with a wrong #upload_location: $element['#upload_location'] = 's3://[subdirectory]'. The bucket name is missing.

The upload then fails with the following message: "The file could not be uploaded."

In order to get the amazons3 module working with this long-awaited new release, I already had to patch several files with contributed patches fixing this exact same missing-bucket-name-in-URI issue.

Is anybody else facing the same problem, and has anyone found a solution? Would the plupload module fix the issue?
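For anyone debugging the same symptom, the wrong value can be confirmed with a throwaway implementation of hook_field_widget_form_alter() that logs what each widget delta receives. This is only a sketch; MYMODULE is a placeholder for your own module's name:

```php
/**
 * Implements hook_field_widget_form_alter().
 *
 * Debug sketch: log the #upload_location handed to each file widget delta,
 * so a missing bucket name shows up in the watchdog log.
 */
function MYMODULE_field_widget_form_alter(&$element, &$form_state, $context) {
  $delta = $context['delta'];
  if (isset($element[$delta]['#upload_location'])) {
    watchdog('MYMODULE', 'Widget delta @delta has #upload_location @uri', array(
      '@delta' => $delta,
      '@uri' => $element[$delta]['#upload_location'],
    ));
  }
}
```

With this in place, the log should show the full s3://[bucketname]/[subdirectory] URI for the first delta and the truncated s3://[subdirectory] URI after clicking "Add a new file".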

Comment | File | Size | Author
#2 | 2498207-2-fix-multiple-upload.diff | 793 bytes | szantog

Comments

kha’s picture

OK, after spending a few hours debugging the code, I have finally identified the problem and come up with a possible solution.

Amazons3 implements an amazons3_field_widget_form_alter() that fixes $element['#upload_location']:

function amazons3_field_widget_form_alter(&$element, &$form_state, $context) {
  $field = $context['field'];
  $instance = $context['instance'];
  $delta = $context['delta'];

  if (in_array($field['type'], amazons3_file_like_field())) {
    $element[$delta]['#upload_location'] = amazons3_field_widget_uri($field, $instance);
  }
}

The problem is that this hook is only invoked for $delta = 0, so it only fixes the first delta instance of the field, which is why the first upload succeeds. The second upload fails because the bucket name is missing from the remote URI.

In order to fix this, I propose the following replacement function in my module:

function amazons3_field_widget_form_alter(&$element, &$form_state, $context) {
  $field = $context['field'];
  $instance = $context['instance'];

  if (in_array($field['type'], amazons3_file_like_field())) {
    // Fix the upload location on every delta, not just the current one.
    foreach (array_keys($element) as $key) {
      if (is_numeric($key)
        && isset($element[$key]['#upload_location'])
        && strpos($element[$key]['#upload_location'], 's3://') !== FALSE) {
        $element[$key]['#upload_location'] = amazons3_field_widget_uri($field, $instance);
      }
    }
  }
}

It works well, allowing multiple uploads to S3 with the 7.x-2.0 release of the module. The next step would be to post a patch to the amazons3 module.
Tell me if you want to make a patch out of it, since I am not familiar with the environment.
Kha

szantog’s picture

The solution seems good; I just made a small fix and organized it as a patch.

szantog’s picture

Version: 7.x-2.0 » 7.x-2.x-dev
Status: Active » Needs review
benjy’s picture

Status: Needs review » Closed (duplicate)