# Summary

S3 File System (s3fs) provides an additional file system for your Drupal site, which stores files in Amazon's Simple Storage Service (S3) or any other S3-compatible storage service. You can set your site to use S3 File System as the default, or use it only for individual fields. This functionality is designed for sites that are load-balanced across multiple servers, as the mechanism used by Drupal's default file systems is not viable under such a configuration.
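
As a reference for the takeover setup, here is a minimal settings.php sketch using the module's documented keys (the same keys appear in the comments further down); the bucket and region values are placeholders:

```php
// settings.php — minimal sketch of pointing Drupal's public file system
// at S3 via s3fs. Key names match the module's settings; the credential
// and bucket values below are placeholders.
$settings['s3fs.access_key'] = getenv('S3ACCESSKEY');
$settings['s3fs.secret_key'] = getenv('S3SECRETKEY');
$config['s3fs.settings']['bucket'] = 'my-example-bucket';
$config['s3fs.settings']['region'] = 'us-east-1';
// Serve the public:// scheme from S3 instead of the local file system.
$settings['s3fs.use_s3_for_public'] = TRUE;
```

With `s3fs.use_s3_for_public` enabled, files written to the public:// scheme are stored in the bucket rather than on the local server.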

# Project URL

https://www.drupal.org/project/s3fs

# Where is the code?

On d.o.

# Estimated completion date

# Dependencies

# Who's doing the port?

# What help do they need?

# D8 roadmap

https://www.drupal.org/project/issues/s3fs?version=8.x

# Background and reference information

Comments

jhedstrom created an issue. See original summary.

jhedstrom’s picture

Priority: Minor » Normal
Status: Active » Needs work
jonathanshaw’s picture

For some simpler use cases, the Flysystem module (which is D8-ready) may provide adequate integration with Amazon S3.
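
For reference, that route is configured entirely in settings.php; a minimal sketch, roughly following the flysystem_s3 project's documented schema (the credential, region, and bucket values here are placeholders):

```php
// settings.php — sketch of an S3 scheme via flysystem_s3 (schema per that
// module's README; credentials and bucket are placeholders).
$settings['flysystem'] = [
  's3' => [
    'driver' => 's3',
    'config' => [
      'key' => getenv('S3ACCESSKEY'),
      'secret' => getenv('S3SECRETKEY'),
      'region' => 'us-east-1',
      'bucket' => 'my-example-bucket',
    ],
  ],
];
```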

satish.p146’s picture

I'm trying to use this module in Drupal 8. After successfully installing it and configuring the credentials, the following fatal error is thrown:

```
Fatal error: Call to a member function getObjectUrl() on null in \modules\contrib\s3fs\src\StreamWrapper\S3fsStream.php on line 474
```
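
That error indicates the stream wrapper's S3 client was never instantiated (the SDK client property is null, typically because the credentials or bucket could not be validated). A standalone sketch of the call involved, assuming nothing about the module's internals beyond the real AWS SDK for PHP v3 method `Aws\S3\S3Client::getObjectUrl()`:

```php
<?php
// Standalone sketch of the failing call. getObjectUrl() only builds a URL,
// so no request is made; the fatal error in s3fs happens when the client
// object itself was never assigned.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client([
  'version' => 'latest',
  'region' => 'us-east-1', // placeholder region
]);

// In S3fsStream the client can be NULL when setup failed; guard before use
// instead of calling a method on null.
if (!$client instanceof S3Client) {
  throw new \RuntimeException('S3 client not initialized; check s3fs credentials and bucket settings.');
}
echo $client->getObjectUrl('my-example-bucket', 'path/to/file.txt');
```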

HbtTundar’s picture

Title: [s3fs] S3 File System » S3 File System drush s3fs-copy-local and also copy local public file to s3 in Admin Config form in action tab not working
Category: Plan » Bug report
Priority: Normal » Major
Status: Needs work » Active
Issue tags: +can not copy files with s3 file sytsm to aws

I configured s3fs in settings.php like this:

```php
/**
 * Config AWS SDK.
 */
$config['awssdk.configuration']['aws_key'] = getenv('S3ACCESSKEY');
$config['awssdk.configuration']['aws_secret'] = getenv('S3SECRETKEY');

/**
 * Config S3 File System.
 */
$settings['s3fs.access_key'] = getenv('S3ACCESSKEY');
$settings['s3fs.secret_key'] = getenv('S3SECRETKEY');
$config['s3fs.settings']['bucket'] = getenv('S3BUCKETNAME');
$config['s3fs.settings']['region'] = getenv('S3REGIONNAME');
$config['s3fs.settings']['use_https'] = TRUE;
$settings['s3fs.use_s3_for_public'] = TRUE;
$settings['s3fs.use_s3_for_private'] = TRUE;
$settings['s3fs.no_rewrite_cssjs'] = FALSE;
$config['s3fs.settings']['encryption'] = 'AES256';
$config['s3fs.settings']['public_folder'] = getenv('PUBLIC_FOLDER');
$config['s3fs.settings']['private_folder'] = getenv('PRIVATE_FOLDER');
$settings['s3fs.use_cname'] = TRUE;
$settings['s3fs.domain'] = $_SERVER['HTTP_HOST'];
$config['s3fs.settings']['root_folder'] = 'XXXXX';
```
Then I use Validate in the Actions tab of the configuration form, and it validates without any problem. But when I try to copy files using the button in the Actions tab, no files are copied and I'm redirected to the home page with this:

```
Notice: unserialize(): Error at offset 19629 of 43474 bytes in Drupal\Core\Batch\BatchStorage->load() (line 71 of core/lib/Drupal/Core/Batch/BatchStorage.php).
No active batch.
```
Also, when I try to run drush s3fs-copy-local, I get this:

```
[warning] You should have read "Copy local files to S3" section in README.txt.
[warning] This command only is useful if you have or you are going to have enabled s3 for public/private in your setting.php

Are you sure? (yes/no) [yes]:
> yes

[error] Your AWS credentials have not been properly configured.
Please set them on the S3 File System admin/config/media/s3fs page or
set $settings['s3fs.access_key'] and $settings['s3fs.secret_key'] in settings.php.
[error] Your AmazonS3 bucket name is not configured.
[error] An unexpected error occurred. Error executing "PutObject" on "https://s3fs-tests-results/write-test-281019-1721.txt"; AWS HTTP error: cURL error 6: Could not resolve host: s3fs-tests-results (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)
[error] It couldn't upload file with public ACL, if you use S3 configuration to access all files through CNAME, enable upload_as_private in your settings.php $settings['s3fs.upload_as_private'] = TRUE;
[error] An unexpected error occurred. Error executing "PutObject" on "https://s3fs-tests-results/public-write-test-281019-1721.txt"; AWS HTTP error: cURL error 6: Could not resolve host: s3fs-tests-results (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)

In S3fsCommands.php line 101:
```
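
The pattern here, with the admin form validating but drush reporting unconfigured credentials and an empty bucket name, suggests the getenv() calls return nothing in the CLI process: variables exported to the web server are not automatically visible to drush. A quick check, assuming drush's php:eval command is available:

```php
// Run as: drush php:eval "var_dump(getenv('S3ACCESSKEY'), getenv('S3BUCKETNAME'));"
// FALSE here means the CLI process cannot see the variables the web server
// uses, which would explain the "credentials not configured" errors above.
var_dump(getenv('S3ACCESSKEY'), getenv('S3BUCKETNAME'));
```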

mmjvb’s picture

Title: S3 File System drush s3fs-copy-local and also copy local public file to s3 in Admin Config form in action tab not working » [s3fs] S3 File System
Category: Bug report » Plan
Priority: Major » Normal
Status: Active » Needs work
Issue tags: -can not copy files with s3 file sytsm to aws

@HbtTundar Please use the issue queue of the project itself. This issue is about the port to D8, not for support requests.

apaderno’s picture

Status: Needs work » Needs review

Since there is an alpha release, the status is now Needs review.

apaderno’s picture

Status: Needs review » Fixed

It has had a stable release since November 2022.

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.