DISCLAIMER - I fully appreciate my issue may not actually be with Advagg!

I recently moved:
/sites/default/files -> S3
database -> RDS
Site files -> EC2

...making the EC2 a lightweight (cheap) and disposable instance. I was expecting a performance improvement from offloading resources from the server to S3 too, but I actually took a significant hit. This was down to S3 not serving compressed files without manual intervention.

That got me frustrated - and wondering: is there a way to configure AdvAgg/s3fs/Drupal to serve compressed content from the local machine, and everything else from S3?

I'm sure someone must have overcome this limitation with AWS/S3 and Advagg, but I can't find anyone explaining how they did it.

Any help/guidance appreciated!

A.

Comments

andylarks created an issue. See original summary.

mikeytown2’s picture

Is this doc helpful? http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/Servin...

What does the status report say? admin/reports/status

andylarks’s picture

Hi Mikey

Thanks for responding. I'm not actually using CloudFront... yet.

In the advagg_css and advagg_js folders (usually in sites/default/files) there is an .htaccess file that I think is responsible for instructing Apache to serve the compressed files. When you use s3fs (which may be more at fault here), public:// is replaced with s3://. That means the advagg_css/js folders move into S3, which also means that the .htaccess file becomes redundant.

However, the .htaccess in the root directory also tells Apache to serve .gz files if the browser accepts them, so I don't really understand why it isn't doing so. The .gz versions of the css/js files are in the same place as the uncompressed versions that are being served. So confused/frustrated right now!
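For what it's worth, here's roughly how I'd check what S3 itself is returning for one of the aggregates (an untested sketch with boto3; the bucket name and key below are just placeholders for my setup):

import boto3

s3 = boto3.client('s3')

# Placeholder bucket/key - substitute the real aggregate path from the page source.
resp = s3.head_object(
    Bucket='my-drupal-bucket',
    Key='advagg_css/css__example__aggregate.css.gz',
)

# If ContentEncoding isn't 'gzip', the browser won't decompress the body
# (or S3 is serving the uncompressed sibling instead).
print(resp.get('ContentType'))
print(resp.get('ContentEncoding'))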

Thanks again

A.

mikeytown2’s picture

S3 doesn't read .htaccess files at all.

AdvAgg will automatically create the .gz files. You need to change the metadata for each one to say it's compressed... http://www.rightbrainnetworks.com/blog/serving-compressed-gzipped-static... The comments below that article also seem helpful.
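The per-object metadata change that article describes can be done with the AWS CLI or an SDK. A rough sketch with boto3 (the bucket and key are placeholders, and the Content-Type depends on the file):

import boto3

s3 = boto3.client('s3')
bucket = 'my-drupal-bucket'  # placeholder
key = 'advagg_css/css__example__aggregate.css.gz'  # placeholder

# S3 metadata can only be changed by copying the object onto itself
# with MetadataDirective='REPLACE'.
s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={'Bucket': bucket, 'Key': key},
    MetadataDirective='REPLACE',
    ContentType='text/css',  # or 'application/javascript' for JS aggregates
    ContentEncoding='gzip',
)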

andylarks’s picture

@mikeytown2 - again, thank you for your help with this.

I've tried going through S3 and applying the metadata settings for each gz file I can find, but I'm still being served uncompressed css/js.

In addition, having to manually set the metadata on a per-file basis isn't a viable option for a busy site.
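(It could presumably be scripted - something along these lines with boto3, sweeping the advagg prefixes after each cache flush - but that still feels like working around the platform. The bucket name and prefixes below are guesses for my setup.)

import boto3

s3 = boto3.client('s3')
bucket = 'my-drupal-bucket'  # placeholder

for prefix, content_type in (('advagg_css/', 'text/css'),
                             ('advagg_js/', 'application/javascript')):
    paginator = s3.get_paginator('list_objects_v2')
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get('Contents', []):
            if not obj['Key'].endswith('.gz'):
                continue
            # Re-copy each .gz aggregate onto itself to stamp the gzip metadata.
            s3.copy_object(
                Bucket=bucket,
                Key=obj['Key'],
                CopySource={'Bucket': bucket, 'Key': obj['Key']},
                MetadataDirective='REPLACE',
                ContentType=content_type,
                ContentEncoding='gzip',
            )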

I'll confess I hedged my bets a little and opened a corresponding issue in the s3fs issue queue: https://www.drupal.org/node/2884196#comment-12118567

...where it was pointed out that you can opt for "Amazon Simple Storage Service", which leaves the css/js intact on the local server. As these are regenerated by the site, they are disposable, which gives the desired result.

I still feel it would be a more elegant solution to have everything coming out of S3, but I'm also beginning to concede that this may not be possible...

A.

mikeytown2’s picture

Any update on this?

mikeytown2’s picture

Status: Active » Fixed

Marking as fixed due to lack of feedback.

Abhishek Jain’s picture

Hi,
Any updates/resolution on this issue?
We are still facing the same issue using AWS S3. It seems it's more of an S3 issue.
Our gzip files are being generated in S3, but we are unable to serve the gzipped css and js.

mikeytown2’s picture

This seems like an S3 issue. If you are using CloudFront, it is not an issue.

Status: Fixed » Closed (fixed)

Automatically closed - issue fixed for 2 weeks with no activity.