This is a follow-on from #100516 - CSS preprocessor (and, originally, #81835), which is a patch to aggregate multiple CSS files into a single (cached) file.

This patch (which should be applied on top of the #100516 patch):

  • Adds an option to the settings to gzip this cached file.
  • If enabled, it gzips and saves a .css.gz file in addition to the regular .css file and...
  • Adds a .htaccess rule to use this file if the browser accepts gzip and the gz file exists.
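The rule in the third bullet could look something like this (a sketch of the idea only, not the actual patch; the exact conditions and flags are assumptions):

```apache
# Sketch only. If the browser advertises gzip support and a pre-compressed
# sibling of the requested stylesheet exists, serve the .gz file instead.
<IfModule mod_rewrite.c>
  RewriteEngine on
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}.gz -f
  RewriteRule ^(.*)\.css$ $1.css.gz [QSA,L]
</IfModule>
```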

As you can see, it is a very simple addition, but it should be very valuable for heavy sites. Apart from the time saved for users, there is a 16 KB bandwidth saving per page load too - potentially several GB of bandwidth a year for a site with several hundred thousand visitors.

I currently have this flagged as a feature request for 5.0, but it could very well be considered a usability enhancement.

Here are my original benchmarks:
-----------------------------------------

          HEAD
          Total       Transfer Duration       Page Duration
Test 1    12798       8756                    8454
Test 2    12984       8907                    8162
Test 3    12730       8718                    7966
Average   12837       8794                    8194
Baseline  100%        100%                    100%
                     
          conditional_css_include_2.patch
          Total       Transfer Duration       Page Duration
Test 1    10633       7111                    6828
Test 2    11225       7332                    7530
Test 3    11047       7639                    7357
Average   10968.33    7361                    7238
Faster By 15%         16%                     12%
                     
          cache_19.patch
          Total       Transfer Duration       Page Duration
Test 1    9160        5453                    5175
Test 2    9417        5712                    4961
Test 3    9056        5306                    5290
Average   9211        5490                    5142
Faster By 28%         38%                     37%

          conditional_css_include_2.patch AND cache_19.patch
          Total       Transfer Duration       Page Duration
Test 1    9595        5615                    5333
Test 2    9909        5709                    5425
Test 3    9461        5824                    5537
Average   9655        5716                    5432
Faster By 25%         35%                     34%

          cache_19.patch AND gzip
          Total       Transfer Duration       Page Duration
Test 1    7429        3495                    2741
Test 2    7027        3306                    2758
Test 3    6915        3267                    3001
Average   7124        3356                    2834
Faster By 45%         62%                     65%

          conditional_css_include_2.patch AND cache_19.patch AND gzip
          Total       Transfer Duration       Page Duration
Test 1    7489        3157                    2395
Test 2    6699        3107                    2364
Test 3    7250        3013                    2255
Average   7146        3092                    2338
Faster By 44%         65%                     71%

* Testing was done locally, to eliminate the variable latency you get on live networks. The Charles web debugging proxy was used to apply a consistent throttle and latency equivalent to a typical 64Kbps connection.
* Browser caching was disabled completely, to simulate an initial page load. I can repeat, with caching enabled (to test http cache freshness checks) if that would be useful.
* Drupal page caching (css caching for #100516) was enabled, and the test was done as an anonymous user. All Drupal modules were enabled (a few of these would more likely be contrib modules in reality). The page cache was cleared before each set of tests, and 2 dummy reloads were made before starting timing.
* The first column 'Total' is the total time spent transferring data, as reported by Charles. This does not include the time saved by browser pipelining of http requests.
* The second column 'Transfer Duration' is the duration between the first byte transfer and the last byte transfer, as reported by Charles.
* The third column 'Page Duration' is the time between the end user hitting refresh and Firefox finishing building the page. This is occasionally less than the transfer duration, which is a little odd, but perhaps certain page graphics (favicons maybe?) are not included in the page build time.

Note that these percentages are compared to the times in my original HEAD benchmark (with no patches) at http://drupal.org/node/100516#comment-162025
Please read the notes on that comment if you have not done so already. Also note that all the conditional CSS includes patch is doing here is somewhat reducing the cached CSS size, because none of the conditions are met on the front page.

Here is a summary:

Test Set                     Faster By:  Seconds Saved:
HEAD                         100%        0
conditional                  12%         1
caching                      38%         3
conditional + caching        34%         3
caching + gzip               65%         5.3
conditional + caching + gzip 71%         5.8

Looking at this I am actually wondering if we should be looking at including gzipped css (and js) in core - if not for 5.0 - then certainly in 6.0. While the percentages are skewed by the fact that my test page is pretty lightweight (i.e. not much content, and no user added images) the seconds saved are real, actual seconds that would be saved by a user on a 64Kbps connection, and would be saved no matter what is added to the site and theme in the way of content and additional images. Also note that the conditional includes patch has now been committed to HEAD, but wasn't when I did my initial benchmarks - I am keeping it separate here for easier comparison.

Bearing in mind the statistic that most users will only wait 4 seconds before going to a different site, the application of aggregation/caching and gzip can take the initial page load time from a very poor 7 or 8 seconds to a very respectable 2 seconds. The difference between these patches is extremely noticeable; the site goes from feeling pretty sluggish to appearing extremely fast.


Comments

lennart’s picture

Personally, I think this is critical. Initial page load time is a critical factor for first time visitors. This one almost cuts the initial load time in half. Very significant!

Owen Barton’s picture

I just noticed a typo in the benchmark tables - 'Faster By' for HEAD should of course be 0% (or N/A!), not 100%.

moshe weitzman’s picture

Status: Needs review » Needs work

we have outstanding bugs on our gzipped page cache (if first person to view a page has gzip disabled, we still cache that page and send wrong headers). i'd rather see that resolved before we add more gzip.

the implementation here is a bit more interesting since the logic is in .htaccess. is there no performance penalty for this .htaccess check on every request?

also, don't we have to tell the browser that we are sending gzip in the response headers? are we expecting browser to know because filename ends in .gz?

the #description doesn't explain anything about gzip

m3avrck’s picture

Version: 5.x-dev » 6.x-dev

Subscribing... this should handle CSS and JS files...

Owen Barton’s picture

There is a different (better, I think) approach in my sandbox: http://cvs.drupal.org/viewcvs/drupal/contributions/sandbox/grugnog/gzip_... - which should deal with css, js and a few other things too.

I think moshe was right about the headers (in terms of following specs), so the .htaccess option is probably out (although it did work cross-browser).

m3avrck’s picture

With 4 weeks to go, let's see if we can get this in :-)

Owen, did you want to start with a patch based off of your module? If not, I can try and post one in the next few days.

catch’s picture

Version: 6.x-dev » 7.x-dev
Wim Leers’s picture

Subscribing.

Wim Leers’s picture

Title: Gzip aggregated CSS to cut 2 seconds off initial load » Gzip aggregated CSS and JS

Since JS aggregation is now in Drupal 6 core as well, I suggest we tackle both CSS and JS aggregation here.

dharamgollapudi’s picture

subscribing...

Leeteq’s picture

Subscribing.

BioALIEN’s picture

This had huge potential but unfortunately didn't make it into 6.x. What's holding up the progress on this patch? Is it because of the outstanding bugs on gzipped page cache in core?

Also I agree with Grugnog2 in #5 about the use of the .htaccess method; while it is browser compatible, Dries highlighted his desire to make core compatible with lighttpd and other web servers. We should give some consideration to those too if possible.

oNyx’s picture

On my site a typical page is around 100kb in size. About 75kb of that are aggregated js (50kb) and css (25kb) files, which can be compressed down to about 25kb (with stripped comments 17kb).

Even without stripping comments it would cut it in half for first time visitors. And you can get a whole torrent of those from social bookmarking sites. My puny little page gets about 3000 of those occasionally and I'm currently on some cheap shared host. So, it generates about 300mb instead of 150mb. That's a pretty big difference if you ask me.

Since the number of visitors is constantly rising I really would love to see that kind of thing in core ASAP. Otherwise I'll need to switch to some more expensive hosting plan a few months sooner than actually necessary.

alanburke’s picture

subscribing

chirale’s picture

This topic is pretty hot nowadays. I read two related discussions on contrib modules for 5.x: one minifies the aggregated JavaScript file, and the other is about a "static" gzipping method. Today I started using both approaches combined, and the result is fine. Since aggregation is a core task, I leave this to a small bash script run as a cron job. Should the JSMin patch to Javascript Aggregator be reused here?

oNyx’s picture

Got tired of waiting and just added it to my installation. Only requires 14 extra lines in .htaccess and 2 in common.inc. These 2 lines need to be added again after updating, but imo it's totally worth the hassle.

The step by step guide is over here:
How to GZip Drupal 6.x's aggregated CSS and JS files

Susurrus’s picture

It seems that Zlib support isn't standard for a PHP installation, at least not on non-Windows OSes. Can someone confirm this?

If this is the case, it might stand in the way of adding this.

oNyx’s picture

Surprisingly you're correct. It's silly, but there are indeed a few installations without Zlib support.

If it's implemented as an option, it would be a good idea to guard the setting (defaults to off) with a function_exists('gzencode') condition.

Fortunately the level parameter for gzencode has existed since PHP 4.2, and Drupal requires 4.3.5+. So, at least that part won't be an issue.
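The guarded setup described above can be sketched as follows (the helper name is made up for illustration; nothing here is from an actual patch):

```php
<?php
// Sketch only: compress data when zlib's gzencode() is compiled in,
// otherwise fall back to the plain data (fail safe). Helper name is
// hypothetical, not part of any posted patch.
function maybe_gzip($data) {
  if (function_exists('gzencode')) {
    // Level 9 = maximum compression; FORCE_GZIP emits a gzip container
    // rather than raw deflate, which is what browsers expect for
    // Content-Encoding: gzip.
    return gzencode($data, 9, FORCE_GZIP);
  }
  return $data;
}
```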

Susurrus’s picture

Well, Drupal 7 requires 5.2+, so that's definitely not an issue. I would think this would need to be implemented like Clean URLs, where the option to enable gzip encoding would be disabled if the function doesn't exist with an explanation of why.

How about a reroll as an option in the Performance settings page?

lilou’s picture

subscribe.

AltaVida’s picture

subscribe

Owen Barton’s picture

Status: Needs work » Needs review
FileSize
3.12 KB

Here is a patch, based somewhat on the original patch I posted, but with improvements based on http://kaioa.com/node/78 and http://drupal.org/node/290280

Here is how this should work
* It should fail safe to the regular uncompressed aggregated CSS or JS in almost every situation: if gzencode is not available (e.g. on very cut-down Windows PHP builds), if mod_rewrite is not available or not enabled, if the .gz file is not found, or if the browser does not accept gzip-encoded data.
* It sends both the correct file extension (i.e. we don't ask the browser to request a .gz file, but just rewrite the request behind the scenes) and also the correct MIME type for the file.
* It does not involve any kind of PHP bootstrap at all - everything is done at the Apache level. In addition, the .htaccess files are created dynamically within the files/X directories, meaning that we don't invoke additional rewrite rule tests (a concern of Moshe in #3) or clutter up an already complex /.htaccess.
* There is no additional user interface cruft. We don't have options for core page level gzipping and I don't see any need for them here, as long as we give it sufficient testing cross-platform. This is a no-brainer.

I have tested this on my system and it worked very well, on the first run in fact (the general approach is also quite battle hardened on many production sites).

If you want to test this out, apply the patch and then check the response headers using the "Net" tab in Firebug or a similar tool for both .css and .js files. You should see "Content-Encoding: gzip" in there and also a noticeable drop in file size.

If someone would like to write some tests to polish this off that would be a valuable contribution.

RobLoach’s picture

Status: Needs review » Needs work

Is there any reason why we're checking function_exists with gzencode? Drupal 7 requires at least PHP 5.2, so wouldn't it be safe to assume it's available?

As for tests, I'd love those to go in with this patch, but won't have the chance to hit it up until next week because I'll be off on vacation. Otherwise, great work! Also, does that clear cache functionality remove the respective .htaccess file?

Owen Barton’s picture

@Rob Loach

We are testing function_exists with gzencode for the same reason we do this with the core page gzipping code. Zlib support for PHP is extremely common, but is actually not enabled in a default vanilla compile of PHP (even in 5.2), so it is quite possible that it will not be available - "Zlib support in PHP is not enabled by default. You will need to configure PHP --with-zlib[=DIR]" from http://www.php.net/manual/en/zlib.installation.php

If you have a chance to do some tests that would be awesome!

The cache clearing functions do not remove the .htaccess file - all dot files are excluded. This shouldn't matter either way, since the file would be recreated anyway and is harmless if no .gz files are present. Possibly we might want to add a line to delete the .htaccess file at the point when aggregation is disabled (although it causes no issue if it is there), or perhaps even the js/css directory in its entirety (on the basis that it is cruft and could be confusing to a newbie).

catch’s picture

We do have a UI for core page gzipping - see 'page compression' at admin/settings/performance - and it's a useful thing to have if you're using mod_deflate/mod_gzip etc. We could reword the interface text there and use the same variable for both things though - seems like it'd be the same reason for switching one off as switching off the other.

janusman’s picture

subscribing

kenorb’s picture

I don't quite understand why Drupal doesn't gzip all other pages when the browser supports gzip (I'm not talking about JS and CSS).
I've tried to find this, but without result. Does somebody know a link to the topic?

RobLoach’s picture

I think this should be managed solely through contrib once #352951: Make JS & CSS Preprocessing Pluggable is in.

btully’s picture

subscribe

mikeytown2’s picture

Update: Contributed modules can now do this for 6.x. Once the code freeze is in place (early September), porting these to 7.x shouldn't be that hard.
http://drupal.org/project/javascript_aggregator
http://drupal.org/project/css_gzip

andypost’s picture

Both modules exclude Safari (WebKit), so first the projects from #30 should be fixed

Safari (webkit) bug https://bugs.webkit.org/show_bug.cgi?id=9521

So there are 2 solutions:

1) make filename.css into filename.css.gz.css

2) alternatively, store filename.css (gzipped) and a plain filename.nogzip.css

Both work with Safari 3 and Google Chrome

<IfModule mod_rewrite.c>
    RewriteEngine On
# Konqueror and "old" browsers
    RewriteCond %{HTTP:Accept-encoding} !gzip [OR]
    RewriteCond %{HTTP_USER_AGENT} Konqueror
    RewriteRule ^(.*)\.(css|js)$ $1.nogzip.$2 [QSA,L]
</IfModule>

<IfModule mod_headers.c>
    Header append Vary User-Agent
# for all  css/js files setup Content-Encoding
    <FilesMatch .*\.(js|css)$>
	Header set Content-Encoding: gzip
	Header set Cache-control: private
    </FilesMatch>
# reset Content-Encoding if not archive
    <FilesMatch .*\.nogzip\.(js|css)$>
	Header unset Content-Encoding
    </FilesMatch>
</IfModule>
lilou’s picture

Issue tags: +Performance

Add tag.

mikeytown2’s picture

andypost’s picture

mikeytown2’s picture

Issue tags: -Needs backport to D6

Been thinking about the safari/webkit bug... Because of the rewrite rules, the browser only sees it as *.css, while on the file system it is *.css.gz. Changing the extension around would have no effect; in short, it's not a bug, because the HTML doesn't reference *.css.gz - it still points to *.css (the bug report deals with *.css.gz in the HTML code). With that in mind this is a fairly straightforward patch for D7; the question is, will it be accepted if I write one?

There are 3 functional changes that need to take place
http://api.drupal.org/api/function/drupal_build_css_cache/7 - write *.css.gz files
http://api.drupal.org/api/function/drupal_build_js_cache/7 - write *.js.gz files
http://api.drupal.org/api/function/system_performance_settings/7 - add gzip compression settings

Add this to .htaccess, inside <IfModule mod_rewrite.c> right above # Rewrite URLs of the form 'x' to the form 'index.php?q=x'.

  <FilesMatch "\.(css.gz)$">
    AddEncoding x-gzip .gz
    ForceType text/css
  </FilesMatch>
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}.gz -f
  RewriteRule ^(.*)\.css $1.css.gz [L,QSA]
  
  <FilesMatch "\.(js.gz)$">
    AddEncoding x-gzip .gz
    ForceType text/javascript
  </FilesMatch>
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}.gz -f
  RewriteRule ^(.*)\.js $1.js.gz [L,QSA]
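The drupal_build_css_cache()/drupal_build_js_cache() changes listed above boil down to writing a pre-compressed sibling next to each aggregate, which the rewrite rules then pick up. A rough sketch (the helper name is hypothetical; the real change would live inside the cache-building functions):

```php
<?php
// Sketch only: write "$path.gz" next to an existing aggregated file.
// Helper name is hypothetical, not from any posted patch.
function write_gzip_sibling($path) {
  if (!function_exists('gzencode')) {
    // No zlib: the .htaccess rules simply never find a .gz file to serve.
    return FALSE;
  }
  $data = file_get_contents($path);
  // Level 9 = maximum compression; FORCE_GZIP emits a gzip container.
  return file_put_contents($path . '.gz', gzencode($data, 9, FORCE_GZIP)) !== FALSE;
}
```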
chx’s picture

khm.

 AddOutputFilterByType DEFLATE text/css application/x-javascript text/html text/plain text/xml
mikeytown2’s picture

@chx, that compresses on the fly for every request, so it's slightly slower. Don't forget about level 9; the default is 6 if I remember correctly.

AddOutputFilterByType DEFLATE text/css application/x-javascript text/html text/plain text/xml
DeflateCompressionLevel 9

Core caches gzipped HTML, so it's not a way-out-there request; plus, having settings in the UI is a nice usability feature. It's something most people want IMHO. Great point BTW.

hass’s picture

#35 doesn't work in IIS.

mikeytown2’s picture

How does IIS support conditional gzip, so that only the clients that support gzip get gzipped content?

andypost’s picture

@hass IIS rewrite rules are possibly different

hass’s picture

This is not possible in IIS (e.g. with Helicon ISAPI_Rewrite, and not at all without paid plugins) and is therefore no general solution:

<FilesMatch "\.(css.gz)$">
    AddEncoding x-gzip .gz
    ForceType text/css
</FilesMatch>
hass’s picture

@mikeytown2: IIS does not have any gzip compression feature built in for dynamic files (only for static HTML files). You need to buy extra plugins to achieve this - if you think you really need it...

mikeytown2’s picture

Shrunk the gzip .htaccess rules. Escaped all the periods for a possible performance increase in pattern matching, since an unescaped . means any character. Anyone want to test the escaped-period hypothesis?

Option 1

  <FilesMatch "\.gz$">
    AddEncoding x-gzip \.gz
  </FilesMatch>
  #skip gzipped css/js files if browser doesn't accept gzip encoding
  RewriteCond %{HTTP:Accept-encoding} !gzip
  RewriteRule .* - [S=2]
  #CSS
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule (.*)\.css$ $1\.css\.gz [L,QSA,T=text/css]
  #JS
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule (.*)\.js$ $1\.js\.gz [L,QSA,T=text/javascript]

Option 2

  <FilesMatch "\.gz$">
    AddEncoding x-gzip \.gz
  </FilesMatch>
  #CSS
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule (.*)\.css$ $1\.css\.gz [L,QSA,T=text/css]
  #JS
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule (.*)\.js$ $1\.js\.gz [L,QSA,T=text/javascript]

Option 3 (no escaped periods)

  <FilesMatch "\.gz$">
    AddEncoding x-gzip .gz
  </FilesMatch>
  #skip gzipped css/js files if browser doesn't accept gzip encoding
  RewriteCond %{HTTP:Accept-encoding} !gzip
  RewriteRule .* - [S=2]
  #CSS
  RewriteCond %{REQUEST_FILENAME}.gz -s
  RewriteRule (.*)\.css$ $1.css.gz [L,QSA,T=text/css]
  #JS
  RewriteCond %{REQUEST_FILENAME}.gz -s
  RewriteRule (.*)\.js$ $1.js.gz [L,QSA,T=text/javascript]

Which way would be preferred: 1, 2 or 3?

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
2.69 KB

Gzip setting controlled by page_compression. Trying to make this as simple as possible.

hass’s picture

What will happen if the server does not support gz compression or any of these Apache rewrite rules?

Owen Barton’s picture

FileSize
3.99 KB

The attached patch includes the same conditional checks as page gzipping, so it should fail safe where PHP has no zlib. The .htaccess rules will only trigger if the .gz files exist, so nothing should break there.

The only thing I was wondering is whether the rewrite rules for this should also be within an <IfModule mod_mime.c> block (which could also contain the MIME rule) - because if rewrite is enabled but mime isn't, it seems there is a possibility of the type being incorrect, right?

hass’s picture

This fails on IIS:

+<FilesMatch "\.js\.gz$">
+  ForceType text/javascript
+</FilesMatch>

But this will work:

+  # Serve gzip compressed js files
+  RewriteCond %{HTTP:Accept-encoding} gzip
+  RewriteCond %{REQUEST_FILENAME}\.gz -s
+  RewriteRule (.*)\.js$ $1\.js\.gz [L,QSA,T=text/javascript]

In such a case the ForceType may not be sent to the browser, but the compressed file is... I'm not sure if this could cause issues!?

mikeytown2’s picture

Due to Apache weirdness I'm setting the type twice: once globally in the FilesMatch argument, the other time with T=text/javascript in the RewriteRule. That seems to work 100% of the time on various versions of Apache. Does IIS work with T= ?

hass’s picture

mikeytown2’s picture

FileSize
4.13 KB

hass, good to hear that works. Here's a slight adjustment to #46: I adjusted the regular expression for the rewrite rule so it compares from the start instead of from the end, so it should always work with *.css?5 and things like that. The reason I need both T=MIME-type and the FilesMatch argument is that the newest version of Apache doesn't work with T=MIME-type; well, you can get it to work if you do some ugly tricks, so FilesMatch is the best way to get around the Apache inconsistencies.
http://httpd.apache.org/docs/trunk/mod/mod_rewrite.html#rewriterule

'type|T=MIME-type' (force MIME type)
Force the MIME-type of the target file to be MIME-type. This can be used to set up the content-type based on some conditions. If used in per-directory context, use only - (dash) as the substitution, otherwise the MIME-type set with this flag is lost due to an internal re-processing.

http://httpd.apache.org/docs/2.2/mod/mod_rewrite.html#rewriterule

'type|T=MIME-type' (force MIME type)
Force the MIME-type of the target file to be MIME-type. This can be used to set up the content-type based on some conditions. For example, the following snippet allows .php files to be displayed by mod_php if they are called with the .phps extension:

Status: Needs review » Needs work

The last submitted patch failed testing.

mikeytown2’s picture

Status: Needs work » Needs review

Another reason why this should be in core
http://code.google.com/speed/articles/gzip.html

Above patch should still be good... re-submitting it.

Status: Needs review » Needs work

The last submitted patch failed testing.

andypost’s picture

Status: Needs work » Needs review

I suppose the bot was broken, so re-testing

meba’s picture

Works for me. What happens if the user doesn't change .htaccess, as this file is usually kept on upgrade? I understand that nothing will happen for him and he won't see a difference.

Can the new rewrite rules negatively affect performance?

mikeytown2’s picture

Status: Needs review » Reviewed & tested by the community

@meba
if the .htaccess file is not changed, then nothing happens; gzipped files do not get served. The performance impact is minimal. I haven't done any benchmarking though; it's hard for me to get access to an idle computer (my benchmarking results would be very unreliable for this small a change).

meba’s picture

Status: Reviewed & tested by the community » Needs review

Would it be possible to decide whether to serve the gzipped file directly in Drupal? We should have the Accept-Encoding header available when printing styles.

mikeytown2’s picture

I don't understand your question, but here's an answer that hopefully addresses what you were trying to ask.

The new rules look to see if a .css.gz file exists; if it does, that gets sent to the browser. If aggregation is turned off and you compressed the CSS files yourself, it will serve those as well (.css.gz files in themes, etc.). If aggregation is on, printed styles get aggregated, and thus they get gzip compressed.

meba’s picture

I know how it works. What I mean is that we already check for gzip encoding capability in bootstrap.inc.

Why don't we just use that information (make return_compressed drupal static?) or simply check again in drupal_get_css and link to gzipped files without a need for additional rewrite rules?

bootstrap.inc:

  $return_compressed = $page_compression && isset($_SERVER['HTTP_ACCEPT_ENCODING']) && strpos($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip') !== FALSE;

drupal_get_css():

    if ($is_writable && $preprocess_css) {
      // Prefix filename to prevent blocking by firewalls which reject files
      // starting with "ad*".
      $filename = 'css_' . md5(serialize($types) . $query_string) . '.css';
// COULD RETURN GZIPPED FILEPATH IF SUPPORTED?
      $preprocess_file = drupal_build_css_cache($types, $filename);
      $output .= '<link type="text/css" rel="stylesheet" media="' . $media . '" href="' . file_create_url($preprocess_file) . '" />' . "\n";
    }

mikeytown2’s picture

Your proposal would make the page cache more complicated: it would require 2 different HTML sets, one with style.css and another with style.css.gz. It would also be incompatible with older WebKit browsers (Apple Safari).

andypost’s picture

Status: Needs review » Reviewed & tested by the community

@meba anyway we need to send a specific header for the CSS, so this is produced by the rewrite rules

Agree with #60; aggregation could be changed, but all cached pages, CSS and JS are always cleared by system_clear_cache_submit at admin/settings/performance

But UI changes are for a different issue

hass’s picture

Has one of you verified this in IIS with Helicon's ISAPI_Rewrite? We are trying to do the same in our non-Drupal app and it's not working.

May be related to missing IIS support of:

AddEncoding x-gzip .gz
mikeytown2’s picture

@hass
Can you send out custom content encoding types in IIS? I don't use IIS so you're sorta on your own.

hass’s picture

We tried to do the same in our ColdFusion application on IIS6, but we haven't found a solution yet. I haven't found the time to test the above patch, but after our tests I expect this will result in a broken Drupal site (the gzipped CSS/JS files are sent to the browser without gzip encoding).

The ISAPI_Rewrite solution from Helicon-Tech does not have the ability to send response headers. Helicon "Ape" seems to have such a feature, but it is only implemented well on IIS7, plus I do not have "Ape". Not sure if there are other ISAPI add-ons for IIS available with such functionality...

andypost’s picture

So maybe it's better to return to the different naming: replace css.gz with gz.css

hass’s picture

How should this help?

andypost’s picture

I have no environment to test rewriting with IIS

sun’s picture

Status: Reviewed & tested by the community » Needs work

This needs to be togglable separately from page compression. People might want to use different methods for compressing page requisites/support files (such as mod_deflate).

andypost’s picture

UI rework is here #459980: Rework admin/settings/performance

But this issue is mostly about CSS/JS compression

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
4.81 KB

@hass
Have you looked at The Microsoft URL Rewrite Module for IIS 7.0

@sun
Good point. I was kind of hoping that #459980: Rework admin/settings/performance would be in by now... anyway here is the patch with more fine-grained control.

mikeytown2’s picture

FileSize
4.83 KB

here's one that defaults to FALSE just in case that is the desired way to do this.

sun’s picture

+++ .htaccess	8 Aug 2009 03:21:52 -0000
@@ -7,6 +7,16 @@
+# Gzip compressed css files are of the type 'text/css'.
...
+# Gzip compressed js files are of the type 'text/javascript'.

"CSS" and "JS" should be written all-upper-case here.

+++ .htaccess	8 Aug 2009 03:21:52 -0000
@@ -89,6 +105,16 @@ DirectoryIndex index.php index.html inde
+  RewriteRule ^(.*)\.css $1\.css\.gz [L,QSA,T=text/css]
...
+  RewriteRule ^(.*)\.js $1\.js\.gz [L,QSA,T=text/javascript]

I see a conversion from .* to .*.gz, but where are those (page) requests handled?

+++ includes/common.inc	8 Aug 2009 03:21:54 -0000
@@ -2627,6 +2627,10 @@ function drupal_build_css_cache($css, $f
+    if (variable_get('css_gzip_compression', TRUE) && function_exists('gzencode') && zlib_get_coding_type() == FALSE) {
@@ -3324,6 +3328,10 @@ function drupal_build_js_cache($files, $
+    if (variable_get('js_gzip_compression', TRUE) && function_exists('gzencode') && zlib_get_coding_type() == FALSE) {

Oh my - with this patch applied, we have this (complex) check at several places throughout core.

At the very least, we want to add a comment what is being checked here and more importantly, why.

Ideally, though, we want to create a helper function that performs those conditions centrally and just returns the result.

+++ modules/system/system.admin.inc	8 Aug 2009 03:21:55 -0000
@@ -1373,6 +1373,13 @@ function system_performance_settings() {
+  $form['bandwidth_optimizations']['css_gzip_compression'] = array(
...
+    '#disabled' => !variable_get('preprocess_css', 0),
@@ -1381,6 +1388,13 @@ function system_performance_settings() {
+  $form['bandwidth_optimizations']['js_gzip_compression'] = array(
...
+    '#disabled' => !variable_get('preprocess_js', 0),

This means I can't enable bandwidth optimizations without also enabling preprocessing optimizations? That's ugly UX.

23 days to code freeze. Better review yourself.

mikeytown2’s picture

FileSize
5.01 KB

I've addressed your first 2 concerns.


Ideally, though, we want to create a helper function that performs those conditions centrally and just returns the result.

This could be useful... this is what core does in drupal_page_set_cache()

    if (variable_get('page_compression', TRUE) && function_exists('gzencode')) {
      // We do not store the data in case the zlib mode is deflate. This should
      // be rarely happening.
      if (zlib_get_coding_type() == 'deflate') {
        $cache_page = FALSE;
      }
      elseif (zlib_get_coding_type() == FALSE) {
        $cache->data = gzencode($cache->data, 9, FORCE_GZIP);
      }
      // The remaining case is 'gzip' which means the data is already
      // compressed and nothing left to do but to store it.
    }

What's your recommendation on this?


This means I can't enable bandwidth optimizations while enabling preprocessing optimizations? That's ugly UX.

How should it be done? Gzip does nothing unless it has aggregated files to play with. Gzip only works if Optimize is enabled, and that is how it's set up.
'#disabled' not '#enabled' ;)

sun’s picture

As far as I get it, #disabled _really_ only hinges on whether: (function_exists('gzencode') && zlib_get_coding_type() == FALSE) === TRUE

As far as the form is concerned, I'd just go with the above condition for #disabled. And let the UX team figure out how to handle dependent configuration settings like this - in a separate issue. (please)

However, I still don't get how *.css.gz or *.js.gz files are handled on the server side, based on the rewrite rules applied here. Perhaps I'm just dumb, but it seems like *.css is rewritten to *.css.gz, and Apache will expect those files to already exist, while I'm too blind to see a condition that checks whether they really exist?

+++ .htaccess	8 Aug 2009 04:25:43 -0000
@@ -89,6 +105,16 @@ DirectoryIndex index.php index.html inde
+  RewriteCond %{REQUEST_FILENAME}\.gz -s
...
+  RewriteCond %{REQUEST_FILENAME}\.gz -s

Or is it the -s option?

All we could do about those aforementioned compression conditions is to create a new drupal_server_compression_enabled() function. Not sure whether that ultimately makes sense, and most probably, this should be deferred to a separate issue... :-/
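For illustration, such a helper might look like the following (purely a hypothetical sketch; drupal_server_compression_enabled() does not exist in core, and the exact conditions are still up for debate):

```php
<?php

/**
 * Determines whether Drupal itself can safely gzip output.
 *
 * Hypothetical helper centralizing the checks currently inlined in
 * drupal_page_set_cache() and the proposed CSS/JS gzip code.
 */
function drupal_server_compression_enabled() {
  // zlib must provide gzencode(), and PHP must not already be compressing
  // the output stream itself (zlib.output_compression in deflate mode
  // cannot be stored and replayed from the cache).
  return function_exists('gzencode') && zlib_get_coding_type() == FALSE;
}
```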

mikeytown2’s picture

FileSize
5.1 KB

Good point with #disabled. That should probably be added to the page_compression checkbox as well.

  $form['page_cache']['page_compression'] = array(
    '#type' => 'radios',
    '#title' => t('Page compression'),
    '#default_value' => variable_get('page_compression', TRUE),
    '#options' => array(t('Disabled'), t('Enabled')),
    '#description' => t("By default, Drupal compresses the pages it caches in order to save bandwidth and improve download times. This option should be disabled when using a webserver that performs compression."),
  );

Apache: it's the -s option, which checks that the file exists and has a file size greater than 0 - which any valid gzipped file does, even if the original content is empty.
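A quick shell check confirms this (illustrative only):

```shell
# gzip output always includes a header and trailer of roughly 20 bytes, so
# even the gzip of a zero-byte CSS file has a size greater than zero and
# therefore matches Apache's "-s" test.
printf '' | gzip -9 > empty.css.gz
test -s empty.css.gz && echo "empty.css.gz is non-empty"
gzip -t empty.css.gz && echo "and it is a valid gzip stream"
rm -f empty.css.gz
```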

Also word wrapped the long comment inside the drupal_build_*_cache() functions.

kika’s picture

For binary choices I would go for simpler checkbox:

[ ] Enable page compression

Status: Needs review » Needs work

The last submitted patch failed testing.

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
5.13 KB

#459980: Rework admin/settings/performance went in, so I raised up some issues with it.

Here's the patch again. After encountering a misconfigured server on the Boost forum, I did some research: almost no one sends out x-gzip anymore, so I switched it to gzip, going against the Apache documentation (IE 5.5 sends out gzip, so I think it's stuff like Netscape 3 that uses x-gzip).

mikeytown2’s picture

Found a use case where having this in core would be nice. #554146: CSS_GZip doesn't work with Pixture and other color-based themes.

Status: Needs review » Needs work

The last submitted patch failed testing.

mikeytown2’s picture

Assigned: Unassigned » mikeytown2
Status: Needs work » Needs review
FileSize
5.13 KB

re-rolled

doq’s picture

Status: Needs review » Needs work

aggergated -> aggregated

doq’s picture

1. What about renaming variable "css_gzip_compression" to "css_compression" (the same for js).
2. And "Gzip compress JavaScript aggergated files" to "Compress JavaScript aggregated files" (the same for css)?

3. Looks incorrect (see the comment):

+    $form['bandwidth_optimization']['js_gzip_compression'] = array(
+      '#type' => 'checkbox',
+      '#title' => t('Gzip compress JavaScript aggergated files'),
+      '#default_value' => variable_get('css_gzip_compression', FALSE), //// Should be js_gzip_compression
Owen Barton’s picture

Status: Needs work » Needs review
FileSize
5.02 KB

Updated patch to move the various rules into a single block, and to ensure that they are all dependent on BOTH mod_rewrite and mod_mime being enabled (the previous patch broke if you had mod_rewrite enabled but mod_mime disabled, which, while unusual, is not impossible). I tried to improve the comments as well, to make some of the nuances mentioned in this thread clearer - I think it is certainly easier to see what is going on when the rules are in a single block. We could probably be even more detailed here if we wanted, but I don't want to make it unnecessarily verbose.

I left the settings in for now, but I think these are mostly unnecessary (I can't reproduce any problems even if I have AddOutputFilterByType DEFLATE in my .htaccess) and if we can add a check to see if the webserver (or a proxy) is already gzipping files, as suggested in #459980: Rework admin/settings/performance I think all these settings could be removed.

sinasalek’s picture

subscribing

mikeytown2’s picture

Assigned: mikeytown2 » Unassigned

+1 for Owen Barton's patch

Owen Barton’s picture

FileSize
3.15 KB

Looking up the reasoning behind the checkbox for page compression (http://drupal.org/node/121820 and a few others linked from that one), it seems pretty clear that all the problems that led to its insertion were related to interaction with PHP's zlib.output_compression setting causing double compression of pages. In other words, our JS and CSS gzipping should not be affected by this issue, since they do not use the output buffer. Hence, here is a patch with the GUI removed (I left the variable_get() in there so someone could disable this via the $conf array in settings.php if they really want to). If a committer has a strong opinion that this needs a GUI, then feel free to commit the previous patch, but my preference is "less is more" in this case.

doq’s picture

Status: Needs review » Needs work

doq’s picture

Status: Needs work » Needs review
andypost’s picture

Status: Needs review » Needs work

+1 to #87 - I see no need for the setting; anyone who runs into trouble with double compression can uncomment a line in settings.php.

So it would be better to add a commented-out line to default.settings.php.

andypost’s picture

Status: Needs work » Needs review
Issue tags: +Needs usability review

If the compression issue #43462: cache_set and cache_get base_url brokenosity gets committed, we can omit this setting altogether.

mattyoung’s picture

.

Status: Needs review » Needs work

The last submitted patch failed testing.

casey’s picture

Issue tags: +CSS aggregation

tagging (there are many topics about css aggregation)

Status: Needs work » Needs review
Issue tags: -Performance, -Needs usability review, -CSS aggregation

sinjab requested that failed test be re-tested.

sinjab requested that failed test be re-tested.

grendzy’s picture

Version: 7.x-dev » 8.x-dev
FileSize
634 bytes

Here's #36 in patch form for those that need a quick fix for 6/7.

andypost’s picture

Version: 8.x-dev » 7.x-dev

This is not an API change, so we can still hope to see #87 committed.

grendzy’s picture

Status: Needs review » Needs work

OK, in that case here's a review. +1 from me, this is RTBC except it needs a reroll due to a failed hunk in .htaccess.

I checked the response headers of the aggregated css and js files, and they were gzipped as expected. I also like the way the rewrite is written; it allows non-aggregated files to be compressed manually by creating the .gz file.

From a security perspective, I also verified that the rewriting doesn't allow access control on private files to be bypassed (for example, if there were a /system/files/super_secret.js.gz, requesting the file without the .gz results in a 404).

The patch also passes the coder tests.

Owen Barton’s picture

Status: Needs work » Needs review
Issue tags: -Needs usability review +frontend performance, +JS aggregation
FileSize
4.67 KB

Here is an updated patch - the htaccess rules are unchanged.

This includes two tweaks to the test for when we should save .gz files. The first is that it uses the updated "extension_loaded('zlib')" test for PHP gzip support (the same check we now use for page cache gzipping); the second is that we only save gzip versions if clean URLs are enabled. This is because on some hosting setups the new rewrite rules could 404 on CSS and JS if the RewriteBase needs to be set and isn't - this way we get the added safety of the clean URL tests, yet can still support this by default on the majority of sites.

I also included a block in default.settings.php that describes what is happening and allows the user to disable it if they choose. The only examples I could think of for disabling this would be if you have an external caching reverse proxy and want to offload gzipping to that box, or if you are using another webserver and want to save the (tiny) load of gzipping files that will never be served.
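The settings.php override could look something like this (a sketch; the exact wording of the comment block in the patch may differ):

```php
/**
 * CSS/JS aggregated file gzip compression:
 *
 * By default, when CSS/JS aggregation is enabled, Drupal also stores a gzip
 * compressed (.gz) copy of each aggregated file, which the .htaccess rules
 * then serve to gzip-capable clients. Uncomment the lines below to disable
 * this, e.g. when a reverse proxy in front of the site already compresses
 * these files.
 */
# $conf['css_gzip_compression'] = FALSE;
# $conf['js_gzip_compression'] = FALSE;
```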

mikeytown2’s picture

AddEncoding should probably be enclosed in <IfModule mod_mime.c> </IfModule>
http://httpd.apache.org/docs/trunk/mod/mod_mime.html#addencoding

Owen Barton’s picture

It is already :)

Owen Barton’s picture

On a different note, I just came across this, and I wonder if we need to tweak our headers slightly further:

Be aware of issues with proxy caching of JS and CSS files.
Some public proxies have bugs that do not detect the presence of the Content-Encoding response header. This can result in compressed versions being delivered to client browsers that cannot properly decompress the files. Since these files should always be gzipped by your server, to ensure that the client can correctly read the files, do either of the following:
* Set the Cache-Control header to private. This disables proxy caching altogether for these resources. If your application is multi-homed around the globe and relies less on proxy caches for user locality, this might be an appropriate setting.
* Set the Vary: Accept-Encoding response header. This instructs the proxies to cache two versions of the resource: one compressed, and one uncompressed. The correct version of the resource is delivered based on the client request header. This is a good choice for applications that are singly homed and depend on public proxies for user locality.
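With mod_headers available, the second option could be sketched as follows (hypothetical at this point in the thread, not a hunk from the patch):

```apache
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js)\.gz$">
    # Tell clients that the payload is gzip compressed.
    Header set Content-Encoding gzip
    # Make proxies cache compressed and uncompressed variants separately.
    Header append Vary Accept-Encoding
  </FilesMatch>
</IfModule>
```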

sun’s picture

Status: Needs review » Needs work

I agree that we have to send proper Vary headers.

+++ .htaccess	9 Apr 2010 19:41:45 -0000
@@ -109,6 +109,34 @@ DirectoryIndex index.php index.html inde
+    # Serve gzip compressed CSS files as 'text/css' (for newer Apache).
...
+    # Serve gzip compressed JS files as 'text/javascript' (for newer Apache).

Will "for newer Apache" hold true in 2 years from now?

+++ includes/common.inc	9 Apr 2010 19:41:45 -0000
@@ -3135,6 +3135,15 @@ function drupal_build_css_cache($css, $f
+    // that rewrite rules are working) and he zlib extension is available then

@@ -4187,6 +4196,15 @@ function drupal_build_js_cache($files, $
+    // that rewrite rules are working) and he zlib extension is available then

*the

114 critical left. Go review some!

mikeytown2’s picture

@sun
Setting the content type using T= was modified to be unusable in Apache trunk for some reason http://httpd.apache.org/docs/trunk/rewrite/flags.html#flag_t. The FilesMatch ForceType was pushed into core for Apache 2 http://httpd.apache.org/docs/2.0/mod/core.html#forcetype.
Thus, since some hosts out there (GoDaddy) still use Apache 1.3, I use both settings in order to make the rules work everywhere.
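To illustrate the two mechanisms side by side (simplified sketches, not the exact patch hunks):

```apache
# Apache 1.3 style: set the content type via mod_rewrite's T= flag on the
# rewrite itself.
RewriteRule ^(.*)\.css $1\.css\.gz [L,QSA,T=text/css]

# Apache 2.x style: force the type on the rewritten .gz files instead.
<FilesMatch "\.css\.gz$">
  ForceType text/css
</FilesMatch>
```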

Someone want to post the exact headers we need to send? I'll convert it into apache rules.

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
5.04 KB
mikeytown2’s picture

Better htaccess rules IMHO

According to http://code.google.com/speed/page-speed/docs/caching.html#LeverageProxyC... it can be Vary or private, so it uses Vary if mod_headers is installed and falls back to the gzipped version being cached for 1 second (in short, effectively not cached).

Edit: Looks like A0 is valid, will use that after some feedback.

Gerhard Killesreiter’s picture

There's a typo (twice) in the help text:

"and he zlib extension"

mikeytown2’s picture

Status: Needs review » Needs work

The last submitted patch, drupal-101227-109-css-js-gzip.patch, failed testing.

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
5.05 KB
grendzy’s picture

A quick benchmark comparison with mod_deflate shows this method to be roughly 3X faster.

On the downside, I couldn't get this patch to actually run (I had to manually create the .gz file).

I got the following notices: (odd though that the test bot didn't report any)

Notice: Undefined variable: data in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
Notice: Undefined variable: uri in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
The specified file temporary://fileO4ppC3 could not be copied, because the destination .gz is invalid. This is often caused by improper use of file_unmanaged_copy() or a missing stream wrapper.
Notice: Undefined variable: data in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
Notice: Undefined variable: uri in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
The specified file temporary://fileLsLaSC could not be copied, because the destination .gz is invalid. This is often caused by improper use of file_unmanaged_copy() or a missing stream wrapper.
Notice: Undefined variable: data in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
Notice: Undefined variable: uri in drupal_pre_render_styles() (line 3121 of /Users/djtack/Sites/7-drupal/drupal/includes/common.inc).
The specified file temporary://filexKeVWw could not be copied, because the destination .gz is invalid. This is often caused by improper use of file_unmanaged_copy() or a missing stream wrapper.
mikeytown2’s picture

Status: Needs review » Needs work

Looks like css processing was completely redone recently.

mikeytown2’s picture

Status: Needs work » Needs review
FileSize
6.44 KB

Looks like some whitespace issues got in recently... anyway, I moved the logic to the correct location:
http://api.drupal.org/api/function/drupal_build_css_cache/7
http://api.drupal.org/api/function/drupal_build_js_cache/7

I have tested this locally and verified that the code works

Owen Barton’s picture

I have performed some detailed benchmarking of this patch to show the improvement for site users.

Some details:

  • All of these numbers are for an initial page load (subsequent page loads would use cached resources much of the time).
  • The site has most core modules (and RTL) enabled, as well as some sample generated content/images to give a more realistic site. Most production sites would be significantly heavier than this however (CSS/JS weighing in at 2 or 3 hundred KB is not unheard of) and so actual savings would likely be larger. Page loads were anonymous visitors with page caching/gzipping and CSS/JS aggregation enabled.
  • The first batch of measurements was taken over a LAN connection (to prevent network effects) with artificial throttling and latency introduced using Charles proxy at the default setting of a moderate DSL connection (with 100% bandwidth available - so this would be similar to higher bandwidth but more contested connections). Tests were done with both Firefox and Chrome and averaged over multiple loads (10 for Firefox, 5 for Chrome) - 3 sets of waterfall screenshots of both are attached.
  • The second batch of measurements uses the Webpagetest.org site, with a nearby server-server connection, using the default throttle/latency setting that simulates a fast DSL connection.

Here are the results:

Charles web proxy results (256 KB/s / 256 KB/s, 40 ms latency)

Time
                           Firefox              Chrome
                           Before     After     Before     After
Page load                  4.6 sec    1.9 sec   4.9 sec    2.1 sec
Seconds saved                         2.7 sec              2.8 sec
% faster                              58.9%                56.7%

Data
                           JS         CSS        JS+CSS
Total downloaded before    88.7 KB    39.3 KB    128.0 KB
Total downloaded after     30.0 KB    9.7 KB     39.7 KB
After as % of before       33.8%      24.6%      31.0%

Webpagetest.org results (1500 Kbps / 384 Kbps, 50 ms latency, IE7)

         Stats                                           Time      Total data
Before   http://www.webpagetest.org/result/100701_SFM/   1.6 sec   154.0 KB
After    http://www.webpagetest.org/result/100701_SFX/   1.3 sec   65.0 KB
% Improvement                                            22.67%    57.8%

In summary - for the page measured with typical connection speeds, this patch gives a ~20-60% improvement in page load performance for initial page loads (or other page loads where you download an aggregate), saving somewhere from ~0.3-3 seconds depending on your connection speed. It results in a 69% reduction in bandwidth utilization for CSS/JS, and an 88% reduction in total page bandwidth utilization.

I have also attached an updated patch with some very minor tweaks to comment punctuation/phrasing.

sinasalek’s picture

That's really great - the size is reduced by more than half. Thanks, Owen, for the benchmarks.

mikeytown2’s picture

can we call this RTBC?

Owen Barton’s picture

@mikeytown2 - I think we are close. Could you take a look at this one?

I have tried to streamline the rules at bit, which should make it simpler and more robust (and a tad faster). As far as I can tell from my tests, functionality is unchanged. It is now using mod_rewrite for both the rewrites and setting the content type. I found this worked fine as long as the content-type is applied as a separate rule. The issue with T= in Apache trunk appears to have been fixed, AFAICT. I also switched to using mod_headers to add the Content-Encoding: gzip header.

Between these two changes, we can drop the requirement for mod_mime (not that this was an issue, but it does simplify the rules a fair bit). Also, I think we need to wrap the entire block in <IfModule mod_headers.c> - the way the previous patch disabled all css/js caching seems much worse to me than getting non-gzip content but being allowed to cache. Finally, I added a snippet to INSTALL.txt to note that mod_headers needs to be enabled for this functionality to work - we should add a link to a supplemental documentation page once this goes in.

I have tested this on Ubuntu 10.04 and an older Redhat based server, as well as with the following versions of Apache:
2.0.63
2.2.4
2.2.14
2.3.6 (released 2010-6-18)

mikeytown2’s picture

Relying on mod_headers does allow one to make the rules a lot simpler. I think this is a good trade-off; the reason I call it a trade-off is that not every server has mod_headers enabled. I know this from my experience with the Boost module.

Nice trick with this

RewriteRule \.css\.gz$ - [T=text/css]
RewriteRule \.js\.gz$ - [T=text/javascript]

Have you tested this with Apache 1.3.x, or at a minimum on GoDaddy shared hosting? Last time I checked, they were still using Apache 1.3.x, and GoDaddy has a huge market share in terms of web hosting.

New rules look good from my point of view.

Owen Barton’s picture

In terms of mod_headers, I agree that using it may reduce the reach of this patch somewhat. However, given that mod_headers is pretty much the only way (AFAICT) to inject the Vary: Accept-Encoding header, it seems like we don't have a lot of choice, so we might as well use it for the Content-Encoding header also. I did have a quick look at using mod_negotiation instead of these rules; however, I think we would need to save two copies of the uncompressed file with content-type extensions (i.e. .js.js), and it was unclear whether mod_negotiation would produce correct output (including the Vary header).

I didn't test this on Apache 1.3, as it wasn't compiling properly on my Ubuntu box. Apache 1.3 does claim to support the T= flag, so in theory it should work. I don't have a GoDaddy account, but it does sound like a good place to test.

alanburke’s picture

Patch tested on Ubuntu 8.04, with Apache2.
Didn't have mod headers enabled initially, and it just serves the CSS file normally.

The patch worked fine.
I did encounter one issue, but it isn't a problem with the patch.
I had previously configured Apache to serve js and css compressed using.

#   GZip Js files
AddOutputFilterByType DEFLATE text/css application/x-javascript

After applying the patch, the pages appeared unstyled and js was broken,
as the css and js files had been compressed twice.
Once I worked out what was happening and disabled the rule in the apache conf file, all was good.

Owen Barton’s picture

So I figured out a neat fix for mod_deflate double gzipping content - we can add "E=no-gzip:1" to the content type rewrite rules. mod_deflate (if present & enabled) picks up the no-gzip environment variable and stops gzipping this request. The E= flag is present in Apache 1, and has the benefit that we don't need to depend on and check for mod_env/envif.

Also, while the major benefit of this patch is clearly for end users (see #115) I thought it would be fun to give it a quick server side benchmark too:
Without patch, no mod_deflate: 102007 bytes/req, 1232.03 req/sec, 0.812ms mean
Without patch, with mod_deflate: 33742 bytes/req, 110.56 req/sec, 9.045ms mean
With patch, no mod_deflate: 33675 bytes/req, 1371.40 req/sec, 0.729ms mean

As you can see, mod_deflate enabled creates over 10x more work than mod_deflate disabled (either with or without the patch). With the patch is a tad less server work than without, but it was not that significant in these tests. However we should bear in mind that these tests were over localhost loopback, and in reality there would probably be some scaling benefits in Apache being able to hand off the smaller gzip responses faster at client connection speeds, and hence sustain a higher request rate.

Finally, I managed to install Apache 1.3 with some old Debian packages, and can confirm that it works correctly with mod_headers both enabled (gzip content, headers are correct) and disabled (serves non-gzip content).

mikeytown2’s picture

Nice usage of the E flag :) Turning off mod_deflate after sending gzipped content is smart. I never would have thought of doing this, since the documentation doesn't exactly spell out the no-gzip variable.
Edit: found it http://httpd.apache.org/docs/trunk/env.html#special

Looks good to me

Owen Barton’s picture

By the way the function of the no-gzip variable is also described directly on the mod_deflate page you linked to.

Owen Barton’s picture

Status: Needs review » Reviewed & tested by the community

From e-mail with mikeytown2, we both think this is RTBC.

sun’s picture

I agree.

+++ includes/common.inc	8 Jul 2010 05:43:46 -0000
@@ -4396,9 +4405,18 @@ function drupal_build_js_cache($files) {
     variable_set('drupal_js_cache_files', $map);

+++ sites/default/default.settings.php	8 Jul 2010 05:43:46 -0000
@@ -371,6 +371,21 @@ ini_set('session.cookie_lifetime', 20000
+# $conf['css_gzip_compression'] = FALSE;
+# $conf['js_gzip_compression'] = FALSE;

Much better variable names than previous patches mistakenly introduced! (we still need to fix those)

Powered by Dreditor.

mdupont’s picture

Subscribing. Would be nice to have in D7!

Dries’s picture

This looks good to me too. Could use some more eyes.

Damien Tournoud’s picture

I'm confused as why:

    # Serve correct content types, and prevent mod_deflate double gzip.
    RewriteRule \.css\.gz$ - [T=text/css,E=no-gzip:1]
    RewriteRule \.js\.gz$ - [T=text/javascript,E=no-gzip:1]

... actually works when:

    RewriteRule ^(.*)\.css $1\.css\.gz [L,QSA]

    RewriteRule ^(.*)\.js $1\.js\.gz [L,QSA]

... are supposed to be the last rewrite rules to be called (L).

But I tested this patch and the behavior is correct. Ready to go in for me.

sun’s picture

Status: Reviewed & tested by the community » Needs work

Am I right in assuming that the earlier rules target files that already exist, and that the rules following them set headers for files that do not yet exist?

Either way, seems like we need better comments in .htaccess then.

Owen Barton’s picture

Status: Needs work » Needs review
FileSize
6 KB

The "L" flag does not work how you imagine (I had the same impression until I wrote this patch!) - it only prevents the same specific rule applying again on internal sub-requests, not subsequent rules that match different things (that may still apply): http://www.colder.ch/news/01-26-2007/24/truth-about-the-last-mod_.html

I added a comment to this effect: " # This matches the internal sub-request, triggered by the above 2 rewrites."

sun’s picture

+++ .htaccess	22 Sep 2010 16:19:39 -0000
@@ -109,6 +109,32 @@ DirectoryIndex index.php index.html inde
+  <IfModule mod_headers.c>
+    # Rules to correctly serve gzip compressed CSS and JS files.
+    # Requires both mod_rewrite and mod_headers to be enabled.
+    # Serve gzip compressed CSS files if they exist and the client accepts gzip.

Minor: Could we move the first two lines above the IfModule? They seem to be the heading for the entire block.

Powered by Dreditor.

Owen Barton’s picture

As the above link suggests, we should be able to drop the "L" rule completely (and hence the comment), which gives Apache a smidge less work to do. I haven't tested this on all the various environments and Apache versions I did with the last patch (see #118 and #112), but it does work identically on my Ubuntu 10.04 box, and I am pretty sure it should behave the same elsewhere (the first rule transforms the path, so it should never apply again).

Owen Barton’s picture

This one keeps the L flag dropped, and incorporates sun's suggestion from #132.
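Putting it together, the resulting .htaccess block should now look roughly like this (reconstructed from the hunks and comments quoted in this thread, so details may differ from the attached patch):

```apache
<IfModule mod_headers.c>
  # Serve gzip compressed CSS files if they exist and the client accepts gzip.
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule ^(.*)\.css $1\.css\.gz [QSA]

  # Serve gzip compressed JS files if they exist and the client accepts gzip.
  RewriteCond %{HTTP:Accept-encoding} gzip
  RewriteCond %{REQUEST_FILENAME}\.gz -s
  RewriteRule ^(.*)\.js $1\.js\.gz [QSA]

  # Serve correct content types, and prevent mod_deflate double gzip.
  RewriteRule \.css\.gz$ - [T=text/css,E=no-gzip:1]
  RewriteRule \.js\.gz$ - [T=text/javascript,E=no-gzip:1]

  <FilesMatch "(\.js\.gz|\.css\.gz)$">
    # Serve correct encoding type.
    Header set Content-Encoding gzip
    # Force proxies to cache gzipped & non-gzipped css/js files separately.
    Header append Vary Accept-Encoding
  </FilesMatch>
</IfModule>
```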

sun’s picture

Status: Needs review » Reviewed & tested by the community

Thanks. Dropping L also makes sense to me.

Damien Tournoud’s picture

Just to clarify, any reason why the Multiviews feature cannot work in that case?

Owen Barton’s picture

It is possible to do similar things with Multiviews, and I did experiment with it quite a bit when playing with earlier patches. My conclusion was that it is kind of a blunt instrument - it was hard to get it actually doing something in the first place, as well as to apply to just the subset of requests we need (we don't want it messing with uploaded files that happen to end in .gz or .fr or whatever, and also it carries a performance hit for large directories) as well as getting the headers and mod_deflate interaction right. Using type-map is another alternative, but there didn't seem to be an elegant way of managing the map files. Using a php wrapper is another alternative (the one I first proposed 3.5 years ago!), which works fine functionally, but is slower than an Apache based solution. In the end, I think plain old rewrite does the job best - it is functional, fast and flexible.

sun’s picture

sun’s picture

Version: 7.x-dev » 8.x-dev

Although badly needed, this is D8 material according to the rules (as I had to learn today). It may be backported at a later point in time (though that's unlikely).

mikeytown2’s picture

Another module I need to port to 7 now. I would really like it if core handled this; this patch is the more efficient and reliable solution.

geerlingguy’s picture

Version: 8.x-dev » 7.x-dev

Umm... patch looks good to me. I can't see why it can't get into 7.x, especially since it will be such an awesome tool for lower-end/shared hosting users.

XiaN Vizjereij’s picture

Patch worked fine for me too.

And I don't see a reason to delay it until 8.x either. Most core devs approved, the patch even got some polished documentation in the files, and the results are outstanding (at least on my shared-host environment).

+1 for Drupal 7.0 beta 1!

Owen Barton’s picture

@Dries - we have had 4 reviews/OKs on the patch since your last comment - is there anything else you are looking for here? I think this patch gives a significant speed-up, and reducing the download size nicely complements the new aggregation info/grouping strategy (which it should be noted can increase initial visit download sizes somewhat, with the benefit that it allows browser caching to work correctly).

XiaN Vizjereij’s picture

To complement that: after applying this patch to my Drupal 7 test site, my co-devs commented in IRC that the site feels faster on the initial load (and I didn't say a word about the patch!). So thanks again to all the patch suppliers; I hope to see it in D7 stable.

Dries’s picture

Status: Reviewed & tested by the community » Fixed

OK, committed to CVS HEAD. Thanks.

Status: Fixed » Closed (fixed)

Automatically closed -- issue fixed for 2 weeks with no activity.

catch’s picture

There are a number of server-side performance issues that this introduced which didn't get much discussion here, so I've opened #1040534: Rewrite rules for gzipped CSS and JavaScript aggregates cause lots of lstats for files that will never exist to work on those.