CSS Compression
Posted: Sat Jun 23, 2007 3:49 pm
by Luke
I'm trying to reduce the overhead on some of my sites by reducing the size of my CSS files and the number of them called on each request. I'd also like to attempt some gzip compression on them. What I have done so far to shrink the CSS file is just replace any run of whitespace characters with a single space... so something like this:
Code: Select all
<?php
/* ... snip ... */
// collapse any run of whitespace into a single space
$newcss = preg_replace('/\s+/', ' ', $css);
/* ... snip ... */
?>
I use significantly more whitespace than most people, so this actually does slim down my css file a little, but I was just wondering what else I can do if anything.
Posted: Sat Jun 23, 2007 3:54 pm
by John Cartwright
Are you using external css files? These are already cached by the browser, just so you know.
Posted: Sat Jun 23, 2007 4:11 pm
by superdezign
Jcart wrote:Are you using external css files? These are already cached by the browser, just so you know.
Yeah, that's why they're so great. If you change a CSS file and upload it, and continue browsing the site, you shouldn't see any changes. When you refresh the page, that's when the changes that you defined will occur.
CSS files are here to help the speed, not to hinder it. If anything, I'd start with the optimization of images.
Posted: Sat Jun 23, 2007 4:18 pm
by Luke
I guess I should explain the whole story. I was inspired by an article I read at A List Apart called "Frameworks for Designers", and I decided to put together a framework of my own. As I started doing so, I realized that I could put together 3 or 4 of my most common layouts that all use the same basic html template. So, I did just that and put them all in their own files, along with typography, browser default resets, forms, and a handful of others. Then I have a base CSS file that loads all of the aspects I'll need:
Code: Select all
/***** Base CSS Style ******/
/* Inspired by A List Apart*/
@import "common/screen.reset.css";
@import "common/screen.type.css";
@import "common/screen.form.css";
@import "common/screen.two-column-layout.css";
/**
* Put common colors here
* ------------------------
* Black : #000000
*
* Beige colors
* ------------------------
* BG Light : #F3E7BD
*
* Browns
* ------------------------
* Main brown : #392E24
*
*/
/*** Put customized styling below ***/
So I was thinking (since I'm learning python) that I'd write a simple little python application to build me a website framework based on whatever aspects I told it I need. I figured while I'm at it, why not grab all the css files I need and put them into one compressed css file when it generates the template?
Code: Select all
/***** Base CSS Style ******/
/* Inspired by A List Apart*/
@import "common/generated-compressed-framework.php"; /* gzipped css */
/**
* Put common colors here
* ------------------------
* Black : #000000
*
* Beige colors
* ------------------------
* BG Light : #F3E7BD
*
* Browns
* ------------------------
* Main brown : #392E24
*
*/
/*** Put customized styling below ***/
The python script would also generate my directory structure since it rarely changes from site to site.
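The flattening step Luke describes could be sketched in Python like this (a rough illustration, not his actual script — the function name and the exact minification rules are assumptions):

```python
import re

def combine_css(paths):
    """Read each CSS file, strip comments and extra whitespace,
    and return one flattened stylesheet (a sketch of the idea above)."""
    combined = []
    for path in paths:
        with open(path, encoding="utf-8") as f:
            css = f.read()
        # Remove comments non-greedily so each /* ... */ is matched alone.
        css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)
        # Collapse any run of whitespace into a single space.
        css = re.sub(r"\s+", " ", css)
        combined.append(css.strip())
    return "\n".join(combined)
```

A real version would also need to follow nested @import lines, but for a fixed list of framework files a flat list is enough.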
Posted: Sat Jun 23, 2007 4:24 pm
by superdezign
I skimmed that article before nearly falling asleep in my chair.
It's interesting to put them together (I didn't know that was what you meant), but wouldn't that cause the browser to reload them at every page load? Or are you putting all of the CSS files into one?
Posted: Sat Jun 23, 2007 4:30 pm
by superdezign
http://alistapart.com/articles/frameworksfordesigners wrote:On large, high-traffic sites, adding five more HTTP connections to every page view may result in angry system administrators. Two possible solutions to this are:
[...]
2. Have a server-side process that dynamically flattens the individual files into a single response. I’ve not seen this done, but it could be very efficient if done well.
I see.
Posted: Sat Jun 23, 2007 4:33 pm
by Luke
I'm having a difficult time with removing comments. My regex is catching anything from the first instance of /* to the very last instance of */ and I know why (because it's greedy), but how do I fix it?
The greedy one:
My attempt at making it not greedy:
Suggestions? All the asterisks and slashes are making my head hurt.
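For illustration, the greedy vs. non-greedy difference plays out like this (sketched with Python's re module, since the actual patterns aren't shown in the post):

```python
import re

css = "a { color: red; } /* one */ b { color: blue; } /* two */"

# Greedy: .* runs all the way to the LAST */ and swallows the rule
# sitting between the two comments along with them.
greedy = re.sub(r"/\*.*\*/", "", css)
# greedy is now "a { color: red; } "

# Non-greedy: .*? stops at the FIRST */ so each comment is matched alone.
lazy = re.sub(r"/\*.*?\*/", "", css)
# lazy is now "a { color: red; }  b { color: blue; } "
```

For real stylesheets you'd also want the dot to match newlines (re.DOTALL in Python, the /s modifier in PHP/Perl) so multi-line comments are caught.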

Posted: Sat Jun 23, 2007 4:51 pm
by superdezign
Posted: Sat Jun 23, 2007 7:45 pm
by Arawn
That's the Perl expression I use to strip comments. I'm using commas as regular-expression delimiters here; they help ease the headache.

Before I get there I replace all the css hacks and filters that look like comments with tokens using evaluated regular expressions, like:
Code: Select all
$css =~ s,/\*/\*//\*/,_put_token_value($&),sige; # Fabrice's Inversion
After the comments and whitespace are stripped, the hacks and filters are restored using another evaluated regular expression.
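That tokenize-strip-restore idea might look something like this in Python (the token format is made up for illustration; the hack pattern is the one from the Perl snippet above):

```python
import re

# The comment-looking hack from the Perl snippet ("Fabrice's Inversion").
HACK_RE = re.compile(re.escape("/*/*//*/"))
COMMENT_RE = re.compile(r"/\*.*?\*/", re.DOTALL)

def strip_comments_preserving_hacks(css):
    """Swap comment-like hacks for tokens, strip real comments, restore."""
    tokens = {}
    def stash(match):
        # A placeholder the comment regex can't match (token format is made up).
        token = "\x00TOKEN%d\x00" % len(tokens)
        tokens[token] = match.group(0)
        return token
    css = HACK_RE.sub(stash, css)
    css = COMMENT_RE.sub("", css)           # only real comments are left to strip
    for token, original in tokens.items():
        css = css.replace(token, original)  # put the hacks back
    return css
```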
Posted: Sat Jun 23, 2007 8:51 pm
by superdezign
To preserve hacks? You know, you could always just not use them. There's a "non-hack" way around pretty much everything in CSS.
Posted: Sat Jun 23, 2007 10:02 pm
by Arawn
superdezign wrote:To preserve hacks? You know, you could always just not use them. There's a "non-hack" way around pretty much everything in CSS.
You're assuming I wrote the CSS, which is not always the case.
Posted: Sun Jun 24, 2007 2:04 am
by Ollie Saunders
I read that article on A List Apart too. I had sort of started doing a CSS framework before that, but it inspired me to do it properly. I implemented the flattening technique; actually, I found it made it harder to see which lines were coming from which files, so I had to add heading comments, using lots of asterisks, for the start of each file.
Posted: Sun Jun 24, 2007 7:46 am
by The Phoenix
You asked for other things you can do:
1. Minimize the selectors. By being conservative in what you style, you reduce the number of selectors, and that reduces the overall size of the file.
2. GZIP compression.
http://paulstamatiou.com/2007/03/18/how ... even-more/ or
http://www.fiftyfoureleven.com/weblog/w ... zip-method
3. Better use of html. If you force yourself to use strict html, you will likely reduce your html size (compliant html is often smaller).
4. Image reduction. Use html background colors to replace large images of single colors, use lossy compression where it's not noticeable, use pngcrush where it is, and then reduce the number of images overall by using sliding doors:
http://alistapart.com/articles/slidingdoors/
5. Try a css compressor to get hints on how you could compress it further:
http://www.cssdrive.com/index.php/main/csscompressor/
Those should seriously put you far ahead of the curve in terms of optimization.
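On point 2, the effect of gzip is easy to demonstrate in a few lines of Python (the sample stylesheet below is made up; on a live server you'd typically let Apache's mod_deflate or PHP's ob_gzhandler do the compressing):

```python
import gzip

# A made-up, repetitive stylesheet -- the worst case for size, the best for gzip.
css = ("body { margin: 0; padding: 0; color: #392E24; }\n"
       "h1, h2, h3 { color: #392E24; background: #F3E7BD; }\n") * 50

compressed = gzip.compress(css.encode("utf-8"))
print(len(css.encode("utf-8")), "->", len(compressed), "bytes")

# The browser undoes this transparently when the response is sent with
# a "Content-Encoding: gzip" header.
assert gzip.decompress(compressed).decode("utf-8") == css
```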
Posted: Sun Jun 24, 2007 10:30 am
by Oren
The Ninja Space Goat wrote:I'm having a difficult time with removing comments. My regex is catching anything from the first instance of /* to the very last instance of */ and I know why (because it's greedy), but how do I fix it?
The greedy one:
My attempt at making it not greedy:
Suggestions? All the asterisks and slashes are making my head hurt.

Hmm... Wouldn't that be way more expensive than the HTTP requests?
I suggest that you write your framework plus a program that takes all the files you need for a site and combines them into one file. Then you take this file, save it, and use it on that specific site. Doing all this processing/parsing over and over again is overkill.

Posted: Sun Jun 24, 2007 12:19 pm
by The Phoenix
Oren wrote:
Hmm... Wouldn't that be way more expensive than the HTTP requests?
I suggest that you write your framework plus a program that takes all the files you need for a site and combines them into one file. Then you take this file, save it, and use it on that specific site. Doing all this processing/parsing over and over again is overkill.

I don't disagree with your conclusion and suggestion, but I want to respond to the concept that led you to it.
There are two bottlenecks - network, and server.
On the server side is the processing of the scripts, the page, serving the pages (css & html), and so forth. The server side you have LOTS of control over. You can scale by using better hardware (memory, cpu(s), better hard drives, etc). You can scale by using more hardware (load balancing, round-robin, distributed SQL server, clustering, etc). You can even scale by making better use of the hardware (custom software for http (thttpd), accelerators, compilers, JIT, etc).
Server side is where the processing would be done if you real-time combined several files into one. You have the ability to scale your power to meet that increased processing if you choose to.
On the network side, you don't have much control. You can add more bandwidth, and decrease your latency to major destinations. But even if you buy an OC-12 at four data centers in the US to ensure you are 'close' to your US users, you have no control over their horrible dialup or wireless access. That access can slow a connection to a one-hop-away OC3 to a crawl - and you can't do anything about it.
Worse, in many cases, requests are done sequentially! Unless you have HTTP/1.1 and request pipelining enabled, each bit of latency (let's say 50ms per hop) is multiplied by the number of elements on the page. The net result can be a HUGE hit in responsiveness.
That's why there is (usually) a worthwhile trade-off in "moving" performance hits from the network side to the server side. While it seems to just be 'moving the pain around', it moves the pain to a place where you can do something about it. It also enables very fine-grained control over your CSS styles.
That said, I agree with your conclusion. It's not worth the impact to maintain 3-5 separate CSS files in production. Sure, develop that way, but then merge them all into one file for production. The net result is less processing and fewer HTTP requests.