
digitalmars.D - css minification

reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
I just added 
https://github.com/D-Programming-Language/dlang.org/pull/770, which 
generates minified css files. This is because in the near future css 
files will become heftier (more documentation comments, more detailed 
styles etc).

The disadvantage is that now one needs to be online to generate 
documentation. Thoughts?


Andrei
Jan 16 2015
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 17:40:40 UTC, Andrei Alexandrescu 
wrote:
 I just added 
 https://github.com/D-Programming-Language/dlang.org/pull/770, 
 which generates minified css files. This is because in the near 
 future css files will become heftier (more documentation 
 comments, more detailed styles etc).

 The disadvantage is that now one needs to be online to generate 
 documentation. Thoughts?
I would advise against this. If added, it should be opt-in. I see the following issues:

- The service in question might be occasionally down, or might be shut down completely at some point. This makes it an additional point of failure.

- High website load or poor Internet speeds will increase the time needed to build the website.

- We can't know for sure that at some point the owners of cssminifier.com won't decide to inject evil code in the output. Even if you trust the owners, the service might get hacked with the same outcome.

- As the request goes over HTTP, you also need to trust every peer between your machine and the website not to perform a MITM attack.

I think relying on a random 3rd-party service is acceptable for an amateur or hobbyist website, which we are not. Instead, I suggest adding an optional (opt-in) rule which invokes an offline CSS minifier.
Jan 16 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 17:40:40 UTC, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
I would advise against this. If added, it should be opt-in.
Thanks for the feedback.
 I see the following issues:

 - The service in question might be occasionally down, or might be shut
 down completely at some point. This makes it an additional point of
 failure.
To counter that I was thinking of:

    curl -X POST -s --data-urlencode 'input@$<' http://cssminifier.com/raw >$@ || cp $< $@
This is nice modulo possible delays due to slow timeouts. I should also add that we already connect online to fetch LATEST from github. Incidentally today that takes forever :o).
 - High website load or poor Internet speeds will increase the time
 needed to build the website.
Actually that's not the case in my experience. Did you build the site recently, i.e. with dub in tow? Build time is dominated by running dub, and it's impossible to run it on only the modified portions of the site. make -j on my laptop takes one full minute without doing anything online. A few curls running in parallel with that have plenty of time to finish.

Then dub generates 286 MB worth of stuff, which takes a long time to rsync. (For perspective: the pre-dub site, images and all, was 19 MB.) I'm not sure whether rsync gets further confused by the necessity of wiping all files and generating them again, whether they're identical or not.

I confess I'm unhappy with dub. I was hoping for a turnkey solution that also opens up new possibilities. Sadly it adds its own problems, and those it solves I know how to do more simply with ddoc and tooling around it. As things are, I found myself with another project in my lap that I need to babysit.
 - We can't know for sure that at some point the owners of
 cssminifier.com won't decide to inject evil code in the output. Even if
 you trust the owners, the service might get hacked with the same outcome.
Good point. We could, of course, store the converted css and inspect it before uploading, but seems a bit like meteor insurance to me.
 - As the request goes over HTTP, you also need to trust every peer
 between your machine and the website to not perform a MITM attack.

 I think relying on a random 3rd-party service is acceptable for an
 amateur or hobbyist website, which we are not. Instead, I suggest adding
 an optional (opt-in) rule which invokes an offline CSS minifier.
That's a rather dim view of online services :o). What would be a trustworthy offline CSS minifier?

Overall I'm unconvinced by your arguments and I think we should move forward with https://github.com/D-Programming-Language/dlang.org/pull/770 or a variation thereof.

Andrei
Jan 16 2015
next sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, Jan 16, 2015 at 10:16:55AM -0800, Andrei Alexandrescu via Digitalmars-d
wrote:
 On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
On Friday, 16 January 2015 at 17:40:40 UTC, Andrei Alexandrescu wrote:
I just added
https://github.com/D-Programming-Language/dlang.org/pull/770, which
generates minified css files. This is because in the near future css
files will become heftier (more documentation comments, more
detailed styles etc).

The disadvantage is that now one needs to be online to generate
documentation. Thoughts?
I would advise against this. If added, it should be opt-in.
[...] I also advise against this. I do run documentation builds a lot when working on PRs -- to verify the generated docs are satisfactory, for example. Sometimes I work on PRs when I don't have a network connection. It would really, *really* suck if I couldn't do any useful work without a stable connection. (Not to mention, sometimes I have to work behind firewalls, which can make otherwise reliable internet hosts unreliable.)

Please make this opt-in. Otherwise I will really lose a lot of motivation to work on docs. It's already bad enough that the dlang.org repo depends on all sorts of external tools, like kindle, latex, etc., most of which I don't (immediately) care about. Fortunately, the 'html' target of the makefile allows me to successfully generate the HTML docs without installing all sorts of software that I otherwise never use.

If you feel absolutely compelled to use an online minifier, can you at least make it possible to build the docs *without* it? And I mean a dedicated build target that bypasses that step completely; half-hearted hacks like waiting for the connection to time out are not a viable option, because they would just slow things down to the point where I would probably give up and throw away any doc contributions.

T -- Nearly all men can stand adversity, but if you want to test a man's character, give him power. -- Abraham Lincoln
Jan 16 2015
prev sibling next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 18:16:55 UTC, Andrei Alexandrescu 
wrote:
 What would be a trustworthy offline CSS minifier?
http://yui.github.io/yuicompressor/

Its only dependency is Java. Usage:

    java -jar yuicompressor-*.jar --type css < input.css > output.css
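For the docs build, such a minifier could stay opt-in by falling back to a plain copy. A sketch in shell -- the `MINIFY` flag name and the jar location are assumptions for illustration, not the actual dlang.org makefile:

```shell
# Opt-in minification step: filters css on stdin -> stdout.
# Without MINIFY=1 (or without java installed) the css passes
# through untouched, so offline builds keep working.
minify_css() {
    if [ "${MINIFY:-0}" = 1 ] && command -v java >/dev/null 2>&1; then
        java -jar yuicompressor-*.jar --type css
    else
        cat
    fi
}
```

Usage from a make rule would then be something like `MINIFY=1 minify_css < style.css > style.min.css`.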
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 12:27 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 18:16:55 UTC, Andrei Alexandrescu wrote:
 What would be a trustworthy offline CSS minifier?
http://yui.github.io/yuicompressor/ Its only dependency is Java. Usage: java -jar yuicompressor-*.jar --type css < input.css > output.css
$ java
No Java runtime present, requesting install.
$ _

Andrei
Jan 16 2015
parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 20:51:34 UTC, Andrei Alexandrescu 
wrote:
 On 1/16/15 12:27 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 18:16:55 UTC, Andrei 
 Alexandrescu wrote:
 What would be a trustworthy offline CSS minifier?
http://yui.github.io/yuicompressor/ Its only dependency is Java. Usage: java -jar yuicompressor-*.jar --type css < input.css > output.css
$ java No Java runtime present, requesting install. $ _
No idea what that means. Google says it's a Mac problem. I am neither a Java nor Mac person, sorry.
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 12:56 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 20:51:34 UTC, Andrei Alexandrescu wrote:
 On 1/16/15 12:27 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 18:16:55 UTC, Andrei Alexandrescu wrote:
 What would be a trustworthy offline CSS minifier?
http://yui.github.io/yuicompressor/ Its only dependency is Java. Usage: java -jar yuicompressor-*.jar --type css < input.css > output.css
$ java No Java runtime present, requesting install. $ _
No idea what that means. Google says it's a Mac problem. I am neither a Java nor Mac person, sorry.
That's why online services rock. Does anyone know of a secure css minifying service? -- Andrei
Jan 16 2015
next sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 21:04:58 UTC, Andrei Alexandrescu 
wrote:
 On 1/16/15 12:56 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 20:51:34 UTC, Andrei 
 Alexandrescu wrote:
 On 1/16/15 12:27 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 18:16:55 UTC, Andrei 
 Alexandrescu wrote:
 What would be a trustworthy offline CSS minifier?
http://yui.github.io/yuicompressor/ Its only dependency is Java. Usage: java -jar yuicompressor-*.jar --type css < input.css
 output.css
$ java No Java runtime present, requesting install. $ _
No idea what that means. Google says it's a Mac problem. I am neither a Java nor Mac person, sorry.
That's why online services rock. Does anyone know of a secure css minifying service? -- Andrei
I could put one up in a few minutes. But I think it'd be better if it was on dlang.org's servers. Now... does the server have Java installed? ;)
Jan 16 2015
prev sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Friday, 16 January 2015 at 21:04:58 UTC, Andrei Alexandrescu 
wrote:
 That's why online services rock. Does anyone know of a secure css 
 minifying service? -- Andrei
Minification in general is of dubious value, but doubly so with css: it barely makes a difference compared to gzip and client-side caching.

If comments are a problem, just write a little regex thingy to strip them out. You could do the same to leading and trailing whitespace on a line, though that's really negligible too.

Or, you could use something like my css macro expander, which transforms css in a useful way and strips out comments in the process:

    .foo { .nesting-supported-with-css-expand { } }

http://code.dlang.org/packages/cssexpand
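The "little regex thingy" could be sketched in shell as below. This is a hypothetical helper, deliberately naive: it joins lines first so multi-line comments are caught, and it knowingly ignores the corner case of comment-like sequences inside css strings.

```shell
# Strip /* ... */ comments (including ones containing stray '*')
# and collapse runs of spaces. Naive: also rewrites inside strings.
strip_css_comments() {
    tr '\n' ' ' \
      | sed -e 's:/\*[^*]*\*\**\([^/*][^*]*\*\**\)*/::g' \
            -e 's/  */ /g'
}
```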
Jan 16 2015
prev sibling parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 18:16:55 UTC, Andrei Alexandrescu 
wrote:
 I should also add that we already connect online to fetch 
 LATEST from github. Incidentally today that takes forever :o).
We (and a lot of other people) already trust GitHub. Also, GitHub is HTTPS-only.
 That's a rather dim view of online services :o).
This is not so much about online services in general, but more about online services created by some random guy which process code that will end up on our website. If the website was created by a generally trusted organization, transferred data securely, and/or operated on e.g. plain text, then some of my points would not apply.
 What would be a trustworty offline CSS minifier?
FWIW, YUI compressor is what I use for DFeed.
Jan 16 2015
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
[snip]

Just made css minification opt-in. -- Andrei
Jan 16 2015
parent reply "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Fri, Jan 16, 2015 at 10:37:38AM -0800, Andrei Alexandrescu via Digitalmars-d
wrote:
 On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
 [snip]
 
 Just made css minification opt-in. -- Andrei
Thanks!! T -- "Maybe" is a strange word. When mom or dad says it it means "yes", but when my big brothers say it it means "no"! -- PJ jr.
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 10:40 AM, H. S. Teoh via Digitalmars-d wrote:
 On Fri, Jan 16, 2015 at 10:37:38AM -0800, Andrei Alexandrescu via
Digitalmars-d wrote:
 On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
 [snip]

 Just made css minification opt-in. -- Andrei
Thanks!!
Glad it works for you. Vladimir, is opt-in okay with you as a first step toward a more secure solution? -- Andrei
Jan 16 2015
parent "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 20:40:29 UTC, Andrei Alexandrescu 
wrote:
 On 1/16/15 10:40 AM, H. S. Teoh via Digitalmars-d wrote:
 On Fri, Jan 16, 2015 at 10:37:38AM -0800, Andrei Alexandrescu 
 via Digitalmars-d wrote:
 On 1/16/15 9:58 AM, Vladimir Panteleev wrote:
 [snip]

 Just made css minification opt-in. -- Andrei
Thanks!!
Glad it works for you. Vladimir, is opt-in okay with you as a first step toward a more secure solution? -- Andrei
I do not build the documentation often, so my post was an attempt at advice from experience rather than an objection based on personal motives. Generally, and as I mentioned at the top of my post, I think opt-in is acceptable, as the default no longer has risks of security or unexpected failure.
Jan 16 2015
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 12:40 PM, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
Almost all browsers support gzip transfer of files. You'd get much better mileage with just gzipping the file. -Steve
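The gzip saving is easy to measure locally before deciding anything; a rough sketch (the helper name is made up for illustration):

```shell
# Compare raw vs gzipped size of a stylesheet -- a rough proxy for
# what Content-Encoding: gzip would save on the wire.
css_gzip_ratio() {
    raw=$(($(wc -c < "$1")))            # $((...)) trims wc's padding
    gz=$(($(gzip -9 -c "$1" | wc -c)))
    echo "$1: $raw bytes raw, $gz bytes gzipped"
}
```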
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 12:37 PM, Steven Schveighoffer wrote:
 On 1/16/15 12:40 PM, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
Almost all browsers support gzip transfer of files. You'd get much better mileage with just gzipping the file. -Steve
That's part of the protocol, right? We should be doing that anyway. Anyhow, the css is really hot and comments just add to it, compressed or not. -- Andrei
Jan 16 2015
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 3:53 PM, Andrei Alexandrescu wrote:
 On 1/16/15 12:37 PM, Steven Schveighoffer wrote:
 On 1/16/15 12:40 PM, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
Almost all browsers support gzip transfer of files. You'd get much better mileage with just gzipping the file. -Steve
That's part of the protocol, right? We should be doing that anyway. Anyhow, the css is really hot and comments just add to it, compressed or not. -- Andrei
I think this is way over-optimization. If the system already sends gzipped, I don't think any kind of minification is going to improve enough to the point of justifying all this. (Dons Walter hat): have you profiled to see how much it saves? -Steve
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 1:13 PM, Steven Schveighoffer wrote:
 On 1/16/15 3:53 PM, Andrei Alexandrescu wrote:
 On 1/16/15 12:37 PM, Steven Schveighoffer wrote:
 On 1/16/15 12:40 PM, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
Almost all browsers support gzip transfer of files. You'd get much better mileage with just gzipping the file. -Steve
That's part of the protocol, right? We should be doing that anyway. Anyhow, the css is really hot and comments just add to it, compressed or not. -- Andrei
I think this is way over-optimization. If the system already sends gzipped, I don't think any kind of minification is going to improve enough to the point of justifying all this. (Dons Walter hat): have you profiled to see how much it saves?
Well good point. As of January two of the css files are in the top 3 most trafficked files off of dlang.org, second only to favicon.ico. Loading css accounts for 12.78% of all dlang.org hits and 5.73% of all dlang.org bytes transferred. I'd say improvements would be measurable. of all hits and 16.27% of all bytes transferred. Andrei
Jan 16 2015
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 21:26:04 UTC, Andrei Alexandrescu 
wrote:
 Well good point. As of January two of the css files are in the 
 top 3 most trafficked files off of dlang.org, second only to 
 favicon.ico.
That's probably because HTTP caching is not configured.

Ideally, you'd put the file's modification time in its path, e.g.:

    <link rel="stylesheet" type="text/css" href="css/1421443851/style.css" />

css/*/style.css would point to the same style.css (via an internal, not HTTP, redirect). Then, css/* can be cached forever, as the URL of the file would change when the file changes.

This is what DFeed does, but I'm not sure if this is feasible with just DDoc and makefiles, though.
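With Apache (which Andrei mentions elsewhere in the thread is what the site runs), the internal rewrite plus a far-future cache lifetime could look roughly like the following sketch. The `css/<timestamp>/` layout just follows the example above; none of this is existing dlang.org configuration:

```apache
# Map css/<any-number>/foo.css to the real css/foo.css internally,
# so the visible URL changes whenever the build stamps a new time.
RewriteEngine On
RewriteRule ^css/[0-9]+/(.+\.css)$ css/$1 [L]

# Anything under a stamped path can then be cached indefinitely.
# (Requires mod_headers; one year is the conventional "forever".)
<LocationMatch "^/css/[0-9]+/">
    Header set Cache-Control "public, max-age=31536000"
</LocationMatch>
```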
Jan 16 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 1:32 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 21:26:04 UTC, Andrei Alexandrescu wrote:
 Well good point. As of January two of the css files are in the top 3
 most trafficked files off of dlang.org, second only to favicon.ico.
That's probably because HTTP caching is not configured. Ideally, you'd put the file's modification time in its path, e.g.: <link rel="stylesheet" type="text/css" href="css/1421443851/style.css" /> css/*/style.css would point to the same style.css (via internal, not HTTP redirect). Then, css/* can be cached forever, as the URL of the file would change when the file changes. This is what DFeed does, but I'm not sure if this is feasible with just DDoc and makefiles, though.
Nice. Wanna take it up? Generally I'm looking for less work for me and more work for others :o). -- Andrei
Jan 16 2015
parent reply "Kiith-Sa" <kiithsacmp gmail.com> writes:
On Friday, 16 January 2015 at 21:39:52 UTC, Andrei Alexandrescu 
wrote:
 On 1/16/15 1:32 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 21:26:04 UTC, Andrei 
 Alexandrescu wrote:
 Well good point. As of January two of the css files are in 
 the top 3
 most trafficked files off of dlang.org, second only to 
 favicon.ico.
That's probably because HTTP caching is not configured. Ideally, you'd put the file's modification time in its path, e.g.: <link rel="stylesheet" type="text/css" href="css/1421443851/style.css" /> css/*/style.css would point to the same style.css (via internal, not HTTP redirect). Then, css/* can be cached forever, as the URL of the file would change when the file changes. This is what DFeed does, but I'm not sure if this is feasible with just DDoc and makefiles, though.
Nice. Wanna take it up? Generally I'm looking for less work for me and more work for others :o). -- Andrei
Also, -1 for CSS minification: if I ever want to make any CSS contributions, I'll start by looking at the CSS in the browser.

+1 for gzip and caching. *Don't even consider* micro-optimizations like this if you're not even doing that yet; whatever gains you might get are negligible by comparison.
Jan 16 2015
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 2:02 PM, Kiith-Sa wrote:
 +1 for gzip and caching. *don't even consider* microoptimizations like
 this if you're not even doing that yet, whatever gains you might get are
 negligible by comparison.
I'm estimating about one third of 12%, or 4% of the total traffic. That's a hell of a ROI for one line of code. -- Andrei
Jan 16 2015
prev sibling parent "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Friday, 16 January 2015 at 22:02:27 UTC, Kiith-Sa wrote:
 Also, -1 for CSS minification, if I'll ever want to make any 
 CSS contibutions I'll start by looking at the CSS in the 
 browser.
You'd use the DOM inspectors in this case, no? They don't care about the formatting of the CSS file.
Jan 17 2015
prev sibling parent reply Jacob Carlborg <doob me.com> writes:
On 2015-01-16 22:32, Vladimir Panteleev wrote:

 That's probably because HTTP caching is not configured.

 Ideally, you'd put the file's modification time in its path, e.g.:

 <link rel="stylesheet" type="text/css" href="css/1421443851/style.css" />
Or a hash of the file content in the filename. -- /Jacob Carlborg
Jan 17 2015
parent reply "Mengu" <mengukagan gmail.com> writes:
Don't know if it's already been said, but if you are using nginx, 
there's a plugin for minification and built-in support for 
compressing html pages and static assets. Therefore, nobody needs 
a third-party dependency for building the docs.
Jan 17 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 4:58 AM, Mengu wrote:
 don't know if it's already said but if you are using nginx,
We use Apache as far as I know.
 there's a
 plugin for minification and builtin support for compressing html pages
 or static assets. therefore, nobody needs a third-party dependency for
 building the docs.
Opt-in is fine. Please review and pull: https://github.com/D-Programming-Language/dlang.org/pull/770 Andrei
Jan 17 2015
prev sibling next sibling parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 4:26 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:13 PM, Steven Schveighoffer wrote:
 On 1/16/15 3:53 PM, Andrei Alexandrescu wrote:
 On 1/16/15 12:37 PM, Steven Schveighoffer wrote:
 On 1/16/15 12:40 PM, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?
Almost all browsers support gzip transfer of files. You'd get much better mileage with just gzipping the file. -Steve
That's part of the protocol, right? We should be doing that anyway. Anyhow, the css is really hot and comments just add to it, compressed or not. -- Andrei
I think this is way over-optimization. If the system already sends gzipped, I don't think any kind of minification is going to improve enough to the point of justifying all this. (Dons Walter hat): have you profiled to see how much it saves?
Well good point. As of January two of the css files are in the top 3 most trafficked files off of dlang.org, second only to favicon.ico. Loading css accounts for 12.78% of all dlang.org hits and 5.73% of all dlang.org bytes transferred. I'd say improvements would be measurable.
Well, this is looking at the wrong statistic. I don't care how much of the overall bandwidth it is, what I was asking is how much does the file shrink if you minify. Saving 1% file size isn't going to put any kind of dent in the traffic.

 of all hits and 16.27% of all bytes transferred.
On an embedded product we have with a dead-simple web server, there is terrible network performance. Adding gzip support saved way more than minification ever could. But the best performance improvement was to add caching support to the server. Both the browser and the server have to cooperate there. -Steve
Jan 16 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I was asking
 is how much does the file shrink if you minify.
30% -- ANDREI
Jan 16 2015
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 5:12 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I was asking
 is how much does the file shrink if you minify.
30% -- ANDREI
so d-minified.css.gz is 30% smaller than d.css.gz? Just want to clarify. -Steve
Jan 16 2015
next sibling parent reply "Vladimir Panteleev" <vladimir thecybershadow.net> writes:
On Friday, 16 January 2015 at 22:17:51 UTC, Steven Schveighoffer 
wrote:
 On 1/16/15 5:12 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I 
 was asking
 is how much does the file shrink if you minify.
30% -- ANDREI
so d-minified.css.gz is 30% smaller than d.css.gz? Just want to clarify. -Steve
              Original  Minified
Uncompressed     16028     11959
gzip -9           4252      3194

Looks closer to 25%, but same ballpark.
Jan 16 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 2:26 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 22:17:51 UTC, Steven Schveighoffer wrote:
 On 1/16/15 5:12 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I was asking
 is how much does the file shrink if you minify.
30% -- ANDREI
so d-minified.css.gz is 30% smaller than d.css.gz? Just want to clarify. -Steve
              Original  Minified
Uncompressed     16028     11959
gzip -9           4252      3194

Looks closer to 25%, but same ballpark.
So then the two optimizations don't compete. I hate it when I'm right :o). -- Andrei
Jan 16 2015
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 5:30 PM, Andrei Alexandrescu wrote:
 On 1/16/15 2:26 PM, Vladimir Panteleev wrote:
 On Friday, 16 January 2015 at 22:17:51 UTC, Steven Schveighoffer wrote:
 On 1/16/15 5:12 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I was
 asking
 is how much does the file shrink if you minify.
30% -- ANDREI
so d-minified.css.gz is 30% smaller than d.css.gz? Just want to clarify. -Steve
              Original  Minified
Uncompressed     16028     11959
gzip -9           4252      3194

Looks closer to 25%, but same ballpark.
So then the two optimizations don't compete. I hate it when I'm right :o). -- Andrei
If the CSS isn't frequently changing, a 4kb file should not comprise 5% of all traffic if caching is enabled. I certainly think this is well worth the optimization, though, 25% is a good improvement. -Steve
Jan 16 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 2:35 PM, Steven Schveighoffer wrote:
 If the CSS isn't frequently changing, a 4kb file should not comprise 5%
 of all traffic if caching is enabled.
May be a sign we have lots of new visitors :o). -- Andrei
Jan 16 2015
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 2:17 PM, Steven Schveighoffer wrote:
 On 1/16/15 5:12 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 I don't care how much of the overall bandwidth it is, what I was asking
 is how much does the file shrink if you minify.
30% -- ANDREI
so d-minified.css.gz is 30% smaller than d.css.gz? Just want to clarify.
No, I didn't do any compression. Please do it and beat me over the head with your pull request. -- Andrei
Jan 16 2015
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 On an embedded product we have with a dead-simple web server, there is
 terrible network performance. Adding gzip support saved way more than
 minification ever could. But the best performance improvement was to add
 caching support to the server. Both the browser and the server have to
 cooperate there.
Pretty cool. The problem I'm having right now is the following pattern:

1. I have a mini-idea that takes me minutes to implement and turns the ratchet in the right direction.

2. I post it here in the hope that others will build upon it or come up with better ideas.

3. I get feedback here that essentially demonstrates to me that if I spent some hours or days on a small research project on a better idea, it would yield better results.

I encourage anyone who has more expertise in this kind of stuff to help out with turning the mythical ratchet one tooth forward. Thanks!

Andrei
Jan 16 2015
parent reply Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 5:23 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 On an embedded product we have with a dead-simple web server, there is
 terrible network performance. Adding gzip support saved way more than
 minification ever could. But the best performance improvement was to add
 caching support to the server. Both the browser and the server have to
 cooperate there.
Pretty cool. The problem I'm having right now is the following pattern: 1. I have a mini-idea that takes me minutes to implement and turns the ratchet in the right direction.
At the cost of adding dependencies for builds, and requiring builds be done with Internet access. I don't think it's out of line to ask that if we are going to add extra build requirements, we should make sure it's really making decent progress.
 2. I post it here in the hope that others will build upon or come with
 better ideas.

 3. I get feedback here that essentially demonstrates me that if I spent
 some hours or days on a small research project on a better idea, it
 would yield better results.
I think you misunderstand. We are not saying "do a research project", it takes seconds to gzip 2 files (the minified and not minified) and see the size difference. If it's super-significant, let's go for it! If you send me the minified file, I can test it for you. There doesn't need to be any research, but all the suggestions that have been provided have NOT required extra tools or dependencies. That is a significant difference. -Steve
Jan 16 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 2:32 PM, Steven Schveighoffer wrote:
 I think you misunderstand. We are not saying "do a research project", it
 takes seconds to gzip 2 files (the minified and not minified) and see
 the size difference. If it's super-significant, let's go for it! If you
 send me the minified file, I can test it for you.

 There doesn't need to be any research, but all the suggestions that have
 been provided have NOT required extra tools or dependencies. That is a
 significant difference.
The more involved part is figuring out what support is out there for compressed transfers, configuring the site, etc. --- Andrei
Jan 16 2015
parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 5:58 PM, Andrei Alexandrescu wrote:
 On 1/16/15 2:32 PM, Steven Schveighoffer wrote:
 I think you misunderstand. We are not saying "do a research project", it
 takes seconds to gzip 2 files (the minified and not minified) and see
 the size difference. If it's super-significant, let's go for it! If you
 send me the minified file, I can test it for you.

 There doesn't need to be any research, but all the suggestions that have
 been provided have NOT required extra tools or dependencies. That is a
 significant difference.
The more involved part is figuring out what support is out there for compressed transfers, configuring the site, etc. --- Andrei
Right, I understand. I'm not in a position to actually make those changes or support those servers, so I'll shut up. I was just saying my experience with what works and what doesn't with optimizing web traffic. -Steve
Jan 16 2015
prev sibling parent reply "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Friday, 16 January 2015 at 22:32:07 UTC, Steven Schveighoffer 
wrote:
 On 1/16/15 5:23 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 On an embedded product we have with a dead-simple web server, 
 there is
 terrible network performance. Adding gzip support saved way 
 more than
 minification ever could. But the best performance improvement 
 was to add
 caching support to the server. Both the browser and the 
 server have to
 cooperate there.
 Pretty cool. The problem I'm having right now is the following pattern:

 1. I have a mini-idea that takes me minutes to implement and turns the
 ratchet in the right direction.
At the cost of adding dependencies for builds, and requiring builds be done with Internet access. I don't think it's out of line to ask that if we are going to add extra build requirements, we should make sure it's really making decent progress.
Why do we need an external service?

    cat style.css |
        tr '\n' ' ' |
        sed 's/\/\*[^*]*\*\///g' |
        sed 's/\s\+/ /g' |
        sed 's/ \?\([(){},;]\) \?/\1/g'

Strictly speaking, this is overzealous (e.g. it also operates inside strings), and I didn't even test it, but it will probably work for almost all cases. The current main CSS file of dlang.org (style.css) shrinks from 14757 to 11720 bytes, a reduction of ~21%. But even writing a compressor in D should be trivial, as you'd only need a lexer.
 2. I post it here in the hope that others will build upon or 
 come with
 better ideas.

 3. I get feedback here that essentially demonstrates me that 
 if I spent
 some hours or days on a small research project on a better 
 idea, it
 would yield better results.
I think you misunderstand. We are not saying "do a research project", it takes seconds to gzip 2 files (the minified and not minified) and see the size difference. If it's super-significant, let's go for it! If you send me the minified file, I can test it for you.

There doesn't need to be any research, but all the suggestions that have been provided have NOT required extra tools or dependencies. That is a significant difference.

-Steve
Jan 17 2015
next sibling parent "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
Another cheap addition, down to 11577 bytes:

     cat style.css |
         tr '\n' ' ' |
         sed 's/\/\*[^*]*\*\///g' |
         sed 's/\s\+/ /g' |
         sed 's/ \?\([(){},;]\) \?/\1/g' |
         sed 's/;}/}/g'
Jan 17 2015
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 8:44 AM, "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net>" 
wrote:
 On Friday, 16 January 2015 at 22:32:07 UTC, Steven Schveighoffer wrote:
 On 1/16/15 5:23 PM, Andrei Alexandrescu wrote:
 On 1/16/15 1:44 PM, Steven Schveighoffer wrote:
 On an embedded product we have with a dead-simple web server, there is
 terrible network performance. Adding gzip support saved way more than
 minification ever could. But the best performance improvement was to
 add
 caching support to the server. Both the browser and the server have to
 cooperate there.
 Pretty cool. The problem I'm having right now is the following pattern:

 1. I have a mini-idea that takes me minutes to implement and turns the
 ratchet in the right direction.
At the cost of adding dependencies for builds, and requiring builds be done with Internet access. I don't think it's out of line to ask that if we are going to add extra build requirements, we should make sure it's really making decent progress.
 Why do we need an external service?

     cat style.css |
         tr '\n' ' ' |
         sed 's/\/\*[^*]*\*\///g' |
         sed 's/\s\+/ /g' |
         sed 's/ \?\([(){},;]\) \?/\1/g'

 Strictly speaking, this is overzealous (e.g. it also operates inside strings), and I didn't even test it, but it will probably work for almost all cases. The current main CSS file of dlang.org (style.css) shrinks from 14757 to 11720 bytes, a reduction of ~21%. But even writing a compressor in D should be trivial, as you'd only need a lexer.
Would be a nice tools/ thing. Wanna do it? --- Andrei
Jan 17 2015
parent reply "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Saturday, 17 January 2015 at 18:00:01 UTC, Andrei Alexandrescu 
wrote:
 Would be a nice tools/ thing. Wanna do it? --- Andrei
First try: https://github.com/schuetzm/shrinkcss/blob/master/shrinkcss.d Reads from stdin, writes to stdout. Removes comments, superfluous whitespace, and the closing ";" before "}". WARNING: This is mostly untested, the lexer is probably missing a few corner cases.
Jan 18 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/15 9:41 AM, "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net>" 
wrote:
 On Saturday, 17 January 2015 at 18:00:01 UTC, Andrei Alexandrescu wrote:
 Would be a nice tools/ thing. Wanna do it? --- Andrei
First try: https://github.com/schuetzm/shrinkcss/blob/master/shrinkcss.d Reads from stdin, writes to stdout. Removes comments, superfluous whitespace, and the closing ";" before "}". WARNING: This is mostly untested, the lexer is probably missing a few corner cases.
Noice. Would be interesting to see how much Brian's lexer generator would help. If it reduces size considerably or makes it a lot easier to do things, that would be a good proof of utility. -- Andrei
Jan 18 2015
parent "Marc =?UTF-8?B?U2Now7x0eiI=?= <schuetzm gmx.net> writes:
On Sunday, 18 January 2015 at 17:46:52 UTC, Andrei Alexandrescu 
wrote:
 On 1/18/15 9:41 AM, "Marc =?UTF-8?B?U2Now7x0eiI=?= 
 <schuetzm gmx.net>" wrote:
 On Saturday, 17 January 2015 at 18:00:01 UTC, Andrei 
 Alexandrescu wrote:
 Would be a nice tools/ thing. Wanna do it? --- Andrei
First try: https://github.com/schuetzm/shrinkcss/blob/master/shrinkcss.d Reads from stdin, writes to stdout. Removes comments, superfluous whitespace, and the closing ";" before "}". WARNING: This is mostly untested, the lexer is probably missing a few corner cases.
Noice. Would be interesting to see how much Brian's lexer generator would help. If it reduces size considerably or makes it a lot easier to do things, that would be a good proof of utility. -- Andrei
It might help, maybe I'll try it out (not today, though). One improvement I'd like to have would be to avoid the duplication of the `case`s of `lex()` in the `readXXX()` functions.
Jan 18 2015
prev sibling parent reply "Kiith-Sa" <kiithsacmp gmail.com> writes:
I looked at the favicon, and...

the file is .ico (bad format), stores 5 versions of the icon 
(16x16 to 64x64) even though only 16x16/32x32 are supported.


Here are just the 16x16(383b) and 32x32(1.77kiB) versions, as 
PNGs (better compression than gif, and official standard - used 
RGBA, as 8-bit lost quality with almost identical size): 

I also removed the extra pixel column from the 16x16 version because it was... 17x16.

See http://www.w3.org/2005/10/howto-favicon
Jan 16 2015
parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/16/15 1:54 PM, Kiith-Sa wrote:
 I looked at the favicon, and...

 the file is .ico (bad format), stores 5 versions of the icon (16x16 to
 64x64) even though only 16x16/32x32 are supported.


 Here are just the 16x16(383b) and 32x32(1.77kiB) versions, as PNGs
 (better compression than gif, and official standard - used RGBA, as
 8-bit lost quality with almost identical size):

 I also removed the extra pixel column from the 16x16 version because it was... 17x16.

 See http://www.w3.org/2005/10/howto-favicon
Thanks. Could you please Just Do It(tm) or paste this into a bug report. Thanks! -- Andrei
Jan 16 2015
prev sibling next sibling parent reply Rikki Cattermole <alphaglosined gmail.com> writes:
On 17/01/2015 6:40 a.m., Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?


 Andrei
For reference, Cmsed has a port of a css/javascript minifier. Feel free to extract it (its a subpackage so pretty easy). In fact I believe its one file.
Jan 16 2015
parent Steven Schveighoffer <schveiguy yahoo.com> writes:
On 1/16/15 6:39 PM, Rikki Cattermole wrote:
 On 17/01/2015 6:40 a.m., Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?


 Andrei
For reference, Cmsed has a port of a css/javascript minifier. Feel free to extract it (its a subpackage so pretty easy). In fact I believe its one file.
Just googled it. For further reading: https://github.com/rikkimax/Cmsed -Steve
Jan 16 2015
prev sibling parent reply "Sebastiaan Koppe" <mail skoppe.eu> writes:
On Friday, 16 January 2015 at 17:40:40 UTC, Andrei Alexandrescu 
wrote:
 I just added 
 https://github.com/D-Programming-Language/dlang.org/pull/770, 
 which generates minified css files. This is because in the near 
 future css files will become heftier (more documentation 
 comments, more detailed styles etc).

 The disadvantage is that now one needs to be online to generate 
 documentation. Thoughts?


 Andrei
I have taken a look at http://dlang.org and assessed some of the improvements to be made. I will probably step on someone's toes, sorry, but that is just because I have big feet.

A lot of people have already said this, but minification is the last thing on the list. My browser needs to parse 299kb to display the page. Javascript alone takes up 210kb of that. 131kb of that is uncompressed. 33kb is jQuery and 33kb is widget.js.

That widget.js thing is probably because of the twitter stream on the right. A seasoned JS programmer can rewrite that stuff in about 6kb, if not less.

jQuery is the enabler of all bad habits; best to remove it quickly, if only because of principles. If you really got addicted to that

But codemirror-compressed.js and run.js are by far the worst contenders and should be addressed first. You can bring down loading times to half (yes, you read that correctly, you can cut 150kb). Besides, do you really need 100+kb of codemirror JS? What is it even doing? Even if you really need it, why not compress the thing? It takes around 4 lines in apache conf to accomplish this. Give me SSH access and I'll do it in under 2 min.

Caching is the next trick in the list. Remember that ISPs, proxies, etc. may also cache your files, not just browsers.

These are the files that are referenced by http://dlang.org and are not using caching (nor compression!):

dlang.org/
codemirror.css
style.css
print.css
codemirror-compressed.js
run-main-website.js
run.js
search-left.gif
search-button.gif
dlogo.png
gradient-red.jpg
search-bg.gif

Next point on the list is bundling resources. The browser can only load so much stuff async. If you have too much, part of those resources are going to be blocked. Which basically means you have another round-trip + data that you have to wait for.

While there are some other optimizations to be made - like putting that twitter stream js at the bottom of the page - the point is this: If I wanted to optimize this website, minifying style.css would be the last thing on my list.

Besides, minimizing CSS is tantamount to removing comments and whitespace. Like Adam Ruppe said, a regex program in D can accomplish that; no need to use an online service. It probably takes you more time to integrate the online service than to write the D program including the regex.

At the end of the day, watching `mb-downloaded per resource per total` tells you nothing. What only matters is the time it takes for users to enter `http://dlang.org` in the browser, up until the time they get to see something they can click on.

Being among this group of knowledgeable programmers it amazes me this site is still pre-2000. I for one, am required to use Lynx just because my eyes can't stand the design.

OT:

And let's be honest here, why the hell do we even use apache+php and not D+vibe.d? I just rewrote my company's corporate website in under 4 hours. Granted, it is a simple one. But this community should be able to rewrite this site in D in under a week, right?
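The bundling point can be illustrated in the shell. A sketch with stand-in contents (the filenames come from the list above):

```shell
# Concatenate the three stylesheets into one bundle so the browser
# issues a single request instead of three. Contents are stand-ins.
printf '.cm { border: 1px solid; }\n' > codemirror.css
printf 'body { margin: 0; }\n'        > style.css
printf '@media print { a {} }\n'      > print.css
cat codemirror.css style.css print.css > bundle.css
wc -c bundle.css
```

Note that CSS order matters (later rules win), so the concatenation order should match the original `<link>` order in the page.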
Jan 17 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 10:01 AM, Sebastiaan Koppe wrote:
 On Friday, 16 January 2015 at 17:40:40 UTC, Andrei Alexandrescu wrote:
 I just added
 https://github.com/D-Programming-Language/dlang.org/pull/770, which
 generates minified css files. This is because in the near future css
 files will become heftier (more documentation comments, more detailed
 styles etc).

 The disadvantage is that now one needs to be online to generate
 documentation. Thoughts?


 Andrei
I have taken a look at http://dlang.org and assessed some of the improvements to be made. I will probably step on someones toes, sorry, but that is just because I have big feet.
Fantastic.
 A lot of people have already said this, but minification is the last
 thing on the list.
Measurements contradict that. Anyhow my point was one line of code reduces all site traffic by 5% - they call that a good day.
 My browser needs to parse 299kb to display the page. Javascript alone
 takes up 210kb of that. 131kb of that is uncompressed. 33kb is jQuery
 and 33kb is widget.js.

 That widget.js thing is probably because of the twitter stream on the
 right. A seasoned JS programmer can rewrite that stuff in about 6kb, if
 not less.
Great. You forgot to link to your pull request :o).
 jQuery is the enabler of all bad habits; best to remove it quickly, if
 only because of principles. If you really got addicted to that

I'm not an expert or an ideologist in the area. It was added by others who obviously have a different opinion from yours.
 But codemirror-compressed.js and run.js are by far the worst contenders
 and should be addressed as first. You can bring down loading times to
 half (yes, you read that correctly, you can cut 150kb). Besides, do you
 really need 100+kb of codemirror JS? What is it even doing? Even if you
 really need it, why not compress the thing? It takes around 4 lines in
 apache conf to accomplish this. Give me SSH access and I'll do it in
 under 2 min.
I'm working with our webmaster to create accounts for a few folks. For now you may want to send me what needs to be done and I'll take it with him. N.B. I vaguely recall I've tried that once but it was not possible for obscure reasons.
 Caching is the next trick in the list. Remember that ISP's, proxy, etc.
 may also cache your files, not just browsers.

 These are the files that are referenced by http://dlang.org and are not
 using caching (nor compression!):
 dlang.org/
 codemirror.css
 style.css
 print.css
 codemirror-compressed.js
 run-main-website.js
 run.js
 search-left.gif
 search-button.gif
 dlogo.png
 gradient-red.jpg
 search-bg.gif
Where should these be cached? I don't understand.
 Next point on the list is bundling resources. The browser can only load
 so much stuff async. If you have too much, part of those resources are
 going to be blocked. Which basically means you have another round-trip +
 data that you have to wait for.
Yah, we do a bunch of that stuff on facebook.com. It's significant work. Wanna have at it?
 While there are some other optimizations to be made - like putting that
 twitter stream js at the bottom of the page - the point is this: If I
 wanted to optimize this website, minifying style.css would be the last
 thing on my list.
Yah, the problem is everything on your list is hypothetical and not done, whereas css minimization is actual and done. Big difference. Very big difference.
 Besides, minimizing CSS is tantamount to removing comments and
 whitespace. Like Adam Ruppe said, a regex program in D can accomplish
 that; no need to use a online service. It probably takes you more time
 to integrate the online service than to write the D program including
 the regex.
Then do it.
 At the end of the day, watching `mb-downloaded per resource per total`
 tells you nothing. What only matter is the time it takes for users to
 enter `http://dlang.org` in the browser, up until the time they get to
 see something they can click on.
Agreed.
 Being among this group of knowledgeable programmers it amazes me this
 site is still pre 2000. I for one, am required to use Lynx just because
 my eyes can't stand the design.
Then improve it.
 OT:

 And lets be honest here, why the hell do we even use apache+php and not
 D+vibe.d? I just rewrote my companies corporate website in under 4
 hours. Granted, it is a simple one. But this community should be able to
 rewrite this site in D in under a week, right?
I wish. Andrei
Jan 17 2015
next sibling parent reply "Sebastiaan Koppe" <mail skoppe.eu> writes:
On Saturday, 17 January 2015 at 18:23:45 UTC, Andrei Alexandrescu 
wrote:
 On 1/17/15 10:01 AM, Sebastiaan Koppe wrote:
 A seasoned JS programmer can rewrite that stuff in about 6kb, 
 if not less.
Great. You forgot to link to your pull request :o).
Wait, one step back. I was still in assessment mode. I haven't committed to doing anything.
 jQuery is the enabler of all bad habits; best to remove it 
 quickly, if
 only because of principles. If you really got addicted to that

I'm not an expert or an ideologist in the area. It was added by others who obviously have a different opinion from yours.
Well, then they should use http://zeptojs.com/
 why not compress the thing? It takes around 4 lines in
 apache conf to accomplish this. Give me SSH access and I'll do 
 it in under 2 min.
I'm working with our webmaster to create accounts for a few folks. For now you may want to send me what needs to be done and I'll take it with him. N.B. I vaguely recall I've tried that once but it was not possible for obscure reasons.
I do not know the obscure reasons, but it should be as simple as:

    nano /etc/apache2/mods-enabled/deflate.conf

    <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/html text/plain text/xml
        AddOutputFilterByType DEFLATE text/css
        AddOutputFilterByType DEFLATE application/x-javascript application/javascript application/ecmascript
        AddOutputFilterByType DEFLATE application/rss+xml
        AddOutputFilterByType DEFLATE application/json
    </IfModule>

I know I am imposing on somebody else's work here, but compressing resources should really be done.
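A rough local estimate of what mod_deflate would save on the wire (the sample file is synthetic; mod_deflate uses zlib's default compression level unless configured, which is in the same ballpark as gzip -6):

```shell
# Generate a repetitive CSS-like file and compare raw vs gzipped size
# to approximate what enabling mod_deflate would save per transfer.
i=1
while [ "$i" -le 200 ]; do
    printf '.rule%s { color: #aabbcc; margin: 0 auto; }\n' "$i"
    i=$((i + 1))
done > sample.css
gzip -6 < sample.css > sample.css.gz
echo "raw: $(wc -c < sample.css) bytes, gzipped: $(wc -c < sample.css.gz) bytes"
```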
 Caching is the next trick in the list. Remember that ISP's, 
 proxy, etc.
 may also cache your files, not just browsers.
Where should these be cached? I don't understand.
In the browser. So that on a reload of the page, the browser, instead of making HTTP calls, uses its cache.
 Next point on the list is bundling resources. The browser can 
 only load
 so much stuff async. If you have too much, part of those 
 resources are
 going to be blocked. Which basically means you have another 
 round-trip +
 data that you have to wait for.
Yah, we do a bunch of that stuff on facebook.com. It's significant work. Wanna have at it?
Yes. Please. But the compression thing takes precedence.
 While there are some other optimizations to be made - like 
 putting that
 twitter stream js at the bottom of the page - the point is 
 this: If I
 wanted to optimize this website, minifying style.css would be 
 the last
 thing on my list.
Yah, the problem is everything on your list is hypothetical and not done, whereas css minimization is actual and done. Big difference. Very big difference.
True. If you can get a 4% difference by minimizing CSS, just do it. I am just saying you can do a lot better. Plus, I think with all the expertise around here, most of us who do web development did this at one stage or another.
 Besides, minimizing CSS is tantamount to removing comments and
 whitespace. Like Adam Ruppe said, a regex program in D can 
 accomplish
 that; no need to use a online service. It probably takes you 
 more time
 to integrate the online service than to write the D program 
 including
 the regex.
Then do it.
regex to select comments: /(\/\*[^(*\/)]+\*\/)/g
regex to select whitespace: /(\s+)/g

and then delete those. I know css minimizers do more, but if comments and whitespace are your issue, this does the trick.
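The two regexes above translate to sed roughly like this (a sketch with a simplified comment pattern, `[^*]*` instead of `[^(*\/)]+`; same caveat as before, it also munges whitespace inside strings):

```shell
# Apply the comment-removal and whitespace-collapsing regexes with sed.
printf '/* layout */\na ,  b {\n  color : red ;\n}\n' > in.css
tr '\n' ' ' < in.css \
    | sed 's,/\*[^*]*\*/,,g' \
    | sed 's/[[:space:]]\{1,\}/ /g' > out.css
wc -c in.css out.css
```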
 Being among this group of knowledgeable programmers it amazes 
 me this
 site is still pre 2000. I for one, am required to use Lynx 
 just because
 my eyes can't stand the design.
Then improve it.
Design is a *very* touchy issue. It is basically a matter of choice. Without a definite choice made, I won't waste my time improving it. The choice is very simple: keep it like it is, do what everybody else is doing
Jan 17 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 12:00 PM, Sebastiaan Koppe wrote:
 On Saturday, 17 January 2015 at 18:23:45 UTC, Andrei Alexandrescu wrote:
 On 1/17/15 10:01 AM, Sebastiaan Koppe wrote:
 I'm not an expert or an ideologist in the area. It was added by others
 who obviously have a different opinion from yours.
Well, then they should use http://zeptojs.com/
Of course. :o)
 why not compress the thing? It takes around 4 lines in
 apache conf to accomplish this. Give me SSH access and I'll do it in
 under 2 min.
I'm working with our webmaster to create accounts for a few folks. For now you may want to send me what needs to be done and I'll take it with him. N.B. I vaguely recall I've tried that once but it was not possible for obscure reasons.
I do not know the obscure reasons, but it should be as simple as: nano /etc/apache2/mods-enabled/deflate.conf <IfModule mod_deflate.c> AddOutputFilterByType DEFLATE text/html text/plain text/xml AddOutputFilterByType DEFLATE text/css AddOutputFilterByType DEFLATE application/x-javascript application/javacript application/ecmascript AddOutputFilterByType DEFLATE application/rss+xml AddOutputFilterByType DEFLATE application/json </IfModule> I know I am imposing on somebodies else's work here, but compressing resources should really be done.
Forwarded to our webmaster, thanks.
 Caching is the next trick in the list. Remember that ISP's, proxy, etc.
 may also cache your files, not just browsers.
Where should these be cached? I don't understand.
In the browser. So that on a reload of the page, the browser, instead of making HTTP calls, uses it's cache.
How do we improve that on our side?
 Next point on the list is bundling resources. The browser can only load
 so much stuff async. If you have too much, part of those resources are
 going to be blocked. Which basically means you have another round-trip +
 data that you have to wait for.
Yah, we do a bunch of that stuff on facebook.com. It's significant work. Wanna have at it?
Yes. Please. But the compression thing takes precedence.
Awesome. Don't forget you said this.
 regex to select comments: /(\/\*[^(*\/)]+\*\/)/g
 regex to select whitespace: /(\s+)/g

 and then delete those.
Tested PR or by the end of the day this will slide into obsolescence.
 Design is a *very* touchy issue. It is basically a matter of choice.
 Without a definite choice made, I won't waste my time improving it.
It's clear that once in a while we need to change the design just because it's old. Also, there are a few VERY obvious design improvements that need be done and would be accepted in a heartbeat, but NOBODY is doing them. I'm not an expert in design but I can tell within a second whether I like one. Yet no PR is coming for improving the design.
 The choice is very simple:

 keep it like it is,
 do what everybody else is doing
False choice. Andrei
Jan 17 2015
next sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
 I know I am imposing on somebodies else's work here, but compressing
 resources should really be done.
Our webmaster got back. He said compression is more CPU work and on a fat pipe (which we do have) that may make things actually worse. Also, how would this work if we switch to vibe.d? -- Andrei
Jan 17 2015
next sibling parent "H. S. Teoh via Digitalmars-d" <digitalmars-d puremagic.com> writes:
On Sat, Jan 17, 2015 at 12:52:29PM -0800, Andrei Alexandrescu via Digitalmars-d
wrote:
I know I am imposing on somebodies else's work here, but compressing
resources should really be done.
Our webmaster got back. He said compression is more CPU work and on a fat pipe (which we do have) that may make things actually worse. Also, how would this work if we switch to vibe.d? -- Andrei
+1 for dogfooding! T -- Notwithstanding the eloquent discontent that you have just respectfully expressed at length against my verbal capabilities, I am afraid that I must unfortunately bring it to your attention that I am, in fact, NOT verbose.
Jan 17 2015
prev sibling next sibling parent reply "Adam D. Ruppe" <destructionator gmail.com> writes:
On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei Alexandrescu 
wrote:
 Our webmaster got back. He said compression is more CPU work 
 and on a fat pipe (which we do have) that may make things 
 actually worse.
Doing it on demand might be a mistake here, but we can also pre-compress the files since it is a static site. You just run gzip on the files then serve them up with the proper headers. here's a thing about doing it in apache http://stackoverflow.com/questions/75482/how-can-i-pre-compress-files-with-mod-deflate-in-apache-2-x
 Also, how would this work if we switch to vibe.d? -- Andrei
I don't know about vibe, but it is trivially simple in HTTP, so if it isn't supported there, it is probably a three (ish) line change. Caching is the same deal btw, just set the right header and you'll get a huge improvement. "Cache-control: max-age=36000" will cache it for ten hours, without even needing to change the urls. (Changing urls is nice because you can set it to cache forever and still get instantly visible updates to the user by changing the url, but we'd probably be fine with a cache update lag and it is simpler that way.) ETags are set right now and that does some caching, it could be improved further by adding the max-age bit tho. This is an apache config of some sort too. http://stackoverflow.com/questions/16750757/apache-set-max-age-or-expires-in-htaccess-for-directory though I don't agree it should be one year unless we're using different urls, we should do hours or days, but that's how it is done.
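The change-the-url trick Adam mentions is usually done by putting a content hash in the filename. A sketch (md5sum and the naming scheme are illustrative, not what dlang.org does):

```shell
# Cache-busting: copy the asset to a name containing its content hash.
# The hashed name can be served with a far-future max-age; any edit to
# the file produces a new URL, so updates show up instantly anyway.
printf 'body{color:red}\n' > style.css
hash=$(md5sum style.css | cut -c1-8)
cp style.css "style.$hash.css"
echo "style.$hash.css"
```

The page's `<link>` tag then references the hashed name, regenerated as part of the build.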
Jan 17 2015
next sibling parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On second thought this way works better:

http://stackoverflow.com/questions/7509501/how-to-configure-mod-deflate-to-serve-gzipped-assets-prepared-with-assetsprecom


though that's some ugly configuration, I hate apache.

But I just tested that locally and it all worked from a variety 
of user agents. All I had to do was gzip the file and also keep a 
copy of the uncompressed version to server to the (very few 
btw... but popular - curl, by default, is one of them) UAs that 
don't handle receiving gzipped info.
Jan 17 2015
prev sibling parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 4:22 PM, Adam D. Ruppe wrote:
 On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei Alexandrescu wrote:
 Our webmaster got back. He said compression is more CPU work and on a
 fat pipe (which we do have) that may make things actually worse.
Doing it on demand might be a mistake here, but we can also pre-compress the files since it is a static site. You just run gzip on the files then serve them up with the proper headers.
Who's "you"? :o) -- Andrei
Jan 17 2015
parent "Adam D. Ruppe" <destructionator gmail.com> writes:
On Sunday, 18 January 2015 at 02:11:13 UTC, Andrei Alexandrescu 
wrote:
 Who's "you"? :o) -- Andrei
I'd do it myself, but after spending 30 minutes tonight trying and failing to get the website to build on my computer again tonight, I'm out of time. It really isn't hard though with access to the html and .htaccess or something. I just slapped this on my this-week-in-d local thingy:

.htaccess:

    RewriteEngine on
    RewriteCond %{HTTP:Accept-Encoding} \b(x-)?gzip\b
    RewriteCond %{REQUEST_FILENAME}.gz -s
    RewriteRule ^(.+) $1.gz [L]

    <FilesMatch \.css\.gz$>
        ForceType text/css
        Header set Content-Encoding gzip
    </FilesMatch>

    <FilesMatch \.js\.gz$>
        ForceType text/javascript
        Header set Content-Encoding gzip
    </FilesMatch>

    <FilesMatch \.rss\.gz$>
        ForceType application/rss+xml
        Header set Content-Encoding gzip
    </FilesMatch>

    ExpiresActive on
    ExpiresDefault "access plus 1 days"

Then ran

    $ for i in *.html *.css *.rss *.js; do gzip "$i"; zcat "$i.gz" > "$i"; done;

(gzip replaces the original file so i just uncompressed it again after zipping with zcat. idk if there's a better way, the man page didn't give a quick answer so i just did it this way)

and the headers look good now. So like if that can be done on dlang.org too it should hopefully do the trick.
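On the "idk if there's a better way" aside: redirecting gzip's streams avoids the zcat round-trip entirely, since the source file is never replaced (newer GNU gzip also has a -k flag for the same effect):

```shell
# Compress without replacing the original: read via stdin, write via
# stdout, and the source file stays untouched on disk.
printf 'body { color: red; }\n' > style.css
for i in *.css; do gzip -9 < "$i" > "$i.gz"; done
ls style.css style.css.gz
```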
Jan 17 2015
prev sibling parent reply "Sebastiaan Koppe" <mail skoppe.eu> writes:
On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei Alexandrescu
wrote:
 I know I am imposing on somebodies else's work here, but 
 compressing
 resources should really be done.
Our webmaster got back. He said compression is more CPU work and on a fat pipe (which we do have) that may make things actually worse. Also, how would this work if we switch to vibe.d? -- Andrei
If you do not have spare horsepower for compression, how will you handle twice the load?

I have used vibe.d to fetch gzipped resources; it has all the deflate&inflate stuff, so delivering gzipped resources should be as easy as flipping a switch.
Jan 17 2015
parent reply Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/17/15 11:23 PM, Sebastiaan Koppe wrote:
 On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei Alexandrescu
 wrote:
 I know I am imposing on somebodies else's work here, but compressing
 resources should really be done.
Our webmaster got back. He said compression is more CPU work and on a fat pipe (which we do have) that may make things actually worse. Also, how would this work if we switch to vibe.d? -- Andrei
If you do not have spare horsepower for compression, how will you handle twice the load?
Not quite getting the logic there. -- Andrei
Jan 17 2015
parent reply "Sebastiaan Koppe" <mail skoppe.eu> writes:
On Sunday, 18 January 2015 at 07:42:10 UTC, Andrei Alexandrescu 
wrote:
 On 1/17/15 11:23 PM, Sebastiaan Koppe wrote:
 On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei 
 Alexandrescu
 wrote:
 Our webmaster got back. He said compression is more CPU work 
 and on a
 fat pipe (which we do have) that may make things actually 
 worse. Also,
 how would this work if we switch to vibe.d? -- Andrei
If you do not have spare horsepower for compression, how will you handle twice the load?
Not quite getting the logic there. -- Andrei
It is unrelated to my point about compression. The reasoning is as follows: if you are maxed out on resources, you will have problems when the site gets more visitors.

Compression can still help there. If the file is compressed the server needs to send fewer bytes, and can close the connection quicker. Pre-compression instead of doing it on demand, like Adam Ruppe said, will optimize it even more.

Btw. I built the dlang.org site on my computer but the <script> links have an %0 in the src attribute. Then 5 min later I saw the same on dlang.org

Funny thing is, all the stuff is still functioning. Affirming my hunch that you can remove a lot of the js stuff.

The site now loads in 124kb. Whoever put that %0 there, you just cut the site down from 300kb to 124kb. Nice job!
Jan 18 2015
next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/15 1:01 AM, Sebastiaan Koppe wrote:
 On Sunday, 18 January 2015 at 07:42:10 UTC, Andrei Alexandrescu wrote:
 On 1/17/15 11:23 PM, Sebastiaan Koppe wrote:
 On Saturday, 17 January 2015 at 20:52:28 UTC, Andrei Alexandrescu
 wrote:
 Our webmaster got back. He said compression is more CPU work and on a
 fat pipe (which we do have) that may make things actually worse. Also,
 how would this work if we switch to vibe.d? -- Andrei
If you do not have spare horsepower for compression, how will you handle twice the load?
Not quite getting the logic there. -- Andrei
It is unrelated to my point about compression. The reasoning is as follows: if you are maxed out on resources
We're not maxed out on resources. The question was whether compression adds a net benefit. -- Andrei
Jan 18 2015
prev sibling next sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/15 1:01 AM, Sebastiaan Koppe wrote:
 Btw. I built the dlang.org site on my computer but the <script> links
 have a %0 in the src attribute. Then 5 min later I saw the same on
 dlang.org
Urgh, typo. https://github.com/D-Programming-Language/dlang.org/commit/807d471e602c1d8e6eff671ab1fe94b20b2e5cf1 Andrei
Jan 18 2015
prev sibling parent Andrei Alexandrescu <SeeWebsiteForEmail erdani.org> writes:
On 1/18/15 1:01 AM, Sebastiaan Koppe wrote:
 Btw. I built the dlang.org site on my computer but the <script> links
 have a %0 in the src attribute. Then 5 min later I saw the same on
 dlang.org

 Funny thing is, everything is still functioning, affirming my hunch that
 you can remove a lot of the js stuff.

 The site now loads in 124kb. Whoever put that %0 there, you just cut
 the site down from 300kb to 124kb. Nice job!
Much of that was to enable running D code in the browser, which has been disabled as of late. No problem! I'll ask one of our many lieutenants to look into it :o). Andrei
Jan 18 2015
prev sibling parent "Sebastiaan Koppe" <mail skoppe.eu> writes:
On Saturday, 17 January 2015 at 20:17:51 UTC, Andrei Alexandrescu 
wrote:
 On 1/17/15 12:00 PM, Sebastiaan Koppe wrote:
 On Saturday, 17 January 2015 at 18:23:45 UTC, Andrei 
 Alexandrescu wrote:
 On 1/17/15 10:01 AM, Sebastiaan Koppe wrote:
In the browser. So that on a reload of the page, the browser, instead of making HTTP calls, uses its cache.
How do we improve that on our side?
2 things: a) Set the proper cache headers in the http response. b) Have a way to bust the cache if you have a new version of a resource. If you have both in place, you can set the expires header to 1 year in the future. Then bust the cache every time you have a new version of the file.
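Point b) is usually done by embedding a content hash in the file name, so the URL changes whenever the content changes and a 1-year Expires header becomes safe. A minimal sketch, where the helper name and the 8-character hash length are my own choices rather than anything dlang.org uses:

```python
import hashlib

def busted_name(path: str, content: bytes) -> str:
    # Embed a short content hash in the file name; the URL then changes
    # whenever the content changes, which busts long-lived browser caches.
    digest = hashlib.sha256(content).hexdigest()[:8]
    stem, dot, ext = path.rpartition(".")
    return f"{stem}.{digest}.{ext}" if dot else f"{path}.{digest}"

# e.g. style.min.css -> style.min.<8 hex chars>.css
print(busted_name("style.min.css", b"body{margin:0}"))
```

The build then rewrites references in the generated HTML to point at the hashed name, and the unhashed name never has to be served with a long expiry.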
 Yah, we do a bunch of that stuff on facebook.com. It's 
 significant
 work. Wanna have at it?
Yes. Please. But the compression thing takes precedence.
Awesome. Don't forget you said this.
I won't.
 Design is a *very* touchy issue. It is basically a matter of 
 choice.
 Without a definite choice made, I won't waste my time 
 improving it.
It's clear that once in a while we need to change the design just because it's old. Also, there are a few VERY obvious design improvements that need to be done and would be accepted in a heartbeat, but NOBODY is doing them.
If I may suggest, I would split the site up into a couple of sections: one for Introduction/About, one for Docs/API, one for Blogs, one for Community/Forum. That is basically what everybody else does. Just some random sites: http://facebook.github.io/react/ https://www.dartlang.org/
 I'm not an expert in design but I can tell within a second 
 whether I like one. Yet no PR is coming for improving the 
 design.
Then why not just make a list of sites that we like, and then design this site like those? It is what all the designers are doing.
Jan 17 2015
prev sibling parent Jacob Carlborg <doob me.com> writes:
On 2015-01-17 19:23, Andrei Alexandrescu wrote:
 On 1/17/15 10:01 AM, Sebastiaan Koppe wrote:
 And lets be honest here, why the hell do we even use apache+php and not
 D+vibe.d? I just rewrote my company's corporate website in under 4
 hours. Granted, it is a simple one. But this community should be able to
 rewrite this site in D in under a week, right?
I wish.
Was that honest? I got the impression that you and Walter were satisfied with using Ddoc for the website. -- /Jacob Carlborg
Jan 18 2015