OK - sorry if this is the wrong thread to post in - not sure what the right one would be.
We're building a site out of .php includes - do we have to manually minify each file or is there something that can minify the whole site on the fly or in one go? Does that make sense?
Not sure what you mean by "We're building a site out of .php includes"
I'm not sure of the PHP solutions available but I'll explain how I work.
For CSS I use Compass, which uses Sass for includes. Compass will compile and minify everything into a single stylesheet for you, so the page only needs one link:
<link rel="stylesheet" href="css/application.css">
For JS I use Sprockets, which pulls files together with require directives:
//= require jquery
//= require vendor/prism
There are a lot of different compression tools available; Sprockets uses UglifyJS by default.
The other option for JS is to dynamically load scripts depending on the content, e.g. http://requirejs.org/
This option is becoming more popular but I like the idea of just having one file - unless your application has massive amounts of js.
It can be the right way to go if there are some large libraries that are only needed for a couple of pages. Things like charting libraries are good candidates to split into their own file, e.g. analytics.js, and include dynamically.
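RequireJS aside, with a .php-includes setup you can get a similar per-page effect server-side by only emitting the script tag on pages that need it — a minimal sketch (the $page variable and file name are made up for illustration):

```php
<?php
// Somewhere in your page setup, each template identifies itself.
$page = 'stats'; // hypothetical; however your includes name the current page

// In the shared footer include: only pages that need the charting
// library pay the cost of downloading it.
if ($page === 'stats') {
    echo '<script src="js/analytics.js"></script>';
}
?>
```

Visitors who never open the stats page never fetch the library at all, which keeps the common-case payload small.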
Hope it helps
Yeah sorry, that was a woolly description at best - we're building the site in HTML5. The pages are then ripped apart and put back together out of .php templates and includes.
Thanks for the help - I'll look into your solution.
Just out of curiosity, what do you mean by "Google page speed"? I've never come across the term before.
I've never bothered to minify my pages, because I'm not convinced it does all that much good. Then again, as I have no experience of it, I might be totally wrong about that. Bootfit, I'd be interested to hear how you get on with it, and whether you do in fact improve your "Google page speed" - or any other performance improvement.
Here you go - https://developers.google.com/speed/pagespeed/insights
Minifying (removing white space) is one of the lesser optimisations. Combining files, though, can make a considerable difference to load times - keeping the number of HTTP requests down is the #1 rule. You can find a lot of before/after timed tests on the net.
I'm sure you've probably seen this by now but it's always good to reference in these threads.
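To keep the request count down on a plain PHP site, one common trick is a small combiner script that concatenates several files into one response — a rough sketch (the file list and paths are illustrative, and in production you'd want caching headers and a compression pass on top):

```php
<?php
// css.php - serve several stylesheets as a single HTTP request.
// In the page: <link rel="stylesheet" href="css.php">
header('Content-Type: text/css');

$files = array('css/reset.css', 'css/layout.css', 'css/theme.css'); // illustrative

foreach ($files as $file) {
    // readfile() streams each file straight to the output
    readfile($file);
    echo "\n";
}
```

Three requests become one, which matters more on high-latency connections than the few bytes minification saves.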
I think what you're really saying is that you want your pages to load faster?
You said "minifying (removing white space) is one of the lesser optimisations". Yes, that's what I thought as well. As you say, there are lots of other ways of optimising a page that would bring better results.
Just for fun, I might try it one day, and try to measure the result, but I don't see it as a high priority.
Well yeah - they already load pretty quickly and are optimised to a high degree, but as the site I'm working on is a personal project that I have the time to dedicate to, I'm trying to get as high a score as possible on Google PageSpeed. I know the big G places emphasis on page speed for its SERPs, so anything I can learn from this can only benefit me in the future.
Is the server set to send those file types in a compressed format? If not then turning that on (if you have access to do so) will make a bigger difference than minifying.
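If you can't touch the server config, PHP can gzip its own output using the built-in ob_gzhandler output-buffer callback — a sketch, assuming the zlib extension is loaded:

```php
<?php
// At the very top of the page (or in a common include, before any output).
// ob_gzhandler checks the client's Accept-Encoding header and gzips the
// buffered output only for browsers that advertise support for it.
if (!ob_start('ob_gzhandler')) {
    ob_start(); // fall back to plain buffering if zlib isn't available
}
?>
<html>... rest of the page as normal ...</html>
```

Server-level compression (mod_deflate and the like) is still preferable when you have access, since it covers static files too.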
From the sound of "Page Speed", Google is looking to shoehorn its way into the web hosting market.
"PageSpeed Service fetches content from your servers, rewrites your pages by applying web performance best practices and serves them to end users via Google's servers across the globe. PageSpeed Service is currently offered free of charge to a limited set of webmasters during this trial period. Pricing will be competitive and details will be made available later. You can request access to the service by filling in this web form."
Now we know why Google started using loading speed as a ranking criteria, huh?
This whole thing sounds plain stupid. If you are on a decent web host, you are not going to save any time, and if you do, it won't be enough to justify this silliness.
You say that you are manually minifying your files.
I came across a script that will minify your pages (HTML and inline CSS/JS) on the fly during output.
You can check that out here:- http://codecanyon.net/item/dynamic-website-compressor/2838376
I've never used it myself, but it may help you.
According to Google PageSpeed, minifying is considered low priority. (Don't sweat the petty things and only pet the...)
I use a PHP framework that fortunately combines all included files into a single string, $view.
I also have a $_SESSION variable which toggles compression, so the uncompressed $view is easier to debug.
Anyway, I adapted this method to suit my requirements.
Method to Maybe Compress HTML Source
// $this->_maybe_compress_using_reference( $view );
function _maybe_compress_using_reference( &$result, $return_result = false )
{
    if ( ! empty($_SESSION['jjj']['compress_result']) )
    {
        $_SESSION['zzz_before'] = strlen($result);

        $search = array(
            '/\>[^\S ]+/s', // strip whitespace after tags, except space
            '/[^\S ]+\</s', // strip whitespace before tags, except space
            '/(\s)+/s'      // shorten multiple whitespace sequences
        );
        $replace = array('>', '<', '\1');

        $result = preg_replace($search, $replace, $result);

        $_SESSION['zzz_after'] = strlen($result);

        // Optional cosmetic stuff to show the results at the end of the $view
        if ( LOCALHOST ) // dabs this on the end, after </body></html>
        {
            $result = str_replace('</body></html>', '', $result);
            $result .= '<p class="clb">Crunched output results:'
                . '<br>Before: ' . number_format($_SESSION['zzz_before'])
                . '<br>After: '  . number_format($_SESSION['zzz_after'])
                . '<br>Saving: ' . number_format($_SESSION['zzz_before'] - $_SESSION['zzz_after']) . ' bytes'
                . '<br>Percent: '
                . number_format(100 * ($_SESSION['zzz_before'] - $_SESSION['zzz_after']) / $_SESSION['zzz_before']) . ' %'
                . '</p></body></html>';
        }
    }
    # $result is RETURNED BY REFERENCE
}
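For a site built from plain .php includes (the original question), the same whitespace-stripping idea can be bolted on without a framework by registering it as an output-buffer callback — a sketch along the same lines (function name is my own):

```php
<?php
// Register before any HTML is output, e.g. at the top of a common include.
function minify_html($html)
{
    $search  = array(
        '/\>[^\S ]+/s', // whitespace after tags, except space
        '/[^\S ]+\</s', // whitespace before tags, except space
        '/(\s)+/s'      // runs of whitespace
    );
    $replace = array('>', '<', '\1');

    return preg_replace($search, $replace, $html);
}

ob_start('minify_html'); // everything echoed after this is minified on flush
?>
```

Bear in mind this naive regex approach also collapses whitespace inside pre and textarea elements and inline scripts, so treat it as a sketch rather than a drop-in solution.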