Profiling my scripts, I find the biggest drain on performance is the autoloader, specifically the require/include line.
A current site includes 38 files.
I did a quick test using include/require/require_once/include_once. Profiling just the autoload function and changing only the include/require line, here's what I got as a very basic test.
All very close. Nothing really in it. However, 38 calls to include/require are using 70% of my script's CPU time so it's a hefty chunk of resources and easily the most resource intensive function I have.
Now that still seemed a lot to me, but perhaps it's just a file I/O bottleneck. As a basic test, I tried:
eval('?>' . file_get_contents($file));
The result: 24.29ms, roughly five times faster. This is rather astonishing. Obviously eval is Bad™, but this suggests that include/require are doing a lot more than simply loading the file. By using eval(file_get_contents()) I've improved my entire script's performance by almost 100%.
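For anyone wanting to reproduce the comparison, here is a minimal micro-benchmark sketch. The file names and class stubs are made up for illustration (the question doesn't show the real files); each strategy loads a uniquely named class, since loading the same class twice would be a fatal redeclaration error.

```php
<?php
// Micro-benchmark sketch: one plain require vs. the eval trick.
// class_stub.php is a hypothetical file containing only a class
// definition, matching the edit note in the question.

$file = __DIR__ . '/class_stub.php';

file_put_contents($file, "<?php class StubA {}\n");
$t0 = microtime(true);
require $file;                          // the normal include path
$tInclude = microtime(true) - $t0;

file_put_contents($file, "<?php class StubB {}\n");
$t0 = microtime(true);
eval('?>' . file_get_contents($file));  // the eval trick from above
$tEval = microtime(true) - $t0;

printf("require: %.4f ms, eval: %.4f ms\n", $tInclude * 1000, $tEval * 1000);
unlink($file);
```

A single load is too noisy to trust on its own; run it many times (or over many distinct stub files) before reading anything into the numbers.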
Any idea why include should be so comparatively slow?
edit: the files being included are doing no real processing on include, they are all just class definitions.
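For context, this is roughly the shape of autoloader being profiled; the class-name-to-path mapping is an assumption, since the question doesn't show the actual function.

```php
<?php
// Sketch of a typical spl_autoload_register() autoloader. The
// classes/ directory and the naming convention are assumptions.
spl_autoload_register(function ($class) {
    $path = __DIR__ . '/classes/' . str_replace('\\', '/', $class) . '.php';
    if (is_file($path)) {
        require $path;  // the line that dominates the profile
    }
});

// Demo: create a stub class file, then let the autoloader find it.
@mkdir(__DIR__ . '/classes');
file_put_contents(__DIR__ . '/classes/Demo.php', "<?php class Demo {}\n");
var_dump(class_exists('Demo'));  // bool(true): autoloader required it
```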
What version are you running there, Tom?
5.3. However, I'm now wondering whether the profiler itself has something to do with it. I'm using Zend Studio 9 to profile the code. I wonder if the profiler has to modify the include function in some manner.
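One way to test the profiler-overhead theory is to time the includes with plain microtime() on an unprofiled CLI run and compare against the profiler's figures. A sketch, with a hypothetical classes/ directory standing in for the real 38 files:

```php
<?php
// Sanity check outside Zend Studio: time raw require_once calls
// with microtime(). Point $files at your real class files; the
// stub created here is just so the script runs standalone.
@mkdir(__DIR__ . '/classes');
file_put_contents(__DIR__ . '/classes/Stub.php', "<?php class Stub {}\n");

$files = glob(__DIR__ . '/classes/*.php');
$t0 = microtime(true);
foreach ($files as $f) {
    require_once $f;
}
printf("%d includes in %.2f ms\n", count($files), (microtime(true) - $t0) * 1000);
```

If the unprofiled numbers are far lower than what Zend Studio reports, the profiler's instrumentation of include/require is the likely culprit.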
Have you tried using Pingdom.com to check the speed of your site? I find that PHP optimisation makes very little difference and the biggest time gobbler is waiting for the server to respond.