Little surprised there's no thread or main site article on it, since it's being pimped at places like Ars Technica... wondering what other developers think of it.
My own take is that this is an attempt to waste time throwing good code after bad. It's being touted as a faster PHP engine that runs in a VM, but the question becomes: faster than what? Certainly NOT faster than normal PHP, if you read the article.
To help make debugging easier, Facebook's engineers developed their own PHP interpreter, HPHPi, that closely matches how PHP code will behave when converted and compiled.
So their coders have such convoluted messes they need extra tools to help debug... OK, that's fine in a development environment; it's certainly not good for deployment.
Former Facebook software engineer Evan Priestly said in a post on Quora that HPHPi is "roughly twice as slow as PHP."
Ok, that's REALLY not good for deployment... so this new engine, how well does it perform?
According to Facebook software engineer Jason Evans, the performance of the HHVM interpreter is already 60 percent faster than that of the HPHPi interpreter.
Now, I'm no mathematical genius, but 1/2 × 1.6 = 0.8... meaning this new 'faster' engine runs at 80% the speed of the normal PHP one... This is an improvement?
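That back-of-the-envelope math can be sketched out explicitly (a minimal check in Python, taking the quoted figures at face value — "roughly twice as slow" for HPHPi, "60 percent faster" for HHVM):

```python
# Back-of-the-envelope check of the quoted performance figures.
php_speed = 1.0                  # baseline: stock PHP interpreter
hphpi_speed = php_speed / 2      # Priestly: HPHPi is "roughly twice as slow as PHP"
hhvm_speed = hphpi_speed * 1.6   # Evans: HHVM is 60% faster than HPHPi

# HHVM ends up at ~80% of stock PHP's speed by these numbers.
print(hhvm_speed)
```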
Most importantly, it's suited to THEIR development environment and debugging methods, meaning it's pretty much useless outside that context; but people keep talking about it on sites like Slashdot and OSNews as if it's faster than the regular PHP engine and meant to be used in production environments -- when it's obviously neither...
Of course, if Facebook were REALLY concerned about speed and debugging, they'd just cut the blasted code down and trim out the ridiculous page-bloat... as evidenced by the 50k of code for every k of content delivered client-side; it makes me dread to think what train wreck they have going on server-side. That pretty much makes me think this entire subject is them blaming the tools instead of the developers.
Really, if they want speed optimizations and lowered hosting overhead, they should look at the code they are running LONG before dicking around making their own PHP bytecode interpreter. (I'm pretty much refusing to call them VMs anymore -- same for Java. Let's cut the BS and call them what they are.)
You are right, the code seems to be slowing their service; they should review it.
I've heard about this, but nothing like the info you just gave. Thanks for this. It definitely opened my eyes to what's really happening. It's rather interesting they'd try something like this...
To be honest that sort of thing is beyond my ken.
There have been plenty of articles about it recently; don't forget that some of the speed comparisons that have been done are between HipHop itself, HHVM, and HPHPi -- the various implementations and systems Facebook is developing.
Personally, I can see the benefit of having PHP converted, but what is wrong with a bytecode compiler, and working there to gain speed? I'm not sure why we need another system that converts PHP to something else -- why not just write in that something else to start with?
It's one of those systems that might work well for Facebook, but I can't see it working well for every use. HipHop, for example, is only useful for single-task servers; a virtual hosted environment / shared server would be useless to run it on.
Like any large scale system, there are probably many other factors at work here that make it an adequate solution for Facebook's needs. However, I agree that there are more viable alternatives outside that context for developers of unrelated platforms.

I work with systems every day that make me cringe, but they existed way before anything else. Probably, similar to Facebook, it would be impractical to rebuild everything considering the many, many dependencies that exist. So instead of doing that, when we have issues we bolt on things to make things "better". I am going to assume that is essentially what Facebook is doing instead of completely rebuilding their system.

If you don't actually work at a company like that, you shouldn't be judging their developers' competence based on this single thing, because there are likely several factors involved, not only technical but political as well. Factors that would be impossible to understand unless you work the systems themselves.
Thing is, that's for the old translation to C++, which they're kind-of abandoning because a number of things aren't viable in it and/or the overhead actually made it WORSE than flat PHP.
But again, PHP should be glue for system functions and pre-built server calls like SQL; NOT for doing actual processing-level stuff... so the 'advantage' of turning it into C++ and compiling it might look good on a synthetic computation benchmark, and then be barely any faster, if not possibly slower, on real world applications. That's part of why PHP has the massive function library in the first place -- since system libraries always beat the tar out of userland code in an interpreted language... (which is why I chuckle at the people who will go through ten to twenty lines of PHP to avoid one regex... because of course that's going to be SO much faster... Regex is slow, but it's not THAT slow!)
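That library-beats-userland point can be illustrated with a rough sketch (in Python rather than PHP, but the principle is the same: the regex engine runs as compiled native library code, while the hand-rolled loop executes line by line in the interpreter; the string and task here are made up for illustration):

```python
import re

# One regex call versus a hand-rolled loop for the same job:
# pulling the runs of digits out of a string.
s = "order #4521, qty 3"

# Library version: one call into the compiled regex engine.
digits_re = re.findall(r"\d+", s)

# Userland version: a dozen lines of interpreted code doing the same thing.
digits_manual, current = [], ""
for ch in s:
    if ch.isdigit():
        current += ch
    elif current:
        digits_manual.append(current)
        current = ""
if current:
    digits_manual.append(current)

# Both produce ["4521", "3"]; the one-liner leans on native library code.
assert digits_re == digits_manual
```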
Or maybe it's just people getting sick of trying to maintain binaries on deployment. See the general move away from binaries at the application level across the board, with bytecode interpreters (Silverlight/Mono/Java) giving way to old-school scripting tied to browser-style engines (see "Metro" and Win8), which are seeing an upswing in vendor support.