Paul M. Jones

Don't listen to the crowd, they say "jump."

Solar 1.0.0 Stable Released

Yesterday, I announced the release of the 1.0.0 stable version of the Solar Framework for PHP on our mailing list. (I tagged the release four days ago on Monday, but wanted to time the announcement to go along with my Solar presentation at ConFoo.)

You can see the change notes here. The highlights are:

  • Added automatic cross-site request forgery (CSRF) protections in various layers of the system.
  • Added support for named actions (aka "named routes") in the front-controller rewrite logic; this is the "bi-directional" routing that some have asked for.
  • Optimized queries for Model::countPages() and the native-by select strategy, so that unnecessary joins against related models are not used when counting the number of pages for the native model results.

The next major steps are to revise and extend the narrative documentation, and of course fix bugs and add features as needed.

Slashdot appears to have gotten to the mailing list announcement before I blogged the release. (The commenters there show the usual range of insight, depth, wisdom, and experience. ;-) The Solar site itself, deployed on a 512M SliceHost VPS instance, appears to be handling the load. However, my WordPress blog on a separate 512M instance is getting ... a bit ... ... slow. Guess it's time to add wp-super-cache.

This stable release is the culmination of about five years of development effort, with important contributions from several others in the PHP community. My many thanks to everyone who helped make this release, and all the previous releases, better than I could have made them on my own.

(Cross-posted from the Solar blog.)


Climategate Stunner: NASA Heads Knew NASA Data Was Poor, Then Used Data from CRU

Following Climategate, when it became known that raw temperature data for CRU’s “HADCRU3” climate dataset had been destroyed, Phil Jones, CRU’s former director, said the data loss was not important -- because there were other independent climate datasets available.

But the emails reveal that at least three of the four datasets were not independent, that NASA GISS was not considered to be accurate, and that these quality issues were known to both top climate scientists and to the mainstream press.

via Pajamas Media » Climategate Stunner: NASA Heads Knew NASA Data Was Poor, Then Used Data from CRU.


The Indiana Health Care Plan: HSAs

In Indiana's HSA, the state deposits $2,750 per year into an account controlled by the employee, out of which he pays all his health bills. Indiana covers the premium for the plan. The intent is that participants will become more cost-conscious and careful about overpayment or overutilization.

Unused funds in the account--to date some $30 million or about $2,000 per employee and growing fast--are the worker's permanent property. For the very small number of employees (about 6% last year) who use their entire account balance, the state shares further health costs up to an out-of-pocket maximum of $8,000, after which the employee is completely protected.

...

Most important, we are seeing significant changes in behavior, and consequently lower total costs. In 2009, for example, state workers with the HSA visited emergency rooms and physicians 67% less frequently than co-workers with traditional health care. They were much more likely to use generic drugs than those enrolled in the conventional plan, resulting in an average lower cost per prescription of $18. They were admitted to hospitals less than half as frequently as their colleagues. Differences in health status between the groups account for part of this disparity, but consumer decision-making is, we've found, also a major factor.

...

It turns out that, when someone is spending his own money alone for routine expenses, he is far more likely to ask the questions he would ask if purchasing any other good or service: "Is there a generic version of that drug?" "Didn't I take that same test just recently?" "Where can I get the colonoscopy at the best price?"

...

Americans can make sound, thrifty decisions about their own health. If national policy trusted and encouraged them to do so, our skyrocketing health-care costs would decelerate.

via Mitch Daniels: Hoosiers and Health Savings Accounts - WSJ.com.



Solar 1.0.0beta5 Released

This past Friday, I released version 1.0.0beta5 of the Solar Framework for PHP. You can read the change notes here.

Overall, most of the work was related to the form helpers and making them even more flexible than they were previously. We've also added a new manual chapter on working with models and forms.

It is super-easy to build forms out of model records in Solar. In the controller, once you have a record object, call its newForm() method to get a Solar_Form object. In the view, pass that form object to the form view helper and add a submit-process button:

echo $this->form()
          ->auto($this->form_object)
          ->addProcess('save')
          ->fetch();

Those four lines of code will build a complete form for you based on the model record, including top-level feedback and individual element invalidation messages.

The form helper is smart enough to recognize the column types and validation filters on the model record, and will use the appropriate input types accordingly. For example, booleans get checkboxes, date fields get a series of month/day/year options, and columns using validateInList or validateInKeys become selects.

You can also further customize the form presentation using the fieldset and grouping methods on the form helper. Alternatively, you can use the individual form element helpers to build forms by hand.

These features have been present in Solar for years.

Finally, and I'm not making promises, but I think this is the last or next-to-last beta release. I have some tickets about query optimization from the models that I want to complete. Once those are done, I expect to make Solar's first official stable release.

(Cross-posted from the Solar blog at http://solarphp.com/blog/read/63-solar-100beta5-released.)


Global Warming Fraud: The Big Picture

...the IPCC's fundamental conclusions, relating to the allegedly unprecedented warming of the past half-century, are based on bad surface temperature data and are contradicted by more-reliable satellite data and by our knowledge of the earth's climate history. We know for a fact, in short, that the computer models that are the only basis for the AGW theory are wrong: ...

via Power Line - Global Warming Fraud: The Big Picture.


To Badly Go

...to visit DC expecting to find people engaged in serious discussions of economics is like visiting a Star Trek convention expecting to find people engaged in serious discussions of astrophysics.

via To Badly Go.



Running The Symfony 2 Benchmarks

Fabien Potencier released Symfony 2.0.0alpha1 last week, along with some benchmarks showing its performance. I am glad to see that Fabien used my benchmarking system and methodology, and am happy to see that he is paying attention to the performance of his framework. I take this as an acceptance on his part that my methodology is legitimate and valid, and that it has value when comparing framework responsiveness.

However, in attempting to reproduce the published Symfony 2 benchmarking results, I found Fabien's reporting to be inaccurate (or at least incomplete). Read on for a very, very long post detailing my attempt to replicate his results for the "hello world" basic framework overhead comparison, and my conclusions.

For the impatient, here are my conclusions in advance:

  1. Fabien's benchmark report, as shown at http://symfony-reloaded.org/fast, is inaccurate for the setup he describes. Lithium and Flow3 do not work in Fabien's benchmark codebase at GitHub. Also, Symfony 2 is faster than Solar beta 3 by 5%, not 20%, on a "c1.xlarge" instance; to get a relative difference like Fabien describes, one has to use an "m1.large" instance. (It is entirely possible that the process Fabien used for benchmarking is incompletely described, and that the codebase is not fully updated, thus contributing to this disparity in results.)

  2. We should use Siege 2.69, not 2.66, for more accurate benchmarking of baseline responsiveness. If we notice that HTML is slower than PHP, it's a sign that something is wrong.

  3. Symfony 2 preloads its foundation and application classes, something no other framework does in the benchmarked code. When we treat Solar and Symfony 2 the same way, by preloading the foundation classes for each, we find that Solar is roughly 28% faster than Symfony 2.

Overview

Full disclosure: I am the architect of the Solar Framework for PHP discussed herein. I have been doing these benchmarks for years; see the "benchmarks" tag on this blog.

The primary point of this post is to show that benchmarking is tedious, time-consuming, difficult, and requires a lot of attention to details. I spent two full days doing all the following work, not including the time spent writing this post. It's easy to get things wrong in the benchmarking itself, and it's easy to get things wrong when reporting the results. Transparency, honest dealing, and a commitment to intellectual integrity (what Feynman called "a principle of scientific thought that corresponds to a kind of utter honesty--a kind of leaning over backwards") -- these things are key.

The secondary point of this post is to show that Solar is in fact more-responsive than Symfony 2 when they are treated alike, even under Fabien's test conditions.

Note that this benchmarking series uses the codebase for Fabien's Symfony 2 benchmarks; this is not part of the official web-framework-benchmarks series, as the tested conditions in Fabien's code are somewhat different.

These are the major portions of this post:

  1. We run Fabien's benchmarks using Siege 2.66 on an Amazon ec2 "c1.xlarge" instance using his instructions and codebase. Since Fabien left out the static HTML baseline target, we will add it ourselves for comparison. We will find that his initial report is inaccurate; two frameworks are non-responsive, and the difference between Solar and Symfony is much less than reported.

  2. We will attempt to run the same series using Siege 2.69. We will find that it fails because of socket unavailability.

  3. We backtrack a bit and run the benchmarks on a "m1.large" instance, using Siege 2.66 to ascertain the original scenario.

  4. We run the same series using Siege 2.69, and find the same relative performance ranking as with 2.66, but with lower percent-of-PHP numbers, because Siege 2.69 reports higher (and more believable) baseline numbers.

  5. Finally, we show that Symfony 2 uses a preloaded classes file. When we do the same for Solar and re-run the benchmarks, we find that Solar is more responsive than Symfony 2 by roughly 28%.

Fabien's initial benchmark report is at http://symfony-reloaded.org/fast.

The code he benchmarked against is at http://github.com/fabpot/framework-benchs.

His instructions for reproducing his results are at http://github.com/fabpot/framework-benchs/blob/master/replicating.markdown.

For reference, here are the numbers Fabien initially reported (alphabetized by framework):

framework            |      rel |      avg |        1 |        2 |        3 |        4 |        5
-------------------- | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-php         |   1.0000 |  5465.30 |  4602.06 |  5509.34 |  5694.15 |  6232.73 |  5288.23
cakephp-1.2.6        |   0.0513 |   280.43 |   255.91 |   279.50 |   291.80 |   291.13 |   283.83
flow3-1.0.0alpha7    |   0.0048 |    26.29 |    23.87 |    26.97 |    26.67 |    26.93 |    27.02
lithium-0.6          |   0.2128 |  1163.27 |  1059.44 |  1179.42 |  1180.52 |  1197.73 |  1199.25
solar-1.0.0beta3     |   0.2825 |  1544.14 |  1293.81 |  1596.28 |  1601.55 |  1613.20 |  1615.86
symfony-1.4.2        |   0.1737 |   949.59 |   916.84 |   944.49 |   953.88 |   967.52 |   965.24
symfony-2.0.0alpha1  |   0.3312 |  1810.07 |  1693.15 |  1846.41 |  1827.51 |  1856.98 |  1826.30
yii-1.1.1            |   0.1901 |  1038.77 |  1033.20 |  1037.60 |  1038.47 |  1041.57 |  1043.01
zend-1.10            |   0.0906 |   494.90 |   320.74 |   519.74 |   537.15 |   546.11 |   550.76

Finally, for those who wish to follow along, the scripts for each of the major sections of this post are available here: http://paul-m-jones.com/public/fabiens-benches.sh


Part 1

We set up a "c1.xlarge" instance per the instructions from Fabien, and run siege.php against the targets file.

Flow3 had an exception:

<h1>500 Internal Server Error</h1>
<p>FLOW3 experienced an internal error (uncaught exception):</p>
<p>PDOException</p>

Looks like PDO has to be loaded for Flow3.

Lithium had an error too:

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>404 Not Found</title>
</head><body>
<h1>Not Found</h1>
<p>The requested URL /lithium-0.6/app/webroot/hello/Fabien was not found on this server.</p>
<hr>
<address>Apache/2.2.9 (Debian) PHP/5.3.1-0.dotdeb.1 with Suhosin-Patch Server at localhost Port 80</address>
</body></html>

This is an Apache 404 error; the target line for Lithium appears to be wrong.

Therefore, we can ignore those two frameworks in our results. The report looks like this:

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   0.9002 |  5239.91 |  4438.38 |  5458.72 |  5822.69 |  5230.76 |  5249.02
baseline-php             |   1.0000 |  5821.08 |  4950.66 |  5763.70 |  5729.34 |  5939.63 |  6722.07
cakephp-1.2.6            |   0.1022 |   594.87 |   568.08 |   597.78 |   603.13 |   603.88 |   601.48
flow3-1.0.0alpha7 *      |   0.0008 |     4.40 |     3.41 |     4.55 |     4.64 |     4.72 |     4.70
lithium-0.6 *            |   1.0119 |  5890.55 |  5436.60 |  5326.39 |  6037.85 |  5996.23 |  6655.69
solar-1.0.0beta3         |   0.2441 |  1420.88 |  1323.49 |  1404.17 |  1416.07 |  1483.37 |  1477.30
symfony-1.4.2            |   0.0876 |   509.73 |   493.31 |   506.76 |   517.65 |   514.61 |   516.33
symfony-2.0.0alpha1      |   0.2573 |  1497.54 |  1242.21 |  1433.91 |  1607.85 |  1626.23 |  1577.50
yii-1.1.1                |   0.1360 |   791.69 |   790.61 |   787.22 |   795.12 |   793.43 |   792.08
zend-1.10                |   0.0769 |   447.62 |   382.17 |   448.55 |   473.18 |   471.33 |   462.88

(* ignore)

Contrary to Fabien's report, we see that Symfony 2 is not "20% faster" than Solar. Symfony 2 at .2573, and Solar at .2441, is more like a 5% difference, with Symfony 2 in the lead.
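The percentage comes straight from the rel column; here is a quick arithmetic sketch in plain PHP (no framework code involved) showing how the two rel scores yield roughly 5%:

```php
<?php
// rel scores (throughput relative to baseline-php) from the table above
$symfony2 = 0.2573;
$solar    = 0.2441;

// "X% faster" here means the ratio of the two rel scores, minus one
$pct = (($symfony2 / $solar) - 1) * 100;
printf("Symfony 2 leads Solar by %.1f%%\n", $pct); // prints: Symfony 2 leads Solar by 5.4%
```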

However, the baseline PHP response was 10% faster than a static HTML page (where the PHP engine is not invoked at all). This indicates something is wrong with the benchmarking environment. I saw similar behavior when using ab (the Apache benchmark tool) and it looks like Siege 2.66 has the same erroneous behavior. Let's switch to the more-recent Siege version 2.69 and see if we can eliminate that.

Part 2

We remove Siege 2.66, install Siege 2.69, and attempt to run the benchmarks again.

The problem is, even with the ulimit set as high as it is, Siege 2.69 floods the server and we get tons of "error: socket: 1168148816 address is unavailable.: Cannot assign requested address" errors.

We can set 'failures' => 1048576 in siege.php (i.e., the same as the ulimit value) to try and ignore the socket errors. However, even at that high value, we still can't get through the baseline html response; socket availability is still too low.
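For reference, the change amounts to something like this in siege.php (a sketch only -- the surrounding option-array structure is my assumption about the script; the 'failures' key and value are as described above):

```php
<?php
// siege.php (sketch): raise the tolerated failure count to match
// `ulimit -n`, so that socket errors do not abort the run early.
// The surrounding option structure is an assumption about the script.
$options = array(
    'failures' => 1048576,
    // ... other options unchanged ...
);
```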

As such, we will step down to an "m1.large" instance for the remainder of the process. I know from previous experience that Siege will not exceed the socket availability on this kind of instance.

Part 3

We terminate the "c1.xlarge" instance and run an "m1.large" instance in its place. Now that we're on a new instance, we need to re-run Fabien's benchmark series using Siege 2.66 again to make sure the errors we received before are not instance-type specific. When we do, we get these results (Flow3 and Lithium show the same errors as before):

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   0.9658 |  2424.05 |  2430.61 |  2452.35 |  2357.62 |  2431.77 |  2447.89
baseline-php             |   1.0000 |  2509.98 |  2548.19 |  2517.85 |  2509.79 |  2439.35 |  2534.70
cakephp-1.2.6            |   0.0748 |   187.65 |   187.34 |   188.88 |   187.83 |   187.62 |   186.57
flow3-1.0.0alpha7 *      |   0.0004 |     1.07 |     1.12 |     1.02 |     1.04 |     1.11 |     1.06
lithium-0.6 *            |   1.0639 |  2670.45 |  2653.04 |  2714.18 |  2659.09 |  2662.56 |  2663.39
solar-1.0.0beta3         |   0.1944 |   487.88 |   486.83 |   478.97 |   493.04 |   489.55 |   491.03
symfony-1.4.2            |   0.0810 |   203.22 |   204.22 |   204.79 |   204.08 |   201.49 |   201.53
symfony-2.0.0alpha1      |   0.2330 |   584.73 |   582.53 |   578.17 |   589.57 |   588.38 |   584.98
yii-1.1.1                |   0.1463 |   367.16 |   362.83 |   373.34 |   362.67 |   377.57 |   359.41
zend-1.10                |   0.0542 |   135.99 |   135.06 |   135.28 |   137.09 |   136.67 |   135.87

(* ignore)

Now we see a difference in the Solar and Symfony 2 numbers that looks like Fabien's original reporting; Symfony 2 at 0.2330 is about 20% faster than Solar at 0.1944.

But we still see the same error condition of PHP looking like it runs faster than static HTML. Let's move away from Siege 2.66 and try Siege 2.69 on this smaller instance.

Part 4

We remove Siege 2.66, install Siege 2.69, and re-run the new Siege against the same targets on the same "m1.large" instance.

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   1.1710 |  5594.40 |  5732.63 |  5610.27 |  5769.64 |  5663.68 |  5195.80
baseline-php             |   1.0000 |  4777.65 |  4853.58 |  4729.50 |  4721.51 |  4772.37 |  4811.31
cakephp-1.2.6            |   0.0401 |   191.71 |   186.32 |   192.19 |   190.43 |   194.06 |   195.53
flow3-1.0.0alpha7 *      |   0.0000 |     0.00 |     0.00 |     0.00 |     0.00 |     0.00 |     0.00
lithium-0.6 *            |   1.0689 |  5107.01 |  5196.32 |  5158.93 |  4888.00 |  5092.78 |  5199.03
solar-1.0.0beta3         |   0.1126 |   537.85 |   541.37 |   537.44 |   537.34 |   536.59 |   536.52
symfony-1.4.2            |   0.0443 |   211.73 |   212.29 |   211.68 |   213.56 |   210.47 |   210.64
symfony-2.0.0alpha1      |   0.1370 |   654.65 |   655.15 |   654.62 |   653.44 |   651.52 |   658.52
yii-1.1.1                |   0.0819 |   391.44 |   396.89 |   389.75 |   390.60 |   396.30 |   383.68
zend-1.10                |   0.0293 |   140.19 |   139.83 |   138.85 |   141.18 |   140.87 |   140.24

(* ignore)

This looks more like what we should be seeing: HTML is now faster than PHP. The rankings and relative ratings appear similar to the Siege 2.66 run; Symfony 2 at .1370 is about 20% faster than Solar at .1126.

Part 5

I spent some time picking apart Symfony 2 to see what it might be doing that Solar could use for improvement. One reason for Symfony's performance is that (in the benchmarked code) all the Symfony 2 foundation classes are concatenated into a single "bootstrap.php" file. Similarly, Symfony 2 caches its application classes into another single file (hello/cache/prod/classes.php). From what I can tell, none of the other frameworks are doing anything like that; they are reading class files individually as needed.
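The concatenation idea itself is simple to approximate. Here is a minimal sketch in plain PHP (the function name, tag-stripping details, and paths are my own assumptions, not Symfony's or Solar's actual build code):

```php
<?php
// Sketch: concatenate class files into a single preload file, similar
// in spirit to how Symfony 2's bootstrap.php combines its foundation classes.
function build_preload(array $class_files, $preload_file)
{
    $code = "<?php\n";
    foreach ($class_files as $file) {
        $source = file_get_contents($file);
        // strip each file's opening tag so the combined file parses;
        // assumes the class files do not use a closing PHP tag
        $source = preg_replace('/^<\?php\s*/', '', $source);
        $code .= "// from: $file\n" . $source . "\n";
    }
    file_put_contents($preload_file, $code);
}

// hypothetical usage; the glob pattern is an assumption:
// build_preload(glob('Solar/*.php'), 'preload.php');
```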

Let's see if we can even out the Solar vs. Symfony 2 playing field. In this final benchmarking pass, we get the latest trunk code of Solar, compile the Solar foundation classes into a preload file just like Symfony's, and use that preload file in the Solar bootstrap. Then we'll target just Symfony 2, Solar beta 3 (non-preload), and the Solar trunk preload for comparison to each other. The results on the "m1.large" instance are:

framework                |      rel |      avg |        1 |        2 |        3 |        4 |        5
------------------------ | -------- | -------- | -------- | -------- | -------- | -------- | --------
baseline-html            |   1.1884 |  5611.92 |  5428.74 |  5700.62 |  5654.88 |  5622.77 |  5652.61
baseline-php             |   1.0000 |  4722.21 |  4646.16 |  4698.98 |  4780.01 |  4753.59 |  4732.29
solar-1.0.0beta3         |   0.1142 |   539.31 |   542.25 |   543.45 |   538.97 |   537.07 |   534.82
solar-preload            |   0.1780 |   840.38 |   841.38 |   846.63 |   838.32 |   833.13 |   842.44
symfony-2.0.0alpha1      |   0.1384 |   653.52 |   658.03 |   658.61 |   654.93 |   649.51 |   646.50

It appears that when we treat Solar and Symfony 2 the same way, by preloading the foundation classes, we find that Solar is about 28% faster than Symfony 2 (and about 55% faster than the non-preload Solar beta 3 with no code changes at all). Perhaps it would be wise for Solar to provide something like a preload.php file of its own as part of the system distribution.
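Checking those figures against the rel column above, in plain PHP:

```php
<?php
// rel scores from the final run
$preload  = 0.1780; // solar-preload
$symfony2 = 0.1384; // symfony-2.0.0alpha1
$beta3    = 0.1142; // solar-1.0.0beta3

printf("preload vs symfony2: %.1f%% faster\n", (($preload / $symfony2) - 1) * 100);
printf("preload vs beta3:    %.1f%% faster\n", (($preload / $beta3) - 1) * 100);
// prints: preload vs symfony2: 28.6% faster
//         preload vs beta3:    55.9% faster
```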

Conclusion

  1. Fabien's benchmark report, as shown at http://symfony-reloaded.org/fast, is inaccurate for the setup he describes. Lithium and Flow3 do not work in Fabien's benchmark codebase at GitHub. Also, Symfony 2 is faster than Solar beta 3 by 5%, not 20%, on a "c1.xlarge" instance; to get a relative difference like Fabien describes, one has to use an "m1.large" instance. (It is entirely possible that the process Fabien used for benchmarking is incompletely described, and that the codebase is not fully updated, thus contributing to this disparity in results.)

  2. We should use Siege 2.69, not 2.66, for more accurate benchmarking of baseline responsiveness. If we notice that HTML is slower than PHP, it's a sign that something is wrong.

  3. Symfony 2 preloads its foundation and application classes, something no other framework does in the benchmarked code. When we treat Solar and Symfony 2 the same way, by preloading the foundation classes for each, we find that Solar is roughly 28% faster than Symfony 2.


The Green Death

Who is the worst killer in the long, ugly history of war and extermination? Hitler? Stalin? Pol Pot? Not even close. A single book called Silent Spring killed far more people than all those fiends put together.

...

The motivation behind Silent Spring, the suppression of nuclear power, the global-warming scam, and other outbreaks of environmentalist lunacy is the worship of centralized power and authority. The author, Rachel Carson, didn’t set out to kill sixty million people – she was a fanatical believer in the newly formed religion of radical environmentalism, whose body count comes from callousness, rather than blood thirst. The core belief of the environmental religion is the fundamental uncleanliness of human beings. All forms of human activity are bad for the environment…

via The Greenroom » Forum Archive » The Green Death.