Lies, damned lies and statistics

The city of Toronto has just unveiled a website where parents can see the ratings of licensed, publicly subsidised child care centres and agencies.  A thoroughly excellent idea, although it would be better still if it were not limited to centres that hold a subsidy service contract.  Sadly, the entire exercise has been sabotaged by the rating system Toronto bureaucrats have chosen.

Unfortunately the city has applied a heavily inflated curve to the 4-point rating system: the minimum passing grade is 3, and the lowest possible grade is 1.  In a brief look through the system, I could not identify a single facility with a 1 or 2 rating.  According to the Toronto Star, “96 per cent of centres received an average score of at least 3 last year.”  Drill down into the specifics and the citywide average for all facilities is about 3.6.  To find the really outstanding facilities, you’re looking for scores of 3.7 to 4.0.

Well, that’s wonderful.  But I’m not interested in minimum code compliance; I’m interested in knowing, at a glance, a facility’s level of excellence.  Hearken back to college or university days: the theoretical "average" GPA on a 4-point system is 2.0, but in practice it tends toward 3.0, because instructors so rarely hand out low grades to students who richly deserve them.  Most of academia considers 2-point-something (with no failed courses) to be the minimum non-probationary "passing grade".

In terms of child care ratings, you might think a 1- or 2-point facility would be closed, but you’d be wrong.  Child care spaces are at such a premium that keeping every available facility open is critical.

[Toronto’s general manager of children’s services, Brenda] Patterson said the city would be reluctant to close a centre immediately because of the chaos it would create for parents. Instead, the city works with the centre to improve it and conducts more inspections.

— Vanessa Lu, "City daycare ratings go public".  Toronto Star, January 23, 2008.

In other words, the crappy ratings are somewhat pointless, because demand far outstrips supply.  To transpose the analogy elsewhere: even the rat-infested restaurants must keep serving food, because there aren’t enough restaurants to go around.

So what we’ve got is a 4-point system in which the majority of the scale is devoted to degrees of failure, while only 4% of the rated facilities actually received a failing grade.  The parent looking for above-average and exceptional facilities must parse the tiny range of 3.7 to 4.0 (if indeed any facility has received top marks in all areas).
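To make the compression concrete, here is a minimal sketch in Python.  The 3.0 passing floor, the roughly 3.6 citywide average, and the 3.7-to-4.0 "outstanding" band come from the article; the facility names and individual scores are invented for illustration.  It simply stretches the de facto 3-to-4 band back out onto a 0-to-100 scale:

```python
def rescale(score, floor=3.0, ceiling=4.0):
    """Map a published score from the de facto band [floor, ceiling] onto 0-100.

    The floor is the city's minimum passing grade (3 on the 4-point scale);
    the rare scores below it are clamped together at 0.
    """
    frac = (score - floor) / (ceiling - floor)
    return round(100 * max(0.0, min(1.0, frac)))

# Invented facility scores, distributed the way the article describes:
# nearly everything lands between 3.0 and 4.0.
for name, score in [("Facility A", 3.6), ("Facility B", 3.7),
                    ("Facility C", 3.95), ("Facility D", 2.8)]:
    print(f"{name}: published {score} -> {rescale(score)}/100")
```

On this view the citywide average of 3.6 is a thoroughly middling 60 out of 100, and the hunt through 3.7-to-4.0 becomes a far more legible 70-to-100.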

This is not a rating system devoted to measuring excellence; it is a feel-good exercise in mediocrity.

2 Responses
  1. DCardno says:

    It reminds me of wine ratings; you literally never see a wine rated less than about 81 on what is supposedly a 100-point scale – and let me tell you, it’s not because there are only varying degrees of excellence on the shelf. In fact, wine is rated on roughly a 15-20 point scale (depending on the reviewer), to which they add 77-82 points, such that their ‘perfect’ wine would score 97. The last three points are held in reserve, so the wine expert / reviewer can sniff disdainfully at someone who enthuses about an excellent $150 bottle and say, “I only gave it a 97.” [A sketch after these comments works through this arithmetic. –ed.]
    In both systems, there is much intended to benefit the producer, some side attraction for the reviewer / monitor, and no intention to provide information to the ultimate consumer, unless by mistake.

  2. Chris Taylor says:

    You may just have destroyed whatever small joy I get from perusing the Vintages section at the LCBO.
    On the other hand, that is pretty damn useful information.
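[Ed. – DCardno’s wine-score arithmetic above is easy to sketch.  In the Python below, the 15-to-20-point raw scale and the 97-point "perfect" come straight from the comment; the 20-point reviewer and the sample raw marks are invented for illustration.]

```python
def published_score(raw, raw_max, perfect=97):
    """Translate a reviewer's raw mark into the published "100-point" score.

    Per the comment: the raw scale is really only raw_max points wide
    (about 15-20, depending on the reviewer), and a fixed offset is added
    so a perfect raw mark lands at 97, the last three points being held
    in reserve.
    """
    offset = perfect - raw_max  # e.g. 77 for a 20-point raw scale
    return raw + offset

# Invented raw marks on a hypothetical 20-point reviewer scale:
for raw in (4, 10, 15, 20):
    print(f"raw {raw}/20 -> published {published_score(raw, 20)}")
# raw 4/20 -> 81, raw 10/20 -> 87, raw 15/20 -> 92, raw 20/20 -> 97.
# Even a half-marks wine publishes in the high 80s, which is why nothing
# on the shelf ever seems to score below the low 80s.
```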