Sunday, January 27, 2008

Books that make you dumb? I don't think so.

Time for another statistics lesson. I'm not the world's greatest statistics whiz (not like the super-geek on TV's "NUMB3RS" show), but that's part of my point: you don't have to be a super-geek to detect major mistakes in statistics that come your way.

This isn't a rant - it's just an interesting example of what to be careful about, with a little entertainment along the way.

I just got an email with something that, on the face of it, is fascinating:
Someone matched up the most popular books in Facebook college groups with average SAT scores at colleges to see what people commonly read at different intelligence levels. http://booksthatmakeyoudumb.virgil.gr/
Nice graphics, and a decent explanation of his method. At first glance, pretty interesting.

But there's this thing about statistics: you've got to be careful about (at least) three things:
  1. When you see a pattern, are you really seeing a pattern you can count on, or is it just a momentary coincidence? (If the first two people to walk into your office are men, does that mean only men will walk in today?)

  2. Even when you do see a pretty reliable pattern, can you be reasonably sure it means what you think it means? (A relationship between the behaviors of two variables is called a correlation, but that doesn't mean you can say one caused the other. A famous example: for some years there was a correlation between wolverine population and the number of sunspots. Did either cause the other? Not likely, and besides, who could tell? The lesson: similar behavior of two quantities could just be a coincidence.) There's a quick sketch of this right after the list.

  3. Finally, you've got to be really careful about whom you actually measured. (If you interview people who are hanging out in skid row bars at 2 a.m., you may reach some interesting conclusions about the opinions of people in skid row bars at 2 a.m., but you can't say they're conclusions about people in general.)
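
To make #1 and #2 concrete, here's a little Python sketch. It's not from his site, just a made-up illustration: it builds two random walks that, by construction, have nothing to do with each other (call them "wolverines" and "sunspots"), then measures how correlated they look. Run it and you'll see impressively "strong" correlations appear purely by chance.

    import random

    def random_walk(steps):
        """A simple random walk: each value drifts up or down purely by chance."""
        value, walk = 0.0, []
        for _ in range(steps):
            value += random.uniform(-1, 1)
            walk.append(value)
        return walk

    def correlation(xs, ys):
        """Plain Pearson correlation coefficient, computed by hand."""
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        var_x = sum((x - mean_x) ** 2 for x in xs)
        var_y = sum((y - mean_y) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    # Ten trials of two series that, by construction, are unrelated.
    for trial in range(10):
        wolverines = random_walk(30)   # stand-in for "wolverine population"
        sunspots = random_walk(30)     # stand-in for "sunspot counts"
        print("Trial %2d: correlation = %+.2f"
              % (trial + 1, correlation(wolverines, sunspots)))

With only 30 points apiece, you'll often see correlations of 0.5 or more in either direction. That's #1 (a pattern that isn't really there) feeding straight into #2 (a correlation that means nothing).
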
Returning to the email: this guy saw patterns in which books were favorites at colleges with different average SAT scores. To his credit on #1, he excluded colleges with very little data. But he blew it on #2 when he titled the page "books that make you dumb," revealing a pretty massive fixation on one aspect of the whole picture and flying in the face of his own assurance that "I know correlation doesn't equal causation."

And on #3, he doesn't even mention the gross sampling error of making an assertion about a book based only on the Facebook members who read it AND who bother to list their favorites. Example 1: the Jesuit scholars at Boston College are highly intellectual, and I imagine that if they ranked their favorite books, the Holy Bible would rank high; but I doubt the Jesuits are ranking books on Facebook, and the Bible ranks among the lowest on this guy's charts.

Example 2: if some book actually made many people so brilliant they ditched Facebook, those people would disappear from this ranking entirely, and all that would remain would be the people who completely didn't get it. And that book would show up as "making people dumb."
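
To see how that could play out with numbers, here's another made-up Python sketch (hypothetical scores and probabilities, not anyone's real data). In this toy model the book genuinely adds 100 points, but the readers it helps most tend to quit Facebook. Average only the readers who stayed, and they come out below the people who never read the book at all:

    import random

    random.seed(42)  # reproducible, but every number here is invented

    def simulate(n=100000):
        """Toy model: the book adds 100 points, but the biggest gainers leave Facebook."""
        readers_still_on_facebook = []
        non_readers = []
        for _ in range(n):
            baseline = random.gauss(1000, 150)        # pretend SAT-ish score
            if random.random() < 0.5:                 # half the people read the book
                score = baseline + 100                # the book really does help
                quit_prob = 0.9 if score > 1050 else 0.1
                if random.random() > quit_prob:       # only the stayers get counted
                    readers_still_on_facebook.append(score)
            else:
                non_readers.append(baseline)
        return readers_still_on_facebook, non_readers

    def average(xs):
        return sum(xs) / len(xs)

    readers, others = simulate()
    print("Avg score, readers still on Facebook:", round(average(readers)))
    print("Avg score, people who never read it: ", round(average(others)))

The book helps everyone who reads it, yet a ranking built from the leftover Facebook members makes it look harmful. That's exactly the kind of thing #3 is about.
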

Besides, there's the whole issue of whether SAT scores are any indication of smartness at all, not to mention which type of smartness (see Gardner's Multiple Intelligences).

He would have been better off titling it BooksThatLowAndHighSATSchoolFacebookMembersLove.

This isn't just an academic issue - these errors can lead us to drive off a cliff. When we think we see something that isn't really there, then with the best of intentions we can make serious mistakes in our conclusions, our policy decisions, and our life choices.
