Danehy

Tom Horne needs to learn: Math is invaluable to everyone

If you take a random set of numbers--for example, the populations of all of the 3,000-plus counties in the United States--your built-in logic tells you that approximately 11 percent (one out of nine) will start with the digit 1; 11 percent will start with 2, and so on.

Unfortunately for built-in logic, in that county data, the digit 1 appears in the first position a puzzling 32 percent of the time. The digit 2 appears about 17 percent of the time, while 9 shows up less than 5 percent of the time.

The same thing happens if you look up death tolls from major earthquakes or tables of logarithms. Still, when told of the odd distribution of percentages, one's logic will kick in again and think, "Well, obviously, if you count from one to 10, two of the 10 numbers start with 1. And if you go to 19, then 11 of the 19 numbers--more than half--start with 1." Yes, but if you keep going to 99, it's still only 11 numbers that start with 1, and that's one out of nine, so we're back where we started. (The hint buried in there: the same swing repeats at every scale--stop at 199, and the 1s are a majority again--and numbers that sprawl across many scales end up favoring the low digits.)

This is known as Benford's law, and nobody is quite sure why it works. It was first discovered by a guy named Simon Newcomb in 1881, when he noticed that the pages of logarithm books where the numbers started with 1 or 2 were dirtier than any of the other pages, meaning they were handled more. Newcomb did some digging, found that the phenomenon happens all through nature, and came up with a formula assigning a probability to each of the nine digits. (Nothing, or everything, starts with a zero.)
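For the record, that formula says the chance of a number's leading digit being d comes out to:

P(d) = log10(1 + 1/d)

Plug in d = 1 and you get about 30.1 percent; d = 2 gives 17.6 percent; d = 9, a mere 4.6 percent.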

For some reason, no one took notice of Newcomb's work, and it faded into history. Sixty years later, physicist Frank Benford stumbled across the same phenomenon (apparently independently), and his name stuck with the law. It's like what Jimmy Durante used to say when people would bring up the Vikings in any discussion of who "discovered" America: "When Columbus discovered America, it stayed discovered!"

Benford discovered numerous situations where the phenomenon popped up, and it was almost always in the same distribution: 30.1 percent for 1, 17.6 percent for 2, 12.5 percent for 3, on down to 4.6 percent for 9. As a matter of fact, if you take the first 2,000 Fibonacci numbers (the numbers generated by adding the two previous ones, as in 1, 1, 2, 3, 5, 8, 13...), 30 percent start with the digit 1; 17.6 percent start with 2, and right on down the line.
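You don't have to take anybody's word for that one. Here's a quick sketch in Python--nothing official, just a way to run the check yourself:

from collections import Counter

# Generate the first 2,000 Fibonacci numbers. Python's
# arbitrary-precision integers handle the roughly 400-digit
# values without complaint.
fibs = []
a, b = 1, 1
for _ in range(2000):
    fibs.append(a)
    a, b = b, a + b

# Tally the leading (leftmost) digit of each number.
counts = Counter(str(f)[0] for f in fibs)

# Show each digit's share of the sample.
for d in "123456789":
    print(d, round(100 * counts[d] / len(fibs), 1), "percent")

Run it, and the 1s land right around 30 percent, the 2s around 17.6--just as advertised.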

The foremost expert on Benford's law is a guy named Ted Hill. He proved, among other things, that while plenty of individual data sets don't obey the law, if you take random samples from a bunch of randomly chosen distributions and pool them, the combined numbers tend to conform to it. He started working on it as a lark and almost quit when friends warned him that it could be totally addictive.

I know the feeling. I was first introduced to Benford's law by my high school calculus teacher, a man who had a doctorate in math and taught at a ghetto high school out of a sense of responsibility to his community. Benford's law has been bugging me, on and off, ever since. I have this driving need to understand it and to explain why it happens, even though I'm pretty certain that I never will.

I thought about this the other day when I read that Arizona Superintendent of Public Instruction Tom Horne is arguing against having our state's high school students take four years of math. His reason: a study claiming a correlation between math requirements and the dropout rate. Horne cites an article, first published in Education Week, which stated: "Researchers from the United Negro College Fund asked high school dropouts why they dropped out. With surprising consistency, a majority of participants said math."

Horne claimed that several national studies show that increasing the math requirement significantly raises the dropout rate. That may or may not be true. But even if it is, does it mean that if there were no math requirements, we would have no dropouts?

I certainly don't expect every kid to be an "Oh, wow!" geek like myself or a Mathlete like in Mean Girls, but neither do I want to nurse them through school by doling out the math with an eyedropper. Math is everywhere. Math is important. We shouldn't even be debating whether high school kids should be taking math every year.

One thing we all share is that we all hit a math wall sooner or later. Some hit it in algebra, some in geometry. I hit it in multi-variable differential equations, which sometimes look like somebody threw up in Sanskrit. (Even the Fields Medal winner in Good Will Hunting didn't understand some of Matt Damon's stuff.)

That doesn't mean each kid should take math until he/she hits a wall and then go do something else. There are all kinds of math that kids can take--statistics, business math, accounting. Taking as much as possible will help guarantee that they'll be able to function in an increasingly technological world and keep them from sinking into a knucklehead underclass.

Math is your friend and your kids' friend. I'm just not so sure that Tom Horne is.