Welcome, my friends, to another episode of BASAL: Ben’s Amateurish Synthesis of the Academic Literature!
On this episode, we’ll examine:
- How historical mathematicians struggled to conceive of algebra
- How historical mathematicians struggled to conceive of probability
- How learning mathematics is basically a speed-run of those same historical challenges
- Who cares, and why
1.
Helena Pycior and the Weirdness of Symbols
Per a hot tip from Jim Propp (try saying that ten times fast), I’ve been enjoying the work of historian Helena Pycior, and her writing about the 19th-century turn toward “symbolical algebra.”
This math, now taught under the auspices of “Algebra 1,” isn’t so strange or unfamiliar to us (where by “us” I mean “readers of math blogs”). But Pycior reveals its utter weirdness.
It’s not just 13-year-olds who find this stuff hard to swallow. As late as the mid-1800s, some British mathematicians had to be dragged kicking and screaming into this nightmarish world where symbols float free from the things they symbolize.
Take this quote from William Rowan Hamilton, best known for dreaming up the quaternions, in a letter to a colleague:
We belong to opposite poles of algebra; since you… seem to consider Algebra as a ‘System of Signs and of their combinations,’ somewhat analogous to syllogisms expressed in letters; while I am never satisfied unless I think that I can look beyond or through the signs to the things signified.
You catch that? William Hamilton, who gave us our bizarro non-commutative generalization of the complex numbers, didn’t want algebra to be a science of symbols alone.
He wanted to know, in precise terms, what the symbols symbolized.
No one objected to a more modest form of algebra that you might call “generalized arithmetic.” A statement such as 2x+3x=5x can be read as a pithy summary of a concrete arithmetical pattern: any number’s double, plus that number’s triple, gives that number’s quintuple.
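Spelled out in modern notation (my gloss, not Pycior's), the "generalized arithmetic" reading is just the distributive law, verifiable on any concrete number you like:

```latex
2x + 3x = (2 + 3)x = 5x,
\qquad\text{e.g. } 2\cdot 7 + 3\cdot 7 = 14 + 21 = 35 = 5\cdot 7.
```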
This kind of algebra was unanimously accepted.
But stuff like multiplying two negatives to get a positive… well, that wasn’t arithmetic anymore. It was pure symbolism, with no clear thing being symbolized. It worked, in the sense that it gave consistent results, and proved useful in addressing mathematical and scientific problems… but it didn’t make actual, concrete sense.
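For what it's worth, the standard modern justification (which is not how the 19th-century skeptics saw it) treats the rule as forced by self-consistency: if you want negatives to obey the same distributive law as positives, you have no choice. Starting from $(-1)\cdot 0 = 0$:

```latex
0 = (-1)\cdot 0 = (-1)\bigl(1 + (-1)\bigr) = (-1)\cdot 1 + (-1)(-1) = -1 + (-1)(-1),
```

and adding $1$ to both sides gives $(-1)(-1) = 1$. Consistency, not concrete meaning, is doing all the work, which is exactly what bothered Hamilton.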
To embrace “a negative times a negative is a positive” is to lose your mathematical innocence. It is to give up on symbols having clear and concrete real-world meanings. It is to adopt a distinctly 19th- and 20th-century vision of mathematics as a self-consistent logical system. It is, as Hilbert put it, to accept math as “a game played according to certain simple rules with meaningless marks on paper.”
In brief: Algebra isn’t just hard for surly teens. It’s hard for anyone with an intellectual commitment to symbols having meanings.
2.
Ian Hacking and the Emergence of Probability
Someone on Twitter joked (was it a joke?) that they require their Intro Stats students to read Ian Hacking’s The Emergence of Probability. It seems a rather heavy lift, given how challenging students find it to read, say, the syllabus. Still, I was intrigued.
And the book is oddly gripping. I learned that the emergence of probability was…
- Abrupt. It happened very distinctly between 1650 and 1670.
- Widespread. Dozens of thinkers took up the same themes at the same time.
- Surprisingly late. People have been gambling for millennia. Obviously, 1660 was not the first time someone wanted to win at gambling. So why the long wait for probability?
Clearly, probability emerged because, in the mid-1600s, something was “in the air.” Antecedents like Cardano only prove the point: he laid out the beginnings of probability a century earlier, and nobody picked up the thread. What better evidence that folks weren’t ready for probability than the fact that someone texted it to them, and they left it on “unread” for years?
Now, I’m liable to mangle Hacking’s delicate argument, but in short, he resolves the mystery by analyzing forms of knowledge. By 1600, there were two basic forms of knowledge.
First (and slightly more recognizable to us) was certain knowledge: stuff that we know beyond all doubt because it had been proven, Euclid-style, via irrefutable demonstration. This included not just math, but also astronomy, optics, and the like. (Today, we tend to view this category as virtually empty, because we view empirical truths as fundamentally contingent and uncertain. But that’s because we see through post-probability eyes.)
The second (and more alien) kind of knowledge was testimony or opinion: stuff we know because an authority declared it true.
This is where the word “probable” originates. And its meaning was not what we think.
Back then, probability referred to the esteem in which we hold the testifying authority. It meant something like “worthiness of approval.” A “probable” fact came from someone respectable like Livy or Polybius; an “improbable” one from some nameless scribe. Hacking quotes phrases like “this account is highly probable, but known to be false,” which sounds paradoxical to our ears but was a perfectly sensible thing to say at the time.
Anyway, notably missing in this dichotomy: the idea of physical evidence.
Clouds as evidence of a storm. A cough as evidence of a fever. Snowy footprints as evidence of a nearby rabbit. What to make of these kinds of signs — not quite causes, because they do not guarantee an effect, but only hint at it with varying degrees of strength? Such signs were wedged into the second category of knowledge, classified as “the testimony of nature.” According to Hacking (and here he sort of loses me), this wasn’t a metaphor: people literally thought of these as testimony, accounts authored by nature.
It’s from this second kind of knowledge that probability emerged.
I’ll write more about this soon. (The book deserves a sprawling 5000-word ACX-style review.) But one takeaway is this: the way we conceive of probability and statistics is laden with a tremendous amount of baggage. Just as miles of invisible atmosphere are always weighing down on our heads, so do miles of forgotten convictions and lost theologies weigh down on our blithe claims that “the probability of heads is 50%” or “the probability of [redacted] winning the election is [redacted].”
Even more briefly: Probability isn’t just hard for surly tweens. Such math is inescapably tangled with your entire worldview.
3.
Anna Sfard and the Miracle of a How Becoming a What
The polymathic Michael Pershan, who has read everyone and everything, suggested years ago that I check out Sfard’s “On the Dual Nature of Mathematical Conceptions.” Fool that I am, it took me until 2024.
Aggressively condensed, her argument is this. We first learn mathematics as a process. Later, in a mysterious and quasi-miraculous stroke of insight, we reinterpret the process as a structure, an object in its own right.
Take the very beginnings of school math: counting. Show a preschooler 5 objects, and they’ll count them: “one, two, three, four, five.” Then, add another object, and say, “How many now?”
Most kids won’t build from five. They’ll start all over again: “one, two, three, four, five, six.” For them, “five” makes sense only as part of the process of counting. It is not yet an object in its own right. Not yet what we’d call a number.
To do arithmetic — say, adding five and five — you need to take the outcome of this counting process, and begin treating it as an object in itself, a move that Sfard calls reification.
This learning cycle doesn’t just hold for counting. The process of division (4 pizzas divided among 7 people) gives rise to the object we call “fractions” (4/7, which is the result of that division process, but also an object in its own right).
The processes of calculation (double, then add five) give rise to algebraic expressions (2x+5, which is the outcome of the calculation process, but also an object in its own right) and eventually to functions.
Functions, by the way, took centuries to pin down. The definition I’ve taught to 16-year-olds — under which a function is a set of ordered pairs (x,y), with x in the domain and y in the range, and each x appearing in precisely one ordered pair — is a baroque triumph of reification, an artifact of Bourbaki and the 20th century.
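To see how thoroughly reified that definition is, here's a small sketch (in Python, purely illustrative; nothing like this appears in Sfard's paper) that treats a function as literally a set of ordered pairs and checks the "each x appears exactly once" condition:

```python
# A "function" in the set-theoretic sense: a bare set of ordered pairs,
# constrained so that no x-value appears twice.
f = {(1, 2), (2, 4), (3, 6)}        # the doubling process, frozen into an object
not_a_function = {(1, 2), (1, 3)}   # 1 maps to two outputs: fails the definition

def is_function(pairs):
    """Check the defining condition: each x appears in exactly one pair."""
    xs = [x for (x, _) in pairs]
    return len(xs) == len(set(xs))

def apply(pairs, x):
    """Evaluate the function-as-object at an input x."""
    for (a, b) in pairs:
        if a == x:
            return b
    raise ValueError(f"{x} is not in the domain")

print(is_function(f))               # True
print(is_function(not_a_function))  # False
print(apply(f, 3))                  # 6
```

Notice that the "process" (doubling) has vanished entirely: the object is just a static table of pairs, which is the reification Sfard is describing.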
Which brings us to Sfard’s key point: reification is both a pedagogical process and a historical one. It’s like the developmental biologists used to say: ontogeny recapitulates phylogeny.
The struggles of historical mathematicians are a preview of what our students go through.
4.
Two Contradictory Thoughts on the History of Mathematics
Everyone seems to feel that math class is missing a human element. What’s the deal with these rules? Who came up with this stuff? Why are we doing any of this? In the search for meaning, it’s common to invoke history. If we could just explain who came up with this stuff, and why, then maybe it would give the subject a human face, a meaningful context.
That is Thought #1: the hope that history can rescue math from its obscurity and abstraction.
Thought #2 is that this doesn’t really work.
I speak from experience, in the same way Charlie Brown speaks from experience about kicking footballs. For example, in the early drafts of the book that would become Change is the Only Constant, I narrated a lot of the history of calculus, thinking this was very clever and engaging.
“Ummm…,” my editor Becky said. “This is a lot of history.”
This was her polite way of saying, “Why, Ben, why?!!”
People bored and alienated by math may think they want history. But history doesn’t simplify matters. It complicates them enormously. As you trace your ancestors back through the generations, your family tree grows exponentially. So too with ideas. Lineages multiply. The past may explain the present, but good luck explaining the past.
When people say they want “history,” what they really want is story. They want anecdotes. It’s not necessary that they be true, and it’s not desirable that they be historically rigorous. Genius myths go down smooth; messy contingencies, not so much.
The value of history, then, is not for learners.
History’s value is for teachers.
Math education is full of deep and wonderful and problematic ideas. They were born from the tumult of centuries, from surprising collaborations and knock-down, drag-out fights. Our notations, our concepts, our pedagogical sequence — these were not, by and large, inevitable. Look back at the path behind us, and you will find defeated rivals, failed competitors, forgotten alternatives. Like everything else in the world today, the subject we call “mathematics” bears the marks and scars of the millennia that produced it. To know what it is — and to know where it’s going in the decades to come — you’ll need a rich knowledge of where it came from.
History cannot rescue math from obscurity. History cannot convince students to embrace the idea that two negatives make a positive. History cannot turn rigid, dogmatic thinkers into nimble, probabilistic ones.
But teachers who know the history, alongside knowing the math, and knowing the students — well, maybe they’ve got a shot.