A comment on my post last week about the need for factoring led me to revisit a question I have posed to classes before, but never allowed to move beyond the “gee, that’s interesting” stage.
Given a quadratic polynomial in standard form, ax² + bx + c, with random non-zero* integer coefficients a, b and c, what is the probability that the polynomial will factor over the integers?
I’ve pursued this question with classes before by writing a quadratic on the board, with blanks or boxes in the a-b-c positions. Sometimes I would take “random” shout-outs from the class to fill in the boxes. With another class, we used the randint function on a TI calculator to generate our a’s, b’s, and c’s. The point was to demonstrate that a large majority of quadratics are not factorable, and that despite the nice, rigged problems we encounter in textbooks, we should spend far more time considering what to do with the messy ones. But I’d never put pencil to paper and worked out the theoretical probability.
After my post on factoring last week, Jim Doherty mentioned a speaker he had encountered who found an experimental probability that a quadratic would factor, and cited 7%. That number seemed reasonable to me, but perhaps a bit on the high side. I set up an Excel document to generate three non-zero integers (more on this later), and rigged a system to check for perfect-square discriminants. I recorded experimental results in groups of 1000 trials, and kept a running total.
After 25,000 trials, I found that 7.26% of the quadratics would factor.
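For anyone who wants to replicate the Excel experiment, here is a rough sketch in Python. The coefficient range is my assumption (the original range isn’t stated), and the factorability test is the same perfect-square-discriminant check described above:

```python
import math
import random

def random_nonzero(lo=-10, hi=10):
    """Draw a nonzero integer uniformly from [lo, hi] excluding 0.
    The range -10..10 is an assumed choice, not the original setup."""
    while True:
        n = random.randint(lo, hi)
        if n != 0:
            return n

def factors(a, b, c):
    """True if ax^2 + bx + c factors over the rationals,
    i.e. the discriminant b^2 - 4ac is a perfect square."""
    d = b * b - 4 * a * c
    if d < 0:
        return False
    r = math.isqrt(d)
    return r * r == d

def experiment(trials=25_000, lo=-10, hi=10):
    """Estimate the probability that a random quadratic factors."""
    hits = sum(
        factors(random_nonzero(lo, hi),
                random_nonzero(lo, hi),
                random_nonzero(lo, hi))
        for _ in range(trials)
    )
    return hits / trials

print(f"{experiment():.2%}")
```

Note that the answer depends on the range the coefficients are drawn from, which is one reason an experimental figure like 7% needs the range attached to be meaningful.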
*While this endeavor started off innocently and quickly enough, I had to start over after I realized my Excel document allowed for zeroes. It took a little logical Excel rigging to exclude them.
So, there must be a theoretical probability out there someplace. Anyone know how to find it?