# Algebra of Probable Inference By Richard T. Cox

In *Algebra of Probable Inference*, Richard T. Cox demonstrates that probability theory is the only theory of inductive inference that abides by logical consistency. Cox does so through a functional derivation of probability theory as the unique extension of Boolean algebra, thereby establishing, for the first time, the legitimacy of probability theory as formalized by Laplace in the 18th century.

Perhaps the most significant consequence of Cox's work is that probability represents a subjective degree of plausible belief relative to a particular system, yet the theory applies universally and objectively across any system making inferences from an incomplete state of knowledge. Cox goes well beyond this conceptual advance, however, and begins to formulate a theory of logical questions through his consideration of systems of assertions, a theory that he developed more fully some years later. Although Cox's contributions to probability are acknowledged and have recently gained worldwide recognition, the significance of his work on logical questions is virtually unknown. The contributions of Richard Cox to logic and inductive reasoning may eventually be seen to be the most significant since Aristotle.

This is one of the crowning achievements of the human race and is virtually unknown. No, I'm not exaggerating.

Richard Cox introduces three axioms that are little more than distilled common sense to establish an ordinal algebra of belief. In this algebra, one only seeks to assign a proposition a higher or lower degree of belief that it might be true. He then deduces the way that our *many* beliefs influence one another, still in an algebraically ordered way. A few *short* pages of *simple* algebra later he has derived an algebraic structure that includes the Laplace-Bayes version of probability theory as one of its special cases, that includes Shannon's Theorem as another (originally set down two years earlier than Shannon's work), and that provides a solid foundation for all of statistical physics as yet another.
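The functional derivation alluded to above can be sketched in outline. This is a compressed paraphrase of the argument, not Cox's exact notation:

```latex
% Write (a \mid h) for the plausibility of assertion a given hypothesis h.
% Cox assumes the plausibility of a conjunction depends only on
% (a \mid h) and (b \mid a \land h), through some function F:
\[
  (a \land b \mid h) = F\!\bigl((a \mid h),\, (b \mid a \land h)\bigr).
\]
% Associativity of conjunction forces the functional equation
\[
  F\bigl(F(x, y),\, z\bigr) = F\bigl(x,\, F(y, z)\bigr),
\]
% whose general solution, after a monotone rescaling of plausibilities,
% is multiplication -- the product rule:
\[
  p(a \land b \mid h) = p(a \mid h)\, p(b \mid a \land h).
\]
% A companion equation for negation, (\lnot a \mid h) = S\bigl((a \mid h)\bigr),
% with S self-inverse and consistent with the product rule, has solution
% S(x) = (1 - x^m)^{1/m}; the conventional choice m = 1 gives the sum rule:
\[
  p(a \mid h) + p(\lnot a \mid h) = 1.
\]
```

From these two rules, Bayes' theorem and the rest of the Laplace-Bayes apparatus follow as ordinary algebra.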

That's not all. The ordinal logic he derives includes *all Aristotelian/Boolean logic* as a special limiting case, a limit of absolute belief in certain truth or falsehood that is essentially never realized in nature. Using his results one can put *all of human knowledge* on a solid logical foundation. For the first time *ever*, humankind can actually understand what it means to know something and can express knowledge in terms of relative degrees of belief even where no firm axiomatic certainty is to be had.
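One way to see this limiting case concretely is numerical: under Bayes' rule (which follows from Cox's product and sum rules), degrees of belief pushed to 0 or 1 behave exactly like Boolean truth values. A minimal sketch, with arbitrary numbers chosen only for illustration:

```python
def posterior(prior, like_h, like_alt):
    """Bayes' rule: p(H|E) = p(E|H) p(H) / [p(E|H) p(H) + p(E|~H) p(~H)]."""
    num = like_h * prior
    return num / (num + like_alt * (1.0 - prior))

# If H guarantees E, then observing ~E has likelihood p(~E|H) = 0 under H,
# and the posterior belief in H collapses to exactly 0: modus tollens,
# recovered as the limiting case of probabilistic updating.
print(posterior(0.5, 0.0, 0.7))   # 0.0

# Anything short of certainty merely shifts belief by degree:
print(posterior(0.5, 0.1, 0.7))   # 0.125
```

In the deductive limit the update is all-or-nothing, as in Boolean logic; everywhere else it is graded, which is the generalization the review is describing.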

Cox's work does not *quite* refute David Hume's skepticism, but it comes as close as it is logically possible to come. At the end of it, instead of shrugging one's shoulders (as Hume shrugged his own) and carrying on as if science and mathematics describe nature without our really understanding *why* we should believe that things will be in the future as we've seen them to be in the past, one emerges out the other side. Hume's objections have lost their sting -- Cox has shown us a way that we can *reasonably* be said to know a thing. At the very least, one comes to understand *how* we know what we think we know, and what our quantitative basis is (should we seek to evaluate it) for that knowing.

An awesome work. Too bad it is virtually unknown even in the science and mathematics crowd, let alone philosophy, although the works of E. T. Jaynes and David MacKay on plausible inference and information theory, respectively, both derive from it. Goodreads is perhaps more attuned to readers of fiction than mathematics or non-bullshit philosophy, but if you *are* in the minority that likes the latter, you might give this one a try. The book is remarkably clear, and Cox, if anything, is too humble.

rgb
