BookPr0n 0: Gregory Chaitin’s Algorithmic Information Theory
In a (hopefully) ongoing series of blog posts named BookPr0n, I will highlight and briefly describe some interesting and (at times) obscure books that I come across. The books will tend to be technological in nature, but not exclusively so.
The first in the series is the first book published by the immortal Gregory Chaitin, titled Algorithmic Information Theory (AIT – also available online or for purchase). It should be noted that my copy is the 1st edition, whereas the link above is for the current 3rd edition. I have been looking to purchase and read Chaitin’s book The Unknowable for a few years now, so when I came across AIT on Paperback Swap I had to grab it.
The cover is a bit hard on the eyes, but it does follow the scheme for the Cambridge Tracts in Theoretical Computer Science series, for better or for worse.
Eleven pages into the book we find Pascal’s Triangle, which for me is a huge win as far as pure geekery goes.
Later pages perfectly illustrate what this book brings to the table. At first glance they appear to be the ramblings of an insane mind, but there is logic to this seeming chaos. First, in chapter 3 Chaitin defines a simplified version of LISP and writes three versions of the code for the eval function (which also happens to be a universal Turing machine) in this language. In chapter 4 he converts his definition of simplified LISP into a register machine and then compiles that into an exponential diophantine equation (the compiler, incidentally, was written in REXX).
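To give a flavor of what an eval like that is doing (a program that interprets other programs handed to it as data), here is a minimal toy sketch in Python. It is my own simplification for illustration; the names and forms it supports are not Chaitin’s LISP dialect.

```python
# A toy LISP-style evaluator: a program that runs other programs given to
# it as data.  This is my own simplified sketch, not Chaitin's dialect.

def evaluate(expr, env):
    """Evaluate an expression represented as nested Python lists."""
    if isinstance(expr, str):          # a symbol: look it up in the environment
        return env[expr]
    if not isinstance(expr, list):     # numbers etc. are self-evaluating
        return expr
    op, *args = expr
    if op == "quote":                  # (quote x) -> x, unevaluated
        return args[0]
    if op == "if":                     # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                 # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)             # otherwise apply operator to evaluated args
    return fn(*(evaluate(a, env) for a in args))

base_env = {
    "cons": lambda a, d: [a] + d,
    "car":  lambda x: x[0],
    "cdr":  lambda x: x[1:],
    "atom": lambda x: not isinstance(x, list),
    "eq":   lambda a, b: a == b,
}

# ((lambda (x) (cons x (quote (b c)))) (quote a))  =>  ['a', 'b', 'c']
program = [["lambda", ["x"], ["cons", "x", ["quote", ["b", "c"]]]], ["quote", "a"]]
print(evaluate(program, base_env))
```

Chaitin’s real interpreter is of course richer than this, but the basic shape (a small evaluator that treats programs as data) is the same.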
So the question arises: what’s the point? “Simply put”, Chaitin’s goal in performing the above steps is to create an equation that represents a universal Turing machine, so that he can then calculate and encode an approximation to the halting probability Ω, the probability that a randomly chosen program halts (which, remember, also corresponds to a LISP program expressed as an exponential diophantine equation), leading to the conclusion that this halting probability is a programmatic way to represent Gödel’s incompleteness theorems. In other words, there is no general way to tell whether his equation has finitely or infinitely many solutions, which amounts to asking whether the programs it encodes will ever run to their conclusion.
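For the record, the halting probability for a prefix-free universal machine U is usually defined as the probability that a randomly drawn bit string, read as a program, halts:

```latex
\Omega_U \;=\; \sum_{p \,:\, U(p)\ \text{halts}} 2^{-|p|}
```

where |p| is the length of program p in bits. Knowing the first n bits of Ω_U would settle the halting problem for every program up to n bits long, which is why no fixed formal system can prove more than finitely many of its digits.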
Further information can be found on Chaitin’s homepage and in his book Meta Math: The Quest for Omega.
note: The first 64 bits of Chaitin’s constant Ω (for a particular universal machine) have been computed; written in decimal form they begin 0.0078749969978123844.
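As an aside, if you have the published bit string in hand, turning a known n-bit prefix into an exact interval that must contain Ω is a one-liner; here is a small sketch (the short bit string in the example is just a stand-in for the real 64 published bits):

```python
from fractions import Fraction

def omega_interval(bits: str):
    """Given the first n binary digits of Omega after the point, return the
    exact rational interval [lower, lower + 2**-n] that must contain Omega."""
    n = len(bits)
    lower = sum(Fraction(int(b), 2 ** (i + 1)) for i, b in enumerate(bits))
    return lower, lower + Fraction(1, 2 ** n)

# Stand-in usage; substitute the actual published 64 bits:
lo, hi = omega_interval("00000010")
print(float(lo), float(hi))   # 0.0078125 0.01171875
```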
-m