God, computers, and complexity theory

(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: lawrence@krubner.com

How do we measure complexity? Is Gödel’s Incompleteness Theorem just another way of stating Turing’s Halting Problem? A very interesting combination of ideas:

For Leibniz, a law must be expressed by a simple equation. An equation that is fort composée (very complex) is meaningless; for any finite set of points there is always some equation loitering about that passes through them. This kind of theoretical criterion does not decide individual cases. We are trying to understand what it means to say there is a law governing these points. That does not mean we can determine whether there is such a law.

Well, maybe, if you limit yourself to algebraic equations…
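To see why the mere existence of an equation proves nothing, note that any finite set of points is passed through exactly by some polynomial of high enough degree. Here is a minimal Python sketch of that fact, my own illustration rather than anything from the quoted text, using numpy and six arbitrary points:

```python
# Any n points lie exactly on some polynomial of degree n - 1, however
# "lawless" the points are -- so an equation always loiters about.
import numpy as np

rng = np.random.default_rng(seed=1)
xs = np.arange(6.0)            # six x-coordinates: 0, 1, ..., 5
ys = rng.random(6)             # six arbitrary y-values, chosen with no law in mind

# Six coefficients, six points: the degree-5 fit passes through every point.
coeffs = np.polyfit(xs, ys, deg=5)
fitted = np.polyval(coeffs, xs)

print(np.allclose(fitted, ys))  # True: the equation exists, but it is as
                                # complicated as the data it "explains"
```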

It was in paragraph V of the Discours that Leibniz asserted that the world is comprehensible. He wrote this in 1686, a year before Isaac Newton published the Principia. Modern science or, as it was called then, mechanical philosophy, was just beginning. Medieval ideas about the world and religion were still very much alive. Newton himself was more a man of the past than of the future; he was, as John Maynard Keynes put it, the last of the Babylonian sorcerers.

Just as the old school of thought was colliding with the new mechanical philosophy, Leibniz asked why the world is comprehensible. He wanted to combine theology with modern science; he was unwilling to give up either. In the Discours, he discussed the relation of science to the world, noting that the world is governed by scientific laws, and explaining why. The reason, he wrote, is God. God has created a world which is very rich and diverse in phenomena. And yet all this richness follows from a few very simple laws. According to Leibniz, this is what it means to say that the world is comprehensible. The world is beautiful because it follows simple laws that produce the richness that we see around us. Science is possible, and this fact was, for Leibniz, proof of the greatness of God; this beautiful world is made by a great artist from a small number of powerful ideas.

And then:

There is a way to take these ideas and make them sharper. There is, in fact, a way to define them mathematically. Consider this done. The result is algorithmic information theory, or AIT.

At the center of AIT is the following model of how science should be done:

Input: program → COMPUTER → Output: experimental data.

The input to the computer is a program; the output, experimental data. The computer itself is a fixed universal Turing machine.

Gone are points or dots of ink on a piece of paper; and gone with those dots is anything like a theory or mathematical equation. No interpolations; no Lagrangians. In this model, both the input and the output are finite sequences of 0s and 1s. The result is a soft view of the hard sciences. Theories and data are both finite strings of bits. All that enters the computer is a program. There is no distinction between input data and the methods or tools used to manipulate them. The data is in the program, so the input is rather like a class in an object-oriented programming language.

It is all in there together.
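As a rough sketch of this setup, and nothing more than that, one can stand in for the fixed universal Turing machine with Python's exec, and write a "theory" as a piece of program text that carries its data and its method together (the program text and the helper name run_theory below are my own inventions):

```python
# The "computer" is one fixed machine; the only input is a program.
# Python's exec() stands in here for the fixed universal Turing machine.

def run_theory(program_text: str) -> str:
    """Run a theory (program text) on the fixed machine and return the finite
    string of 0s and 1s it leaves in the variable `output`."""
    namespace = {}
    exec(program_text, namespace)      # the machine just does what the program says
    return namespace["output"]

# A theory: data and method travel together inside one piece of program text.
theory = "output = '01' * 8"

print(run_theory(theory))              # 0101010101010101
```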

Given a theory as its input, and some internal signal to get to work, the computer initiates its calculations, and, one hopes, produces an output before halting. If the output is exactly the string of bits comprising the experimental data, then the theory serves as its explanation.

No approximations! This is not a probabilistic or a statistical view of science.
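A sketch of that acceptance test, using the same exec stand-in as above: the theory counts as an explanation only if its output matches the experimental bit string exactly (the particular data and theories here are made up for illustration):

```python
# A theory explains the data only if its output equals the data exactly,
# bit for bit -- no approximations, no statistics.

def run_theory(program_text: str) -> str:
    namespace = {}
    exec(program_text, namespace)
    return namespace["output"]

experimental_data = "0101010101010101"        # the observed finite bit string

good_theory = "output = '01' * 8"             # reproduces the data exactly
near_miss   = "output = '01' * 7 + '00'"      # close, but not bit-for-bit

print(run_theory(good_theory) == experimental_data)  # True: accepted as an explanation
print(run_theory(near_miss)  == experimental_data)   # False: rejected outright
```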

This view of things fitted Leibniz like a glove. We can compare the complexity of a theory with the complexity of the data very easily, because both are finite strings of bits. Leibniz saw that a theory the same size as the data is useless, in the sense that it is not possible to cover a debt of ten dollars with another debt of ten dollars. Explanation is a form of compression. If a theory is smaller than the data, then, in this case as in so many others, less is more. A successful explanation is a matter of covering a large debt with a much smaller one.
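One last sketch, again with made-up data, of what covering a large debt with a smaller one looks like in this model: the theory counts as an explanation only if the program that reproduces the data is shorter than the data itself:

```python
# Explanation as compression: a useful theory is a program that reproduces
# the data and is shorter than the data it explains.

def run_theory(program_text: str) -> str:
    namespace = {}
    exec(program_text, namespace)
    return namespace["output"]

experimental_data = "001" * 100               # 300 bits of highly regular data

theory = "output = '001' * 100"               # a 20-character program

reproduces = run_theory(theory) == experimental_data
compresses = len(theory) < len(experimental_data)

print(reproduces, compresses)                      # True True: a genuine explanation
print(len(theory), "<", len(experimental_data))    # 20 < 300
```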

Source