Wednesday, July 12, 2006

Strange things happen to me, and I am left to wonder if I am the only person who experiences these things. Every now and then, events in my life cascade into a pattern, each piece randomly falling into place to suggest some larger picture. Being the kind of person I am, even such a shadow of the possibility of something new to know is irresistible. At least until the next shiny thing comes along.

Recently, the issue seems to be Gödel's Incompleteness Theorem. I'm seeing echoes of it everywhere, and the randomest (yay new word) conversations suddenly involve it. So, in an effort at catharsis, I will blabber a little about it here.

Gödel proved that any sufficiently rich formal system, if consistent, cannot be complete. I use 'consistent' and 'complete' here technically, to mean the following:

A system is consistent if there is no statement within that system that can be proven both true and false.
A system is complete if every statement within that system can be proven either true or false.
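
In logical shorthand (my own gloss, not Gödel's wording; read $S \vdash \varphi$ as "the system $S$ proves the sentence $\varphi$"):

    \text{Consistent:}\quad \neg\exists\varphi\,(S \vdash \varphi \;\wedge\; S \vdash \neg\varphi)
    \text{Complete:}\quad \forall\varphi\,(S \vdash \varphi \;\vee\; S \vdash \neg\varphi)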


Gödel's original formulation runs as follows:
To every ω-consistent recursive class κ of formulae there correspond recursive class signs r, such that neither v Gen r nor Neg(v Gen r) belongs to Flg(κ) (where v is the free variable of r).

Of course, this is incomprehensible gibberish. I am proud to say that there was a point in time when I understood the full proof, but ashamed to say that I can't even remember what Gen refers to.

A better formulation might be this:
Any consistent system (of at least the complexity of basic arithmetic) is incomplete.

This means that, for instance, elementary arithmetic is incomplete. There are statements within elementary arithmetic that cannot be proven either true or false using the rules of elementary arithmetic itself. The same goes for just about any such system you can imagine. Consider the implications for our self-knowledge, or for AI. Even if we manage to straighten out all the kinks in the works and arrive at a clean, consistent system that represents our minds or an artificial mind, there would be propositions within that system that cannot be shown to be true or false within the system itself! So much for the dream of downloading our consciousnesses into a software replica, to live forever in a digital world. We cannot represent ourselves totally!
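
The engine that makes this work is Gödel numbering: every formula gets encoded as a single integer, so that arithmetic can make statements about its own formulae. Here's a toy sketch in Python (the alphabet and symbol codes are invented for illustration; Gödel's actual scheme codes up the full logical vocabulary, but the prime-power trick is his):

    # Toy Gödel numbering: encode a formula (a string of symbols) as one
    # integer via the prime-power trick. Alphabet and codes invented
    # purely for illustration.
    SYMBOLS = {'0': 1, 'S': 2, '=': 3, '+': 4, '(': 5, ')': 6, 'x': 7}
    PRIMES = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]  # enough for 10 symbols

    def godel_number(formula):
        # The k-th symbol, with code c, contributes a factor PRIMES[k] ** c.
        n = 1
        for k, symbol in enumerate(formula):
            n *= PRIMES[k] ** SYMBOLS[symbol]
        return n

    def decode(n):
        # Recover the formula by reading off the exponent of each prime.
        code_to_symbol = {c: s for s, c in SYMBOLS.items()}
        symbols = []
        for p in PRIMES:
            exponent = 0
            while n % p == 0:
                n //= p
                exponent += 1
            if exponent == 0:
                break
            symbols.append(code_to_symbol[exponent])
        return ''.join(symbols)

    print(godel_number('S0=S0'))   # 808500: one number, whole formula
    print(decode(808500))          # 'S0=S0' back again

Once formulae are numbers, a claim like "formula number n is provable" is itself a claim about numbers, so a formula can end up talking about itself. That self-reference is exactly what the story below exploits.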

Consider the following example; it's a fair translation of Gödel's proof into something normal people can understand. To be clear, Gödel's proof is far, far more rigorous, written as it is in first-order logic and mathematics. If you want to find out more, let me know; I'll be happy to share.

  1. Someone introduces Gödel to a UTM, a machine that is supposed to be a Universal Truth Machine, capable of correctly answering any question at all.
  2. Gödel asks for the program and the circuit design of the UTM. The program may be complicated, but it can only be finitely long. Call the program P(UTM) for Program of the Universal Truth Machine.
  3. Smiling a little, Gödel writes out the following sentence: "The machine constructed on the basis of the program P(UTM) will never say that this sentence is true." Call this sentence G for Gödel. Note that G is equivalent to: "UTM will never say G is true."
  4. Now Gödel laughs his high laugh and asks UTM whether G is true or not.
  5. If UTM says G is true, then "UTM will never say G is true" is false. If "UTM will never say G is true" is false, then G is false (since G = "UTM will never say G is true"). So if UTM says G is true, then G is in fact false, and UTM has made a false statement. So UTM will never say that G is true, since UTM makes only true statements.
  6. We have established that UTM will never say G is true. So "UTM will never say G is true" is in fact a true statement. So G is true (since G = "UTM will never say G is true").
  7. "I know a truth that UTM can never utter," Gödel says. "I know that G is true. UTM is not truly universal."
- Rudy Rucker, Infinity and the Mind

It's a little tough to wrap your mind around completely, but once you do, it should feel a little like a revelation.
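
If code speaks to you more clearly than dialogue, the same diagonal move shows up in Turing's halting problem, a close cousin of Gödel's argument. Here's a minimal sketch in Python (nothing below is from Gödel or Rucker; `halts` stands in for a claimed all-knowing decider, and the whole point is that no correct one can exist):

    def halts(prog, arg):
        # Stand-in for a claimed universal decider: reports whether
        # prog(arg) would ever halt. Any real implementation must be
        # wrong somewhere; this one naively answers True for everything.
        return True

    def diagonal(prog):
        # Do the opposite of whatever halts predicts about prog run on itself.
        if halts(prog, prog):
            while True:   # predicted to halt, so loop forever
                pass
        else:
            return        # predicted to loop, so halt immediately

    # diagonal(diagonal) defeats the decider: halts(diagonal, diagonal)
    # says True, yet diagonal(diagonal) loops forever. Swap in any other
    # halts and the if-branch flips to defeat that one instead. No
    # always-correct decider exists, just as no UTM can truthfully
    # answer G.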

2 Comments:

Blogger Zim said...

Poon. What if we were inconsistent, or incomplete? Would it be impossible to create a replica of ourselves which is inconsistent or incomplete?

1:24 PM  
Blogger WhatRoughBeast said...

I think Gödel's thing strictly only applies to formal systems. We don't think in formal systems. Give me a few days to read up a little on it; I want to say that the theorem is translatable and meaningful for non-first-order-logic stuff, but I'm not quite sure how yet.

2:10 AM  
