Problem 19.5. Prove each of the following.
1. Let Γ be a complete set of sentences of LN such that Γ ∪ A is
consistent. Then Γ is not recursive.
2. Let ∆ be a recursive set of sentences such that ∆ ∪ A is consistent.
Then ∆ is not complete.
3. The theory of N,
Th(N) = { σ | σ is a sentence of LN and N |= σ } ,
is not recursive.
There is nothing all that special about working in LN. The proof
of Gödel's Incompleteness Theorem can be executed for any first-order
language and recursive set of axioms which allow one to code and prove
enough facts about arithmetic. In particular, it can be done whenever
the language and axioms are powerful enough (as in Zermelo-Fraenkel
set theory, for example) to define the natural numbers and prove some
modest facts about them.
Gödel also proved a strengthened version of the Incompleteness
Theorem which asserts that a consistent recursive set of sentences Σ of
LN cannot prove its own consistency. To get at it, we need to express
the statement "Σ is consistent" in LN.
Problem 19.6. Suppose Σ is a recursive set of sentences of LN.
Find a sentence of LN, which we'll denote by Con(Σ), such that Σ is
consistent if and only if A ⊬ ¬Con(Σ).
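One natural shape for Con(Σ), sketched here as a hint rather than a
worked solution: suppose Prf_Σ(v1, v2) is a formula representing in
Th(A) the recursive relation "p codes a proof from Σ of the sentence
coded by x" (the name Prf_Σ is ours, not the text's). One can then
take Con(Σ) to be
¬∃v1 Prf_Σ(v1, S^k 0) ,
where k is the Gödel code of some conveniently refutable sentence,
say S0 = 0.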
Theorem 19.7 (Gödel's Second Incompleteness Theorem). Let Σ
be a consistent recursive set of sentences of LN such that Σ ⊢ A. Then
Σ ⊬ Con(Σ).
As with the (First) Incompleteness Theorem, the Second
Incompleteness Theorem holds for any recursive set of sentences in a
first-order language which allows one to code and prove enough facts
about arithmetic. The perverse consequence of the Second
Incompleteness Theorem is that only an inconsistent set of axioms can
prove its own consistency . . .

The implications. Gödel's Incompleteness Theorems have profound
implications.
Since almost all of mathematics can be formalized in first-order
logic, the First Incompleteness Theorem implies that there is no
effective procedure that will find and prove all theorems. This might
be considered as job security for research mathematicians . . .
The Second Incompleteness Theorem, on the other hand, implies
that we can never be completely sure that any reasonable set of axioms
is actually consistent unless we take a more powerful set of axioms on
faith. It follows that one can never be completely sure, faith aside,
that the theorems proved in mathematics are really true. This might
be considered as job security for philosophers of mathematics . . .
Truth and definability. A close relative of the Incompleteness
Theorem is the assertion that truth in N = (N, S, +, ·, E, 0) is not
definable in N. To make sense of this, of course, we first need to define
what "definable in N" means.
Definition 19.1. A k-place relation P is definable in N if there is a
formula φ of LN with at most v1, . . . , vk as free variables such that
P(n1, . . . , nk) ⇐⇒ N |= φ[s(v1|n1) . . . (vk|nk)]
for every assignment s of N. Such a formula φ is said to define P in
N.
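For example, the set of even natural numbers is a 1-place relation
definable in N: taking φ to be ∃v2 v1 = v2 + v2, we get
Even(n) ⇐⇒ N |= φ[s(v1|n)]
for every assignment s of N.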
A definition of "function definable in N" could be made in a similar
way, of course. Definability is a close relative of representability:
Proposition 19.8. Suppose P is a k-place relation which is
representable in Th(A). Then P is definable in N.
Problem 19.9. Is the converse to Proposition 19.8 true?
A counterpart for definability of the Entscheidungsproblem is the
question of whether truth in N is a definable relation in N, i.e.
whether the set of Gödel codes of sentences of LN true in N,
Th(N) = { ⌜σ⌝ | σ is a sentence of LN and N |= σ } ,
is definable in N.
Theorem 19.10 (Tarski's Undefinability Theorem). Th(N) is
not definable in N.
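In outline, the reason is a diagonal argument in the same spirit as the
Incompleteness Theorem: if some formula τ of LN defined Th(N) in N,
a fixed-point construction would produce a sentence λ with N |= λ if
and only if N |= ¬τ[s(v1|⌜λ⌝)], and hence
N |= λ ⇐⇒ ⌜λ⌝ ∉ Th(N) ⇐⇒ N |= ¬λ ,
which is absurd.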
Hints
CHAPTER 10


Hints

10.1. This should be easy . . .
10.2. Ditto.
10.3. 1. Any machine with the given alphabet and a table with
three non-empty rows will do.
2. Every entry in the table in the 0 column must write a 1 in the
scanned cell; similarly, every entry in the 1 column must write a
0 in the scanned cell.
3. What's the simplest possible table for a given alphabet?
10.4. Unwind the definitions step by step in each case. Not all of
these are computations . . .
10.5. Examine your solutions to the previous problem and, if nec-
essary, take the computations a little farther.
10.6. Have the machine run on forever to the right, writing down
the desired pattern as it goes no matter what may be on the tape
already.
10.7. Consider your solution to Problem 10.6 for one possible
approach. It should be easy to find simpler solutions, though.
10.8. 1. Use four states to write the 1s, one for each.
2. The input has a convenient marker.
3. Run back and forth to move one marker n cells from the block
of 1's while moving another through the block, and then fill in.
4. Modify the previous machine by having it delete every other 1
after writing out the block of 2n 1s.
5. Run back and forth to move the right block of 1s cell by cell to
the desired position.
6. Run back and forth to move the left block of 1s cell by cell
past the other two, and then apply a minor modification of the
previous machine.
7. Run back and forth between the blocks, moving a marker through
each. After the race between the markers to the ends of their
respective blocks has been decided, erase everything and write
down the desired output.
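Readers who want to check their machines by experiment may find a
quick simulator helpful. The sketch below (in Python; the table
format, names, and conventions are ours, not the text's) runs a
one-tape machine whose table maps (state, scanned symbol) to (symbol
written, move, new state):

    def run(table, tape, state=1, pos=0, max_steps=10000):
        """Run a one-tape Turing machine; tape maps cell number to symbol."""
        for _ in range(max_steps):
            symbol = tape.get(pos, 0)          # blank cells read as 0
            if (state, symbol) not in table:   # empty table entry: halt
                return tape
            write, move, state = table[(state, symbol)]
            tape[pos] = write
            pos += 1 if move == 'R' else -1
            if pos < 0:                        # ran off the left end: crash
                return tape
        raise RuntimeError("no halt within max_steps")

    # In the spirit of Hint 10.6: a two-state machine that runs forever
    # to the right, writing the pattern 1 0 1 0 ... (it never halts, so
    # run() eventually raises).
    pattern_writer = {
        (1, 0): (1, 'R', 2), (1, 1): (1, 'R', 2),
        (2, 0): (0, 'R', 1), (2, 1): (0, 'R', 1),
    }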
CHAPTER 11


Hints

11.1. This ought to be easy.
11.2. Generalize the technique of Example 11.1, adding two new
states to help with each old state that may cause a move in different
directions. Be careful not to make a machine that would run off the
end of the tape when the original wouldn't.
11.3. Note that the simulation must operate with coded versions
of M's tape, unless Σ = {1}. The key idea is to use the tape of the
simulator in blocks of some fixed size, with the patterns of 0s and 1s
in each block corresponding to elements of Σ.
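To make the coding concrete, here is one possible scheme in Python
(the alphabet, block size, and names below are invented for
illustration): pick a block size k with 2^k ≥ |Σ| + 1 and replace each
cell of M's tape by a block of k cells of the simulator's tape.

    SIGMA = ['a', 'b', 'c']   # a hypothetical alphabet for the machine M
    K = 2                     # block size: 2**2 = 4 >= len(SIGMA) + 1

    def encode(sym):
        """Code a symbol of SIGMA (or the blank, None) as K binary cells."""
        n = 0 if sym is None else SIGMA.index(sym) + 1
        return [(n >> i) & 1 for i in reversed(range(K))]

    def encode_tape(cells):
        """Cell i of M's tape becomes cells K*i, ..., K*i + K - 1."""
        return [bit for c in cells for bit in encode(c)]

    # encode_tape(['a', None, 'c']) == [0, 1, 0, 0, 1, 1]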
11.4. This should be straightforward, if somewhat tedious. You do
need to be careful in coming up with the appropriate input tapes for
O.
11.5. Generalize the technique of Example 11.2, splitting up the
tape of the simulator into upper and lower tracks and splitting each
state of N into two states in P. You will need to be quite careful in
describing just how the latter is to be done.
11.6. If you're in doubt, go with one read/write scanner for each
tape, and have each entry in the table of a two-tape machine take
both scanners into account. Simulating such a machine is really just a
variation on the techniques used in Example 11.2.
11.7. Such a machine should be able to move its scanner to cells
up and down from the current one, as well as to the side. (Diagonally
too, if you want to!) Simulating such a machine on a single-tape
machine is a challenge. You might find it easier to first describe how
to simulate it on a suitable multiple-tape machine.




CHAPTER 12


Hints

12.1. Pick as simple a Turing machine as you can think of . . .
12.2. Unwind the representation using Definition 12.2.
12.3. Trick question!
12.4. One could omit the representations of any empty entries in
the table of a Turing machine, for one thing.
12.5. Just use Definition 12.3 in each case.
12.6. Unwind each representation using Definition 12.3.
12.7. For one thing, is the leading 1 in the representation of each
cell really necessary?
12.8. H needs to make exactly two changes to the representation
of the tape position. Note that i = 0 is a special case.
12.9. R needs to check the representation of a particular cell.
12.10. W needs to make just one change to the representation of
the tape position.
12.11. E must first find and then copy the representation of a
particular entry in the table of M. Some of the machines in Problem
10.8 may come in handy here.
12.12. Put the machines developed in Problems 12.8–12.11
together.
12.13. Use the machine S of Problem 12.12 to do most of the work.
12.14. Essentially, all C does is unwind Definition 12.3.
12.15. Assume, by way of contradiction, that there was a Turing
machine T which solved the Halting Problem. Modify T to get a
machine Y which, for any Turing machine M, halts on input 0⌜M⌝ if
and only if M does not halt on input 0⌜M⌝. Feed the input 0⌜Y⌝ to
Y ...
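The shape of the resulting contradiction can be written out in
(necessarily impossible) Python; halts and code_of below are
hypothetical names for what T and the representation scheme would
supply:

    def Y(m):
        # Y halts on input m exactly when the machine coded by m
        # does not halt on its own code.
        if halts(m, m):        # hypothetical decision procedure from T
            while True:        # ... then loop forever
                pass
        return 0               # ... otherwise halt

    # Y(code_of(Y)) halts iff Y(code_of(Y)) does not halt,
    # so no such T can exist.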


CHAPTER 13


Hints

13.1. 1. Delete most of the input.
2. Add a little to the input.
3. Add a little to the input, and delete a little more elsewhere.
4. Delete a little from the input most of the time.
5. Run back and forth between the two blocks in the input, deleting
until one side disappears. Clean up appropriately!
6. Delete two of the blocks and move the remaining one.
13.2. There are just as many functions N → N as there are real
numbers, of which there are many more than there are natural numbers.
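In symbols: the set of Turing machines is countable, since each machine
is given by a finite table over a finite alphabet, while
|{ f | f : N → N }| = 2^ℵ0 > ℵ0 ,
so some (indeed, most) functions have no Turing machine computing them.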
13.3. 1. Trace the computation through step-by-step.
2. Consider the scores of each of the 1-state entries in the busy
beaver competition.
3. Find a 3-state entry in the busy beaver competition which scores
six.
4. Show how to turn an n-state entry in the busy beaver competi-
tion into an (n + 1)-state entry that scores just a little better.
13.4. You could start by looking at modifications of the 3-state
entry you devised in Problem 13.3.
13.5. Suppose Σ was computable by a Turing machine M. Modify
M to get an n-state entry in the busy beaver competition for some n
which achieves a score greater than Σ(n).




CHAPTER 14


Hints

14.1. You only need to take care of the projection functions, and
these can be computed by Turing machines very similar to one another.
14.2. Generalize Example 14.1.
14.3. Use machines computing g, h1, . . . , hm as sub-machines of
the machine computing the composition. You might also find
sub-machines that copy the original input and various stages of the
output useful. It is important that each sub-machine get all the data
it needs and does not damage the data needed by other sub-machines.
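In functional terms (our rendering, in Python, not the text's
notation), the target of the construction is just
f(x1, . . . , xn) = g(h1(x1, . . . , xn), . . . , hm(x1, . . . , xn)) ,
which one can mimic as:

    def compose(g, *hs):
        """compose(g, h1, ..., hm)(x1, ..., xn) = g(h1(...), ..., hm(...))."""
        return lambda *xs: g(*(h(*xs) for h in hs))

    # e.g. compose(g, h1, h2)(x) == g(h1(x), h2(x))

The Turing machine version has to manage the tape so that each hi sees
a clean copy of the original input and g sees all of the intermediate
outputs.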
14.4. Proceed by induction on the number of applications of
composition used to define f from the initial functions.
14.5. 1. Exponentiation is to multiplication as multiplication is
to addition.
2. This is straightforward except for taking care of Pred(0) =
Pred(1) = 0.
3. Diff is to Pred as Sum is to S.
4. This is straightforward if you let 0! = 1.
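As a sanity check on the four parts of this hint, here they are as
executable Python (our rendering; the loops stand in for primitive
recursion):

    def Pred(n):                    # predecessor, with Pred(0) = 0
        return 0 if n == 0 else n - 1

    def Diff(n, m):                 # truncated subtraction: apply Pred
        for _ in range(m):          # m times, as Sum applies S m times
            n = Pred(n)
        return n

    def Exp(n, m):                  # n**m by iterated multiplication, as
        result = 1                  # multiplication is iterated addition
        for _ in range(m):
            result *= n
        return result

    def Fact(n):                    # factorial, with 0! = 1
        result = 1
        for k in range(1, n + 1):
            result *= k
        return result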
14.6. Machines used to compute g and h are the principal parts
of the machine computing f, along with parts to copy, move, and/or
delete data on the tape between stages in the recursive process.
14.7. 1. f is to g as Fact is to the identity function.
2. Use Diff and a suitable constant function as the basic building
blocks.
3. This is a slight generalization of the preceding part.
14.8. Proceed by induction on the number of applications of prim-
itive recursion and composition.
14.9. 1. Use a composition including Diff, χP, and a suitable
constant function.
2. A suitable composition will do the job; it's a little harder than
it looks.

3. A suitable composition will do the job; it's much more
straightforward than the previous part.
4. Note that n = m exactly when Diff(n, m) = 0 = Diff(m, n).
5. Compare this with Problem 14.5.
6. First devise a characteristic function for the relation
Divides(n, k, m) ⇐⇒ nk = m ,
