When 0 makes a big difference
Dec. 8th, 2011 08:28 pm
Saw (well, answered) an interesting question posted on MathOverflow by Vladimir Reshetnikov recently. Now of course it's in the link, but let me take a moment to restate it myself. :)
We're going to recursively define a class S of functions from N to N. Wait -- when I say N, am I including 0 or not? Well, the original problem statement included 0, but it left this implicit, in the fact that one of the starting functions was f(n)=0.
Anyway. The functions are as follows. To start, we have constant 0, constant 1, the identity function, and the successor function. We have one combination rule: Given f, g, and h in S, we can form a new function l in S via l(n)=fg(n)(h(n)) (where here the exponent represents iteration).
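The rule above is easy to sketch in code. Here's a small Python rendering (my own illustration; the names `zero`, `one`, `ident`, `succ`, and `combine` are mine, not from the original question):

```python
# The four starting functions of S...
def zero(n): return 0
def one(n): return 1
def ident(n): return n
def succ(n): return n + 1

# ...and the one combination rule: l(n) = f^{g(n)}(h(n)),
# where the exponent means iteration (apply f, g(n) times, to h(n)).
def combine(f, g, h):
    def l(n):
        x = h(n)
        for _ in range(g(n)):
            x = f(x)
        return x
    return l
```

For instance, `combine(succ, ident, ident)` applies the successor n times to n, giving the doubling function.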
Now, I haven't even stated the actual question yet, but let's take note -- of course the same thing could be considered, but with 0 excluded from N, and constant 0 removed from the starting functions. But the results will have a very different flavor. Do you see why?
If you've ever seen Church numerals before, you're familiar with the trick -- there's a lot of power in the ability to iterate a function 0 times. Rather, to iterate it a number of times which is potentially 0 and potentially not. Especially when that function is idempotent, so that the 0-vs-nonzero distinction is all that matters. Such as, say, when the function is constant.
This allows you to build all sorts of conditionals that you couldn't without 0. The important thing is not that constant 0 function; the important part is the identity function's ability to be 0 some of the time.
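To make the trick concrete, here's one such conditional, sketched with the same hypothetical helpers as above (my own illustration, not a construction from the original post): iterating the constant-1 function n times, starting from 0, yields a "sign" test -- since the constant function is idempotent, all that matters is whether it gets applied at all.

```python
def zero(n): return 0
def one(n): return 1
def ident(n): return n

# The combination rule l(n) = f^{g(n)}(h(n)), exponent = iteration.
def combine(f, g, h):
    def l(n):
        x = h(n)
        for _ in range(g(n)):
            x = f(x)
        return x
    return l

# sgn(n): apply the constant-1 function n times, starting from 0.
# Zero iterations leaves 0; one or more iterations gives 1.
sgn = combine(one, ident, zero)
```

This is exactly the 0-vs-nonzero distinction doing all the work: remove 0 from N and the test collapses.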
So let me state now what the question originally was. Well, OK, a slightly changed form of it. The poster asked, suppose we consider this set with the order f≤g if f≤g eventually. Is this a well-order?
Well, in fact it's not even an order; you can find f and g in S with f≠g but f=g eventually. And if we take equivalence classes, it's still only a partial order. And it's not a well-partial-order -- there's an infinite antichain. Nor is it even a well-founded partial order; I managed to find an infinite decreasing sequence.
But I've actually changed the question slightly -- the original question attempted to define an order not by f≤g if f≤g eventually, but rather f<g if f<g eventually. To state that this is a total order is rather stronger than to state that the order I defined above is total! But just because you found an infinite decreasing sequence in the order I defined, doesn't mean you've found an infinite decreasing sequence in "this" order. But eventually I found this too, by finding a way to construct F_k(n)=max(0, 2(n-k)). So I think I've rather thoroughly answered the question as originally asked -- it's not even slightly a well-order[0].
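A quick numeric spot check of the order property (my own; the post doesn't show how these functions are built inside S, so this only verifies that the chain is eventually strictly decreasing, not the construction itself):

```python
# F_k(n) = max(0, 2(n - k)): the claimed infinite decreasing chain.
def F(k):
    return lambda n: max(0, 2 * (n - k))

# A finite proxy for "f < g eventually": check f(n) < g(n)
# on a window of values starting at `start`.
def eventually_below(f, g, start, span=50):
    return all(f(n) < g(n) for n in range(start, start + span))
```

For each k, once n > k we have F_{k+1}(n) = 2(n-k-1) < 2(n-k) = F_k(n), so F_0 > F_1 > F_2 > ... in the "eventually" order.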
(Of course, the obvious way to get a counterexample would be to construct the predecessor function and its iterates, but I think this is impossible. I thought I had a proof, actually, but it turned out to be mistaken.)
But if we remove 0, as mentioned above, the problem takes on a very different flavor; all these "programming" tricks become impossible. Well, perhaps there's some way to construct some "piecewise" things, but straight-up boolean tests are impossible; you can show, for instance, that with this restriction, every function you can make is either constant, is the identity function, or is strictly monotonic and satisfies f(n)≥n+1. And in this setting, the poster's original idea -- that his stronger order not only totally orders S but also well-orders it -- seems a lot more plausible. Notice that by what I said above, constructing a predecessor function is *definitely* impossible in this setting (as are the functions I used to actually pull it off, though again, that doesn't rule out doing a similar thing with larger functions; it just makes it seem very unlikely).
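Here's a small finite-range spot check of that trichotomy (mine, and of course not a proof -- just a sanity check on a couple of functions buildable without 0, using hypothetical helper names):

```python
# In the 0-free setting the starting functions are constant 1,
# the identity, and the successor; the combination rule is unchanged.
def one(n): return 1
def ident(n): return n
def succ(n): return n + 1

def combine(f, g, h):
    def l(n):
        x = h(n)
        for _ in range(g(n)):
            x = f(x)
        return x
    return l

double = combine(succ, ident, ident)  # succ applied n times to n: 2n
const1 = combine(one, ident, ident)   # for n >= 1, at least one
                                      # application of constant 1, so
                                      # constant on the whole domain
```

On the domain n ≥ 1, `double` is strictly monotonic with double(n) ≥ n+1, and `const1` is constant -- each lands in one of the three classes, as claimed.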
But actually proving that would be hard, and, as I said, would have a very different flavor, so I'm not going to think about that...
-Harry
[0]Though my answer hasn't been accepted. :P
EDIT 31 Dec: Now it is. :)