Message #1852
From: David Vanderschel <DvdS@Austin.RR.com>
Subject: Re: [MC4D] Random Permutations and some Interesting References
Date: Sun, 07 Aug 2011 18:47:43 -0500
This is rather remotely related to cubing, so you may want
to skip it. It is a follow-up to my last post in this
thread, including a correction.
In my last message I quoted myself saying:
>In the MC2D state graph thread I wrote:
>>Another interesting point is that, with 5 unplaced cubies,
>>you can finish in only 2 steps whenever there are no stuck
>>cubies. In a random permutation of n things, the expected
>>number of unmoved things is 1 and the probability that all
>>are moved is 1/n; so it is fairly common to wind up with a
>>5-cycle when 5 cubies are unplaced. …
My statement that "the probability that all are moved is
1/n" is incorrect. The correct statement is that the
probability of an n-cycle is 1/n. (Permutations in which all
objects are moved are called derangements, and they are not
so easy to deal with.) It is still true that the probability
of a 5-cycle is 2/5 for a random _even_ permutation of 5
things: the probability of a 5-cycle among _all_ permutations
is 1/5, every 5-cycle is even, and even permutations make up
half of all permutations, so conditioning on evenness doubles
the probability. (A 5-cycle is also the only way an even
permutation of 5 things can move every object, which is what
mattered in the cubing context.)
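A small brute-force check over the 120 permutations of 5
objects makes these ratios concrete. (This sketch is my own
addition, not anything from the original thread; it just
enumerates permutations with Python's itertools.)

from itertools import permutations

def cycle_lengths(p):
    # Cycle lengths of the permutation p, where p[i] is the image of i.
    seen, lengths = set(), []
    for start in range(len(p)):
        if start in seen:
            continue
        length, i = 0, start
        while i not in seen:
            seen.add(i)
            i = p[i]
            length += 1
        lengths.append(length)
    return lengths

def is_even(p):
    # A permutation is even iff the sum of (cycle length - 1) is even.
    return sum(l - 1 for l in cycle_lengths(p)) % 2 == 0

n = 5
perms = list(permutations(range(n)))
evens = [p for p in perms if is_even(p)]
n_cycles = [p for p in perms if cycle_lengths(p) == [n]]
derangements = [p for p in perms if all(p[i] != i for i in range(n))]

print(len(n_cycles) / len(perms))        # 24/120 = 0.2    -> 1/n
print(len(derangements) / len(perms))    # 44/120 = 0.3667 -> not 1/n
even_ratio = len([p for p in n_cycles if is_even(p)]) / len(evens)
print(even_ratio)                        # 24/60  = 0.4    -> 2/5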
Regarding proving that the expected number of fixed positions
is 1, I wrote:
>…, I first searched on "random even permutation" to see
>if someone else had already analyzed the expected number of
>1-cycles issue for random even permutations. I did not
>find anything, but I was not so interested in the general
>case, so I was satisfied with the above enumeration. (And
>I think I now do know the answer in general, but I have not
>seen the proof.)
Now, I have _seen_ the proof! I emphasize "seen" because,
contrary to Melinda’s beliefs about how I think, a proof
came to me by visualizing something. The proof in the
Mackay paper I cited is in terms of probability, which
struck me as a somewhat odd way of looking at a rather
finite problem. As I was pondering Mackay’s proof, a simple
way to "see" it occurred to me:
Contemplating the "expected number of 1-cycles", one is
normally inclined to write something like
sum( k in [0,n] )( k*F(k) ) / n!,
where F(k) is the number of permutations for which exactly k
objects are fixed. That line of reasoning gets you in
trouble, because F(k) is not very tractable. However, the
sum in the numerator can also be seen to be
sum( over all permutations p )( f(p) ),
where f(p) is the number of positions that the permutation p
leaves fixed, since each value k occurs in the sum over p
exactly F(k) times (once for each permutation with f(p) = k).
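A few lines of Python (again my own sketch, not anything
from the cited paper) confirm that the two sums agree for a
small n:

from collections import Counter
from itertools import permutations

n = 4
perms = list(permutations(range(n)))

def f(p):
    # Number of positions that p leaves fixed.
    return sum(1 for i in range(n) if p[i] == i)

F = Counter(f(p) for p in perms)      # F[k] = number of perms with f(p) = k

lhs = sum(k * F[k] for k in F)        # sum over k of k*F(k)
rhs = sum(f(p) for p in perms)        # sum over all permutations of f(p)
print(lhs, rhs, lhs / len(perms))     # 24 24 1.0 -> expected value is 1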
The thing that I visualized was an n! x n matrix of 1s and
0s in which the rows correspond to permutations and the
columns correspond to the possible positions of the objects
being permuted. For each permutation, there is a 1 in the
column of each position that the permutation leaves fixed
(and 0s in all other columns). Thus f(p) is the row-sum of
the row for permutation p, and the sum of the f(p) is the
total number of 1s in the matrix.
The trick here is that it is much easier to count that
number of 1s by columns. For each position, there are
(n-1)! permutations which fix that position (the other n-1
objects may be permuted arbitrarily, and we don’t care how
many of them land in their own positions). So the n column
sums, each (n-1)!, add up to n!, and dividing by the n! in
the denominator gives the expected value 1.
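Here is a short sketch (my addition, with hypothetical
variable names) that builds that matrix explicitly for n = 4
and counts the 1s both ways:

from itertools import permutations
from math import factorial

n = 4
perms = list(permutations(range(n)))

# matrix[r][c] == 1 iff permutation number r leaves position c fixed
matrix = [[1 if p[c] == c else 0 for c in range(n)] for p in perms]

row_sums = [sum(row) for row in matrix]                 # f(p) for each p
col_sums = [sum(row[c] for row in matrix) for c in range(n)]

print(col_sums)                       # [6, 6, 6, 6]: each column sums to (n-1)!
print(sum(row_sums), sum(col_sums))   # 24 24: both count the 1s, n! in total
print(sum(row_sums) / factorial(n))   # 1.0: expected number of fixed positions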
The argument is almost the same when you restrict it to
even permutations (for n > 2). The even permutations which
fix a given position are just the even permutations of the
remaining n-1 objects, so the column sums are all (n-1)!/2;
the overall denominator is n!/2, and the ratio is again 1.
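The same kind of brute-force check works here (once more
just a sketch of mine, with the parity test done by counting
inversions):

from itertools import permutations
from math import factorial

def is_even(p):
    # Even iff the number of inversions is even.
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return inv % 2 == 0

n = 4
evens = [p for p in permutations(range(n)) if is_even(p)]
col_sums = [sum(1 for p in evens if p[c] == c) for c in range(n)]

print(col_sums)                              # [3, 3, 3, 3]: each is (n-1)!/2
print(sum(col_sums), factorial(n) // 2)      # 12 12: total 1s and denominator
print(sum(col_sums) / (factorial(n) // 2))   # 1.0: expectation is again 1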
Regards,
David V.