From nowick@cs.columbia.edu Wed Jun 14 16:55:46 1995
Date: Wed, 14 Jun 95 11:48:20 EDT
From: Steven Nowick
To: asynchronous-private@cs.columbia.edu
Subject: [Mark Greenstreet : Carry-chain length]
Reply-To: asynchronous@hohum.stanford.edu
Date: Tue, 13 Jun 1995 22:24:57 -0700
From: Mark Greenstreet
Message-Id: <199506140524.WAA21520@grolsch.cs.ubc.ca>
To: asynchronous@hohum.stanford.edu
Subject: Carry-chain length
> Does anybody know/have the reference to the von Neumann paper where he
> proves that the average length of the longest carry chain in an
> addition is logarithmic in the number of bits?
I'm at home and don't have the reference here, but if all you want to know
is that it's logarithmic and you don't need the constant, you can do the proof
on the back of a postage stamp:
general observations:
I assume that you assume uniformly distributed operands in the range
0..(2^n - 1). At each stage, there are eight combinations of carry-in
and the bits from the two input words; four of these produce a carry-out,
and four do not. A simple Markov chain argument shows that the probability
of a carry-out from the j^th stage approaches 0.5 exponentially as j goes
to infinity, regardless of the carry-in at stage 0.
It is also easy to see that a stage has probability 0.5 of propagating
a carry (regardless of its position in the chain).
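Both observations can be checked by brute-force enumeration of a single
full-adder stage; a quick sketch (the variable names are my own, not from
the argument above):

```python
from itertools import product

# One full-adder stage: enumerate all 8 combinations of (a_bit, b_bit, carry_in).
carry_outs = 0
propagates = 0
for a, b, cin in product((0, 1), repeat=3):
    carry_outs += (a + b + cin) >= 2  # carry-out is the majority of the three inputs
    propagates += a ^ b               # the stage propagates its carry-in iff a != b
print(f"{carry_outs} of 8 combinations produce a carry-out")    # 4 of 8
print(f"{propagates} of 8 combinations propagate the carry-in") # 4 of 8
```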
lower bound:
What is the probability that there is a carry chain of length at least
log2(n)/2? A lower bound is obtained by considering 2n/log2(n)
non-overlapping segments, each of length log2(n)/2. Each such segment has
probability 0.5/sqrt(n) of propagating a carry across its full length
(the leading 0.5 is the probability of having a carry in; the 1/sqrt(n)
is 2^(-log2(n)/2) for the log2(n)/2 propagate stages). The probability
that no segment propagates a carry across its full length is then
(1 - 0.5/sqrt(n))^(2n/log2(n)),
which is at most exp(-sqrt(n)/log2(n)) and so goes to 0 as n grows. So,
for all sufficiently large n, the average length of the longest carry
chain is at least
(1 - exp(-sqrt(n)/log2(n))) * 0.5 * log2(n),
i.e. within a constant factor of log2(n) from below.
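As a sanity check on the segment argument, one can tabulate the probability
that every one of the 2n/log2(n) segments fails to propagate a carry across
its full length, i.e. (1 - 0.5/sqrt(n))^(2n/log2(n)); a small sketch (the
choice of sample sizes n is mine):

```python
import math

# Probability that every one of the ~2n/log2(n) disjoint segments of length
# log2(n)/2 fails to propagate a carry across its full length.  Each segment
# succeeds with probability 0.5/sqrt(n): 0.5 for a carry-in at its low end,
# 2^(-log2(n)/2) for propagating through log2(n)/2 stages.
p_none = {}
for n in (16, 256, 4096, 65536):
    segments = 2 * n / math.log2(n)
    p_segment = 0.5 / math.sqrt(n)
    p_none[n] = (1 - p_segment) ** segments
    print(f"n = {n:6d}   P(no segment propagates) = {p_none[n]:.3e}")
```

The probability vanishes quickly, so for moderate n there is almost surely
some chain of length at least log2(n)/2.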
upper bound:
What is the probability that there is a carry chain of length greater
than 2*log2(n)? An upper bound is obtained by considering the (at most) n
overlapping chains of length 2*log2(n). Each has probability 0.5/(n^2) of
propagating a carry across the chain, so by the union bound the probability
that any of them does is at most 0.5/n. Since the length of a carry chain
is at most n, chains of length greater than 2*log2(n) contribute at most
n * (0.5/n) = 0.5 to the average. So, the length of the average carry
chain is at most
2*log2(n) + 0.5.
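Both bounds can be compared against a Monte Carlo estimate of the average
longest carry chain, using the usual generate/propagate/kill classification
of each stage; a sketch (function name, seed, and trial counts are my own):

```python
import math
import random

def longest_carry_chain(a, b, n):
    """Length of the longest run of stages through which a single carry ripples."""
    best = cur = 0
    for i in range(n):
        ai, bi = (a >> i) & 1, (b >> i) & 1
        if ai and bi:            # generate: a fresh carry starts at this stage
            cur = 1
        elif ai or bi:           # propagate: an incoming carry ripples one stage on
            cur = cur + 1 if cur else 0
        else:                    # kill: any incoming carry dies here
            cur = 0
        best = max(best, cur)
    return best

random.seed(0)
trials = 2000
for n in (16, 64, 256):
    avg = sum(longest_carry_chain(random.getrandbits(n), random.getrandbits(n), n)
              for _ in range(trials)) / trials
    print(f"n = {n:3d}   average longest chain = {avg:5.2f}   "
          f"upper bound 2*log2(n)+0.5 = {2 * math.log2(n) + 0.5:5.2f}")
```

The measured averages track log2(n) closely, sitting between the two bounds.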
This gives upper and lower bounds that agree to within a small constant factor.
I haven't read the von Neumann paper, but it certainly gives tighter
bounds since he was a better mathematician than I, and he probably used
a napkin instead of a postage stamp for his proof.
--Mark