Subject: linearity of expectation
propp@godel.mit.edu (Jim Propp) writes:
>In elementary probability one often encounters a random variable whose
>expected value is easy to work out if you write the random variable as
>a sum of suitably chosen random variables and use linearity of expectation,
>but is hard to work out if you work directly with the definition of
>expected value. I want to show my students at least one such problem.
>Any recommendations?
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
L i n e a r i t y o f E x p e c t a t i o n
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Article: 36643 of sci.math
From: kranzer@adx.adelphi.edu (Herbert Kranzer)
Organization: Adelphi University
Date: Wed, 20 Oct 1993 21:05:17 GMT
A very simple example is the binomial distribution, whose random
variable X can be regarded as the sum of n independent Bernoulli
variables with mean p and variance pq. This is a much shorter
route to both the mean E(X) = np and the variance Var(X) = npq
than either of the standard derivations (combinatorial or calculus).
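A minimal Python sketch of the comparison (n and p chosen arbitrarily): the direct route sums over the binomial pmf, while linearity of expectation gives np and npq with no summation at all.

```python
from math import comb, isclose

def binomial_moments(n, p):
    """Mean and variance computed the hard way, directly from the pmf sum."""
    pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    mean = sum(i * q for i, q in enumerate(pmf))
    var = sum((i - mean) ** 2 * q for i, q in enumerate(pmf))
    return mean, var

n, p = 10, 0.3
mean, var = binomial_moments(n, p)
# Linearity of expectation predicts np and npq immediately:
assert isclose(mean, n * p)
assert isclose(var, n * p * (1 - p))
```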
Herbert Kranzer Dept. of Math. & Computer Sci.
kranzer@adx.adelphi.edu Adelphi University
Garden City, NY 11530
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Article: 36693 of sci.math
From: dph@wind.bellcore.com (Daniel P Heyman)
Organization: Bellcore
Date: Thu, 21 Oct 1993 16:06:13 GMT
Another example is the geometric distribution, P[X=i]=(1-p)p^(i-1),
i=1,2,... . Using the memoryless property and the linearity of
expectation, we get E[X] = 1 + p E[X], so E[X] = 1/(1-p). This is much
easier than using the definition and evaluating the infinite sum. There
is also a
story one can make out of this. You're in a room with two doors; one
door leads to freedom and the other door leads to a tunnel that eventually
returns to the room. The tunnel is attached to the left door with
probability p, and is reconnected each time you try to leave the room, all
connections being i.i.d. What is the expected time until you get out of
the room? (I'm not a good story teller, so you might want to improve on my
descriptions. The crucial point is that you return to the room with
probability p.)
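A quick Python check of the claim (p = 0.5 is arbitrary): truncate the infinite sum from the definition and compare it with the one-line answer 1/(1-p) from memorylessness plus linearity.

```python
p = 0.5  # probability the tunnel door sends you back into the room
# Direct route: truncate the infinite sum  sum_i i * (1-p) * p^(i-1).
direct = sum(i * (1 - p) * p ** (i - 1) for i in range(1, 200))
# Linearity + memorylessness: E[X] = 1 + p*E[X], hence E[X] = 1/(1-p).
assert abs(direct - 1 / (1 - p)) < 1e-9
```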
--
Dan Heyman dph@bellcore.com
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Article: 36644 of sci.math
From: hrubin@pop.stat.purdue.edu (Herman Rubin)
Subject: Choice of definitions Was: linearity of expectation
Organization: Purdue University Statistics Department
Date: Wed, 20 Oct 1993 21:14:54 GMT
(Jim Propp) writes:
>In elementary probability one often encounters a random variable whose
>expected value is easy to work out if you write the random variable as
>a sum of suitably chosen random variables and use linearity of expectation,
>but is hard to work out if you work directly with the definition of
>expected value. I want to show my students at least one such problem.
>Any recommendations?
This is due to using a totally inappropriate "definition" of expectation.
The other "definitions" normally used in the textbooks are equally bad.
The use of these concept-obscuring cutenesses pervades much of mathematics.
Suppose we go back to the basics. If we make a random variable a
(measurable) function on a probability space, we already have confusion,
and in fact we have this confusion if we have a unique probability space
for a problem. There are many representations of the same situation,
and the events and random variables should be considered the same, but
with different representations in the different models.
As for expectation, it is a special case of integration. The use of
the cute formulas, different for discrete and absolutely continuous
random variables, can only be confusing. The point is rather that the
choice of representation, if it is correct, does not affect the answers
to probability questions which can be stated in that representation.
Since integrals are additive, linearity of expectation follows at once.
How much of this can be treated completely in a given course depends on the level.
But the use of the definitions in terms of the probability mass function
for discrete random variables and probability density functions for
absolutely continuous ones is a total MISTAKE, even if most textbooks
do it this way.
One can illustrate the choice of sample space, and that the result is
independent, even at the high school algebra level for finite problems.
--
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
Phone: (317)494-6054
hrubin@snap.stat.purdue.edu (Internet, bitnet)
{purdue,pur-ee}!snap.stat!hrubin(UUCP)
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
Article: 36756 of sci.math
From: wft@math.canterbury.ac.nz (Bill Taylor)
Organization: Department of Mathematics, University of Canterbury
Date: Fri, 22 Oct 1993 03:31:13 GMT
Hey ! Someone asking about one of my favorite corners of probability !
Or better said: "...if you work directly with the distribution of the
original variable itself."
It often strikes me as little short of a miracle that so many expected values
can be simply calculated as such sums, where the original r.v. has a filthy
distribution function that may be difficult or impossible to determine in full.
One doesn't often get such an invitation to wave about one's choicest examples!
There are dozens of these; you'll find plenty in probability books. But here
are a few of my favorites.
-------
The binomial distribution has already been mentioned. This could be extended
to hypergeometric types such as...
m defectives, n good ones; choose k at random. What is Exp # defectives chosen?
answer: sum of k "zero-one" r-vbls... E = km/(m+n)
~~~~~~
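An exact Python check by brute force (m, n, k chosen small enough to enumerate every draw; the indicator argument sums k copies of P(a given draw is defective) = m/(m+n)):

```python
from itertools import combinations
from math import isclose

m, n, k = 3, 4, 2            # m defectives, n good ones, choose k
items = [1] * m + [0] * n    # 1 marks a defective
draws = list(combinations(items, k))   # all C(m+n, k) equally likely draws
exact = sum(sum(d) for d in draws) / len(draws)
assert isclose(exact, k * m / (m + n))   # indicator sum: k * m/(m+n)
```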
---------
n letters and n addressed envelopes; insert at random. What is Exp # correct?
answer: sum of n "zero-one" r-vbls... E = 1 . (independent of n !)
~~~~~~
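The envelope answer can be verified exactly in Python for a small n by averaging the number of fixed points over all n! insertions (each indicator has expectation 1/n, and there are n of them):

```python
from itertools import permutations
from math import isclose

n = 5
total = perms = 0
for perm in permutations(range(n)):
    perms += 1
    total += sum(perm[i] == i for i in range(n))   # count correct letters
assert isclose(total / perms, 1.0)   # E = 1, independent of n
```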
----------
Line out a shuffled deck of cards. What is Exp # adjacent cards of same rank?
answer: sum of 51 "zero-one" r-vbls... E = 3 . (+1/17 if the line is a circle!)
~~~~~~
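The full 52-card deck is too large to enumerate, but the indicator argument gives E = (N-1) * 3/(N-1) = 3 for any deck with four suits per rank, so a 2-rank, 8-card mini-deck makes an exhaustive Python check feasible:

```python
from itertools import permutations
from math import isclose

# Mini-deck: 2 ranks x 4 suits.  Each of the 7 adjacent pairs matches
# ranks with probability 3/7, so E = 7 * 3/7 = 3, just as for 52 cards.
rank = [r for r in range(2) for _ in range(4)]   # rank of each of 8 cards
total = count = 0
for perm in permutations(range(8)):
    count += 1
    total += sum(rank[perm[i]] == rank[perm[i + 1]] for i in range(7))
assert isclose(total / count, 3.0)
```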
----------
8 guys and 7 gals get 15 seats in a line at random. Exp # mixed adjacent pairs?
answer: sum of 14 "zero-one" r-vbls... E = 7+(7/15)
~~~~~~
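An exact Python check, enumerating all C(15,8) = 6435 seatings (each of the 14 adjacent pairs is mixed with probability 2*8*7/(15*14)):

```python
from itertools import combinations
from fractions import Fraction

guys, gals = 8, 7
N = guys + gals
total = count = 0
for positions in combinations(range(N), guys):   # seats taken by guys
    count += 1
    seats = [0] * N
    for p in positions:
        seats[p] = 1
    total += sum(seats[i] != seats[i + 1] for i in range(N - 1))
exact = Fraction(total, count)
assert exact == Fraction(112, 15)   # 7 + 7/15
```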
----------
k people in a lift at the ground floor. Each wants off at a random floor of
one of the n upper floors. What is expected # of lift stops ?
answer: sum of n "zero-one" r-vbls... E = n (1-(1-1/n)^k)
~~~~~~
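A small exact Python check (n and k arbitrary but small): average the number of distinct requested floors over all n^k equally likely rides, and compare with the indicator sum, where each floor is stopped at with probability 1 - (1 - 1/n)^k.

```python
from itertools import product
from math import isclose

n, k = 4, 3   # n upper floors, k passengers
rides = list(product(range(n), repeat=k))   # all n^k floor requests
exact = sum(len(set(r)) for r in rides) / len(rides)
assert isclose(exact, n * (1 - (1 - 1 / n) ** k))
```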
================
All of the above are examples where the summand random variables are "zero-one"
types; i.e. take on only two possible sample values. But there are others
as well. Another one of my pets is...
----------
Keep collecting cards, each one randomly chosen from a basic set of n cards.
What is Exp # collects till you have the full set?
answer: sum of n different geometric r-vbls... E = n(1 + 1/2 + 1/3 +...+ 1/n).
~~~~~~
This is nice, as it stresses how looooong it takes to get the last one or two,
(as we all know from childhood!)
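A deterministic Python check of the coupon-collector answer, computing E[T] independently via the identity E[T] = sum over t of P(T > t), with P(T > t) obtained from a small dynamic program over the number of distinct coupons held:

```python
from math import isclose

n = 6
p = [1.0] + [0.0] * n   # p[c] = P(exactly c distinct coupons after t draws)
expected = 0.0
for _ in range(2000):          # tail beyond 2000 draws is negligible
    expected += 1.0 - p[n]     # accumulate P(T > t)
    new = [0.0] * (n + 1)
    for c in range(n + 1):
        # stay at c (repeat coupon) or arrive from c-1 (new coupon)
        new[c] = p[c] * c / n + (p[c - 1] * (n - c + 1) / n if c else 0.0)
    p = new
harmonic = sum(1 / j for j in range(1, n + 1))
assert isclose(expected, n * harmonic, rel_tol=1e-6)   # sum of geometrics
```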
================
And if you're into continuous types, here are some more goodies...
----------
Mark a unit-length stick at n random points. What is Exp length of leftmost bit?
answer: sum of (n+1) bits is 1. Thus E(bit-length) = 1/(n+1).
~~~~~~
NOTE: bits are exchangeable; (think of the line bent into a circle, with the
join being an (n+1)th random point.)
This has a nice discrete equivalent: Exp # cards before first ace?
answer: 9.6
~~~~~~
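The discrete version can be checked exactly in Python by enumerating all C(52,4) placements of the aces (the 48 non-aces split into 5 exchangeable gaps, so each gap holds 48/5 cards on average):

```python
from itertools import combinations
from fractions import Fraction

cards, aces = 52, 4
total = hands = 0
for pos in combinations(range(cards), aces):   # sorted ace positions
    hands += 1
    total += pos[0]          # cards dealt before the first ace
assert Fraction(total, hands) == Fraction(48, 5)   # 9.6
```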
--------------
My most favorite of all: Solve the Buffon needle probability.
===================================
Grid lines unit distance apart; toss a unit needle; what prob it crosses a line?
answer: Let P = expected # crossings. (A unit needle on a unit grid can
~~~~~~ cross at most one line, so P is also the crossing probability.)
        Expected(# crossings from a needle of length L) = LP. ......(1)
(Why? think of the needle as a sum of little needles end to end !)
Now: the same answer holds even if the needle is bent, and even if it is
floppy, like spaghetti !! Why? exactly as before! Only the length need
be fixed.
So then: take a needle of length pi, and bend it into a half-unit circle!
Expected(# crossings) = 2. (The # crossings is constant in this case, of course!)
Thus 2 = pi.P (from line 1)
thus P = 2/pi
~~~~~~~~
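A deterministic numerical check of P = 2/pi (no randomness needed): condition on the needle's angle theta, uniform on [0, pi/2] by symmetry, with the centre's distance to the nearest line uniform on [0, 1/2]; the unit needle crosses iff that distance is at most (1/2) sin(theta), i.e. with conditional probability sin(theta). Midpoint-rule integration then averages sin(theta):

```python
from math import sin, pi, isclose

steps = 100_000
# Midpoint-rule average of sin(theta) over [0, pi/2] equals (2/pi) * 1.
P = sum(sin(pi / 2 * (i + 0.5) / steps) for i in range(steps)) / steps
assert isclose(P, 2 / pi, rel_tol=1e-8)
```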
====================== TA-DAH !!! =======================
Truly, that theorem, E[X1+X2+X3+...+Xn] = E[X1]+...+E[Xn] ,
is a little miracle !!!
-------------------------------------------------------------------------------
Bill Taylor wft@math.canterbury.ac.nz
-------------------------------------------------------------------------------
"We're pushing back the frontiers of knowledge!"
"Yes; but from which side?"
-------------------------------------------------------------------------------
From: cook@wolfram.com (Matthew Cook)
Subject: Re: A mating problem.
Date: 6/11/98 5:02 PM
Problem:
Suppose we have a bag with n red marbles and n blue ones,
and we mate these by picking out couples at random.
What is the expected number of red-blue couples?
Solution:
Considering x red marbles and y blue marbles (where x+y is even),
we can empirically find the following formula:
x y
Expected Number of Red-Blue Couples = ---------
x + y - 1
This can then be proven by a straightforward induction on x+y.
Or, here's the "nice" proof:
What is the chance of the first pair being a red-blue matching?
x y y x 2 x y
It is ----- --------- + ----- --------- == -------------------
x + y x + y - 1 x + y x + y - 1 (x + y) (x + y - 1)
But we can pick all the pairs at once, so really every pair must
have this probability of being a red-blue match, so the expected
number of red-blue matchings is that times the number of pairs:
2 x y x + y x y
------------------- . ----- == ---------
(x + y) (x + y - 1) 2 x + y - 1
The same strategy gives us the expected number of tricolor triples for
x red, y white, and z green marbles to be:
                  6 x y z                     x + y + z
   ------------------------------------------- . --------- ==
   (x + y + z) (x + y + z - 1) (x + y + z - 2)       3
             2 x y z
   == -------------------------------
      (x + y + z - 1) (x + y + z - 2)
                                                 2 n^3
   And in particular, for n of each, we get -------------
                                            (3n-1) (3n-2)
And why stop now -- picking k-tuples, starting with n of each,
(k-1)! n^k (kn-k)!
we expect to get ------------------ groups containing one of each color.
(kn-1)!
-Matthew Cook
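Both formulas can be checked exactly in Python for small counts by enumerating every ordering of the marbles and pairing (or tripling) them off in sequence, which is equivalent to a random matching:

```python
from itertools import permutations
from fractions import Fraction

# Red-blue couples with x = y = 3; the formula predicts x*y/(x+y-1) = 9/5.
x, y = 3, 3
mixed = count = 0
for order in permutations('R' * x + 'B' * y):
    count += 1
    mixed += sum(order[2 * i] != order[2 * i + 1] for i in range((x + y) // 2))
assert Fraction(mixed, count) == Fraction(x * y, x + y - 1)

# Tricolor triples with n = 2 of each colour; formula: 2 n^3 / ((3n-1)(3n-2)).
tri = count3 = 0
for order in permutations('RRWWGG'):
    count3 += 1
    tri += sum(len(set(order[3 * i:3 * i + 3])) == 3 for i in range(2))
assert Fraction(tri, count3) == Fraction(2 * 2**3, 5 * 4)   # 4/5
```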