ABSTRACT

This rule connects the union of events with ∨ applied to probabilities. It is the actual addition rule of probability, although another equation is usually called by this name.

Recall that independence of A and B means the same as P[AB] = P[A]P[B]. But from the result of the preceding paragraph, we could just as easily make the independence condition P[A∪B] = P[A] ∨ P[B]. To pursue this duality, suppose that for any A and B, I define π by

P[AB] = P[A]P[B]π.

Then π is a measure of the positive dependence between A and B. When π > 1, then P[AB] > P[A]P[B], or equivalently P[A|B] > P[A]. This is what positive dependence means: when one of the events has happened, the conditional probability of the other is increased. Further suppose I define ν by

P[A∪B] = P[A] ∨ P[B] ∨ ν.

Although it is not so obvious, ν is a measure of negative dependence between A and B. Substituting for the left side above,

P[B] ∨ P[A|B*] = P[A] ∨ P[B] ∨ ν,

and ∨-subtracting P[B] gives P[A|B*] = P[A] ∨ ν, so that ν > 0 implies P[A|B*] > P[A], or equivalently P[A|B] < P[A]. This is what negative dependence means: the occurrence of one event lowers the conditional probability of the other. A further way to see the role of ν is to verify that
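The π and ν relations can be checked numerically. The sketch below is a minimal illustration with made-up numbers; it assumes ∨ denotes the probabilistic sum a ∨ b = a + b − ab (the operation under which P[A∪B] = P[A] ∨ P[B] characterizes independence), and the function and variable names are my own.

```python
def v(a, b):
    """Assumed meaning of v: the probabilistic sum, dual to multiplication."""
    return a + b - a * b

def star(p):
    """Complement of a probability: p* = 1 - p."""
    return 1 - p

# A negatively dependent pair (the numbers are illustrative):
pA, pB, pAB = 0.5, 0.4, 0.1        # independence would give P[AB] = 0.2
pAuB = pA + pB - pAB               # inclusion-exclusion: P[AuB] = 0.8

# pi from P[AB] = P[A]P[B]pi; pi = 0.5 < 1 here
pi = pAB / (pA * pB)

# nu from P[AuB] = P[A] v P[B] v nu; since v(x, n) = x + (1 - x)n,
# nu = (P[AuB] - P[A] v P[B]) / (1 - P[A] v P[B])
x = v(pA, pB)
nu = (pAuB - x) / (1 - x)          # = 1/3 > 0: negative dependence

# Check P[A|B*] = P[A] v nu
pA_given_Bstar = (pA - pAB) / star(pB)
print(pA_given_Bstar, v(pA, nu))   # both 2/3, up to float rounding

# Check the complement identity P[A*B*] = P[A*] P[B*] nu*
print(star(pAuB), star(pA) * star(pB) * star(nu))   # both 0.2, up to rounding
```

Note that π = 0.5 < 1 and ν = 1/3 > 0 agree: this pair is negatively dependent by either measure, and in general π < 1 holds exactly when ν > 0.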

P[A*B*] = P[A*]P[B*]ν*.

Another important general formula from probability theory is the so-called chain rule,

P[A1A2A3⋯] = P[A1] P[A2|A1] P[A3|A1A2] ⋯,

which is obtained by simply using the multiplication rule over and over. The corresponding ∨-rule is

P[A1∪A2∪A3⋯] = P[A1] ∨ P[A2|A1*] ∨ P[A3|A1*A2*] ⋯,

which is obtained by similarly iterating the addition rule.
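Both chain rules can be verified on a small finite sample space. The sketch below is illustrative (the events and names are my own choices, not from the text), again taking ∨ to be the probabilistic sum a ∨ b = a + b − ab:

```python
import itertools

def v(a, b):
    """Assumed meaning of v: a v b = a + b - a*b."""
    return a + b - a * b

# Sample space: three fair coin flips, uniform probability.
omega = set(itertools.product([0, 1], repeat=3))

def P(E):
    return len(E) / len(omega)

def cond(E, F):
    """P[E|F] = P[EF] / P[F]."""
    return P(E & F) / P(F)

def star(E):
    """Complement of an event."""
    return omega - E

A1 = {w for w in omega if w[0] == 1}       # first flip is heads
A2 = {w for w in omega if sum(w) >= 2}     # at least two heads
A3 = {w for w in omega if w[2] == 0}       # third flip is tails

# Chain rule: P[A1 A2 A3] = P[A1] P[A2|A1] P[A3|A1 A2]
lhs = P(A1 & A2 & A3)
rhs = P(A1) * cond(A2, A1) * cond(A3, A1 & A2)
print(lhs, rhs)                            # both 1/8, up to float rounding

# v-rule: P[A1 u A2 u A3] = P[A1] v P[A2|A1*] v P[A3|A1* A2*]
lhs_u = P(A1 | A2 | A3)
rhs_u = v(v(P(A1), cond(A2, star(A1))), cond(A3, star(A1) & star(A2)))
print(lhs_u, rhs_u)                        # both 7/8, up to float rounding
```

The ∨-rule conditions each new event on the complements of all earlier events, because P[A∪B] = P[B] ∨ P[A|B*] is the form of the addition rule being iterated.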