Probability Part III


Complementary Events:
    Another important concept in probability theory is that of complementary events. When a die is rolled, for instance, the sample space consists of the outcomes 1, 2, 3, 4, 5, and 6. The event E of getting an odd number consists of the outcomes 1, 3, and 5. The event of not getting an odd number is called the complement of event E, and it consists of the outcomes 2, 4, and 6.

The complement of an event E is the set of outcomes in the sample space that are not included in the outcomes of event E. The complement of E is denoted by E̅ (read "E bar").

The following example further illustrates the concept of complementary events.

Find the complement of each event.
a. Rolling a die and getting a 4.

b. Selecting a letter of the alphabet and getting a vowel.

c. Selecting a month and getting a month that begins with a J.

d. Selecting a day of the week and getting a weekday.

Solution

a.     S = {1, 2, 3, 4, 5, 6}
       The complement of getting a 4 is
       getting a 1, 2, 3, 5, or 6.

b.    S = {A, B, C, D, ..., Z}
       The complement of getting a vowel (A, E, I, O, or U) is
       getting a consonant (assume Y is a consonant).

c.    S = {Jan, Feb, Mar, Apr, May, June, July, Aug, Sep, Oct, Nov, Dec}
       The complement of getting a month that begins with J (Jan, June, or July) is
       getting February, March, April, May, August, September, October, November, or December.

d.    S = {Sun, Mon, Tue, Wed, Thu, Fri, Sat}
       The complement of getting a weekday is
       getting Saturday or Sunday.


The outcomes of an event and the outcomes of its complement make up the entire sample space.
 For example, if two coins are tossed, the sample space is HH, HT, TH, and TT. The complement of "getting all heads" is not "getting all tails," since the "all heads" event is HH, and the complement of HH is HT, TH, and TT. Hence, the complement of the event "all heads" is the event "getting at least one tail."
Since the event and its complement make up the entire sample space, it follows that the sum of the probability of the event and the probability of its complement will equal 1.
   That is, P(E) + P(E̅) = 1.
In the above example,
let E = all heads, or HH, and let E̅ = at least one tail, or HT, TH, TT.
Then
          P(E) = 1/4

          P(E̅) = 3/4

Hence, P(E) + P(E̅) = 1/4 + 3/4 = 1.
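
The sum above can also be checked by direct enumeration. The following is a small sketch (not from the original text) that lists the two-coin sample space and verifies that the event and its complement account for all of it.

from fractions import Fraction
from itertools import product

# Sample space for tossing two coins: HH, HT, TH, TT.
sample_space = list(product("HT", repeat=2))

all_heads = [s for s in sample_space if s == ("H", "H")]
at_least_one_tail = [s for s in sample_space if "T" in s]

p_e = Fraction(len(all_heads), len(sample_space))               # 1/4
p_e_bar = Fraction(len(at_least_one_tail), len(sample_space))   # 3/4
print(p_e, p_e_bar, p_e + p_e_bar)                              # 1/4 3/4 1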

Remark: The symbol / denotes division; for example, 1/4 means one-fourth.

The rule for complementary events can be stated algebraically in three ways.

Rule for Complementary Events:

P(E̅) = 1 - P(E)
        or
P(E) = 1 - P(E̅)
        or
P(E) + P(E̅) = 1

Stated in words, the rule is: if the probability of an event or the probability of its complement is known, then the other can be found by subtracting that probability from 1. This rule is important in probability theory because at times the best solution to a problem is to find the probability of the complement of an event and then subtract it from 1 to get the probability of the event itself.


Example:
If the probability that a person lives in an industrialized country of the world is 1/5,
find the probability that a person does not live in an industrialized country.

Solution:
P(not living in an industrialized country) = 1- P (living in an industrialized country)
 
= 1 - 1/5
=    4/5.
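
As a small sketch (not part of the original text), the same subtraction can be done exactly with Python's fractions module:

from fractions import Fraction

p_industrialized = Fraction(1, 5)
p_not_industrialized = 1 - p_industrialized   # complement rule: P(E̅) = 1 - P(E)
print(p_not_industrialized)                   # 4/5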
    
   
              Probabilities can be represented pictorially by Venn diagrams. Figure (a) below shows the probability of a simple event E. The area inside the circle represents the probability of event E, that is, P(E).
 The area inside the rectangle represents the probability of all the events in the sample space, P(S).
     The Venn diagram that represents the probability of the complement of an event, P(E̅), is shown in Figure (b).
 In this case, P(E̅) = 1 - P(E),
which is the area inside the rectangle but outside the circle representing P(E).
Recall that    P(S) = 1
        and
                    P(E) = 1 - P(E̅).
The reasoning is that P(E) is represented by the area of the circle and P(E̅) is the probability of the events that are outside the circle.

[Figure: Venn diagrams for the probability of an event, P(E), and of its complement, P(E̅)]

Empirical Probability:
   The difference between classical and empirical probability is that classical probability assumes that certain outcomes are equally likely (such as the outcomes when a die is rolled), while empirical probability relies on actual experience to determine the likelihood of outcomes.

      In empirical probability, one might actually roll a given die 6000 times, observe the various frequencies, and use these frequencies to determine the probability of an outcome.

    Suppose, for example, that a researcher asked 25 people if they liked the taste of a new soft drink. The responses were classified as "yes," "no," or "undecided." The results were categorized in a frequency distribution, as shown.
Response        Frequency
Yes             15
No              8
Undecided       2
Total           25

Probabilities now can be compared for the various categories.
For example,
the probability of selecting a person who liked the taste is 15/25, or 3/5, since 15 out of 25 people in the survey answered "yes."

Formula for Empirical Probability:
 
 Given a frequency distribution, the probability of an event being in a class is

P(E) = frequency for the class / total frequencies in the distribution = f/n

This probability is called empirical probability and is based on observation.
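
As an illustrative sketch (not part of the original text), the formula can be coded directly; the dictionary below is just the soft-drink frequency distribution shown above.

from fractions import Fraction

# Frequency distribution from the soft-drink survey.
freq = {"yes": 15, "no": 8, "undecided": 2}
n = sum(freq.values())                       # total frequencies = 25

def empirical_probability(class_name):
    # P(E) = frequency for the class / total frequencies in the distribution
    return Fraction(freq[class_name], n)

print(empirical_probability("yes"))          # 3/5
print(empirical_probability("no"))           # 8/25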


Example 1:
In the soft-drink survey just described, find the probability that a person responded "no."

Solution:

      P(E)  =  f/n = 8/25

Note: This is the same as relative frequency.

Example 2:
In a sample of 50 people, 21 had type O blood, 22 had type A blood, 5 had type B blood, and 2 had type AB blood. Set up a frequency distribution and find the following probabilities:

a. A person has type O blood.

b. A person has type A or type B blood.

c. A person has neither type A nor type O blood.

d. A person does not have type AB blood.

Solution:

Type        Frequency
O           21
A           22
B           5
AB          2
Total       50

a.   P(O) = f/n =   21/50

b.  P(A or B) =22/50 + 5/50 = 27/50
(Add the frequencies of the two classes.)

c.  P(neither A nor O) =  5/50 + 2/50 = 7/50

    (Neither A nor O means that a person has either type B or type AB blood.)


d.  P(not AB) = 1 - P(AB) = 1 - 2/50 = 48/50 = 24/25

(Find the probability of not AB by subtracting the probability of type AB from 1.)
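
The same arithmetic can be sketched in Python (an illustrative aside, not the original author's code), using the blood-type frequency distribution above.

from fractions import Fraction

freq = {"O": 21, "A": 22, "B": 5, "AB": 2}
n = sum(freq.values())                                    # 50

p_O = Fraction(freq["O"], n)                              # a. 21/50
p_A_or_B = Fraction(freq["A"] + freq["B"], n)             # b. add the two class frequencies: 27/50
p_neither_A_nor_O = Fraction(freq["B"] + freq["AB"], n)   # c. B or AB blood: 7/50
p_not_AB = 1 - Fraction(freq["AB"], n)                    # d. complement rule: 24/25

print(p_O, p_A_or_B, p_neither_A_nor_O, p_not_AB)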



Example 3:
Hospital records indicated that maternity patients stayed in the hospital for the number
of days shown in the distribution.

Number of days stayed    Frequency
3                        15
4                        32
5                        56
6                        19
7                        5
Total                    127

Find these probabilities.
a. A patient stayed exactly 5 days.

b. A patient stayed less than 6 days.

c. A patient stayed at most 4 days.

d. A patient stayed at least 5 days.

Solution:

a.    P(5)= 56/127

b.    P(less than 6 days) = 15/127 + 32/127 +56/127 = 103/127

  (Less than 6 days means either 3, 4, or 5 days.)


c.     P(at most 4 days) = 15/127 +32/127 = 47/127
   
    (At most 4 days means 3 or 4 days.)

d.      P(at least 5 days)  = 56/127 +19/127 + 5/127 = 80/127

    (At least 5 days means either 5, 6, or 7 days.)
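
A brief sketch (not from the original text) showing how conditions like "less than," "at most," and "at least" translate into sums of class frequencies over the distribution above.

from fractions import Fraction

days_freq = {3: 15, 4: 32, 5: 56, 6: 19, 7: 5}
n = sum(days_freq.values())                               # 127

def prob(condition):
    # Sum the frequencies of all classes that satisfy the condition.
    return Fraction(sum(f for d, f in days_freq.items() if condition(d)), n)

print(prob(lambda d: d == 5))      # a. exactly 5 days   -> 56/127
print(prob(lambda d: d < 6))       # b. less than 6 days -> 103/127
print(prob(lambda d: d <= 4))      # c. at most 4 days   -> 47/127
print(prob(lambda d: d >= 5))      # d. at least 5 days  -> 80/127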



Empirical probabilities can also be found using a relative frequency distribution, as shown below.

For example, the relative frequency distribution of the soft-drink data shown before is

Response      Frequency    Relative frequency
Yes           15           0.60
No            8            0.32
Undecided     2            0.08
Total         25           1.00

Hence, the probability that a person responded "no" is 0.32, which is equal to 8/25.
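
As a small illustrative sketch (not part of the original text), each relative frequency is simply the class frequency divided by the total, and the relative frequencies sum to 1:

freq = {"Yes": 15, "No": 8, "Undecided": 2}
n = sum(freq.values())                              # 25

relative = {response: f / n for response, f in freq.items()}
print(relative)                                     # {'Yes': 0.6, 'No': 0.32, 'Undecided': 0.08}
print(sum(relative.values()))                       # 1.0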


Law of Large Numbers:
When a coin is tossed one time, it is common knowledge that the probability of getting a head is 1/2. But what happens when the coin is tossed 50 times? Will it come up heads 25 times? Not all of the time. One should expect about 25 heads if the coin is fair, but due to chance variation exactly 25 heads will not occur most of the time.

     If the empirical probability of getting a head is computed using a small number of trials, it is usually not exactly 1/2. However, as the number of trials increases, the empirical probability of getting a head will approach the theoretical probability 1/2, if in fact the coin is fair (i.e., balanced). This phenomenon is an example of the law of large numbers. In other words, if one tosses a coin enough times, the number of heads and tails will tend to "even out." This law holds for any type of gambling game, such as tossing dice, playing roulette, and so on.
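
A minimal simulation sketch (not from the original text) of this convergence; the trial counts and random seed chosen below are arbitrary.

import random

random.seed(1)  # arbitrary seed so the run is repeatable

for n_tosses in (10, 100, 10_000, 1_000_000):
    heads = sum(random.random() < 0.5 for _ in range(n_tosses))
    # The empirical probability of a head tends toward the theoretical 1/2 as n grows.
    print(n_tosses, heads / n_tosses)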

       It should be pointed out that the probabilities that the proportions steadily approach
may or may not agree with those theorized in the classical model. If not, this can have important implications, such as "the die is not fair." Pit bosses in Las Vegas watch for empirical trends that do not agree with classical theory, and they will sometimes take a set of dice out of play if the observed frequencies are too far out of line with the classical expected frequencies.


Subjective Probability:
    The third type of probability is called subjective probability. Subjective probability
uses a probability value based on an educated guess or estimate, employing opinions
and inexact information.
      In subjective probability, a person or group makes an educated guess at the chance
that an event will occur. This guess is based on the person's experience and evaluation
of a situation.
For example,
     A sportswriter may say that there is a 70% probability that India will win the World Cup next year.
    A physician might say that, on the basis of her diagnosis, there is a 30% chance the patient will need an operation.
    A seismologist might say there is an 80% probability that an earthquake will occur in a certain area.
These are only a few examples of how subjective probability is used in everyday life.

Remark: All three types of probability (classical, empirical, and subjective) are used to solve
a variety of problems in business, engineering, and other fields.



          Leave Comments and Suggestions
                   Keep Sharing
"You will lose nothing by lighting another's candle"
                    A R Statistics
