# Random Variables Assignment Help

## Content

Random Variables

A random variable over a sample space is a function that maps every sample point, or outcome, to a real number. Numerical values are assigned to outcomes in many experiments: a random variable is a function that assigns a numeric value to every possible outcome of a random experiment. These numbers are called the values of the random variable, and the variable itself is often denoted by X or Y. A random variable is therefore not a variable in the usual sense, but a function that maps events or outcomes to numbers. For example, if a coin is tossed three times in a row and the possible outcomes are noted, the sample space for the experiment is S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. If X is the random variable giving the number of heads, then the set of values X can take is SX = {0, 1, 2, 3}.

Discrete Random Variables

A discrete random variable can take only finite, specific, isolated numeric values, like the outcome of a roll of a die or the number of dollars in a randomly chosen bank account.

Continuous Random Variables

A continuous random variable, on the other hand, can take infinitely many values, like the temperature of a room over the course of a day or the height of a man in centimeters.

Sometimes it is necessary to add or subtract random variables. When this occurs, it is useful to know the mean and variance of the result.

Sums and Differences of Random Variables: Effect on the Mean

Suppose you have two variables: X with a mean of μx and Y with a mean of μy. Then the mean of the sum of these variables, μx+y, and the mean of the difference between these variables, μx-y, are given by the following equations:

μx+y = μx + μy

and

μx-y = μx - μy

The above equations for general variables also apply to random variables. If X and Y are random variables, then

E(X + Y) = E(X) + E(Y)

and

E(X - Y) = E(X) - E(Y)

where E(X) is the expected value (mean) of X, E(Y) is the expected value of Y, E(X + Y) is the expected value of X plus Y, and E(X - Y) is the expected value of X minus Y.

Independence of Random Variables

If two random variables, X and Y, are independent, they satisfy the following conditions.

P(x|y) = P(x), for all values of X and Y.

P(x ∩ y) = P(x) * P(y), for all values of X and Y.

The above conditions are equivalent: if either one is met, the other is also met, and X and Y are independent. If either condition is not met, X and Y are dependent. Note: If X and Y are independent, then the correlation between X and Y is equal to zero.

Sums and Differences of Random Variables: Effect on Variance

Suppose X and Y are independent random variables. Then the variance of (X + Y) and the variance of (X - Y) are described by the following equation:

Var(X + Y) = Var(X - Y) = Var(X) + Var(Y)

where Var(X + Y) is the variance of the sum of X and Y, Var(X - Y) is the variance of the difference between X and Y, Var(X) is the variance of X, and Var(Y) is the variance of Y.

Note: The standard deviation (SD) is always equal to the square root of the variance (Var). Thus,

SD(X + Y) = sqrt[ Var(X + Y) ]

and

SD(X - Y) = sqrt[ Var(X - Y) ]

Problem 1

The table below shows the joint probability distribution between two random variables X and Y. (In a joint probability distribution table, the numbers in the cells represent the probability that particular values of X and Y occur together.)

|       | X = 0 | X = 1 | X = 2 |
|-------|-------|-------|-------|
| Y = 3 | 0.1   | 0.2   | 0.2   |
| Y = 4 | 0.1   | 0.2   | 0.2   |

What is the mean of the sum of X and Y? (A) 1.2 (B) 3.5 (C) 4.5 (D) 4.7 (E) None of the above.

Solution

The correct answer is (D). The solution requires three computations: (1) find the mean (expected value) of X, (2) find the mean (expected value) of Y, and (3) find the sum of the means. Those computations are shown below, beginning with the mean of X.

E(X) = Σ [ xi * P(xi) ]
E(X) = 0 * (0.1 + 0.1) + 1 * (0.2 + 0.2) + 2 * (0.2 + 0.2) = 0 + 0.4 + 0.8 = 1.2

Next, we find the mean of Y.

E(Y) = Σ [ yi * P(yi) ]
E(Y) = 3 * (0.1 + 0.2 + 0.2) + 4 * (0.1 + 0.2 + 0.2) = (3 * 0.5) + (4 * 0.5) = 1.5 + 2 = 3.5

Finally, the mean of the sum of X and Y is equal to the sum of the means. Therefore,

E(X + Y) = E(X) + E(Y) = 1.2 + 3.5 = 4.7

Note: A similar approach is used to find differences between means. The difference between X and Y is E(X - Y) = E(X) - E(Y) = 1.2 - 3.5 = -2.3, and the difference between Y and X is E(Y - X) = E(Y) - E(X) = 3.5 - 1.2 = 2.3.
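The computation above can be sketched in a few lines of Python, storing the joint table as a dict keyed by (x, y) pairs (a hypothetical representation, not part of the original problem):

```python
# Joint distribution from Problem 1: keys are (x, y), values are P(x, y).
joint = {
    (0, 3): 0.1, (1, 3): 0.2, (2, 3): 0.2,
    (0, 4): 0.1, (1, 4): 0.2, (2, 4): 0.2,
}

# E(X) and E(Y) come from summing x * P(x, y) and y * P(x, y) over every cell;
# this implicitly collapses the joint table to the marginal distributions.
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())

# Linearity of expectation: E(X + Y) = E(X) + E(Y).
print(round(E_X, 10), round(E_Y, 10), round(E_X + E_Y, 10))
```

The rounding only guards against floating-point noise; the exact values are 1.2, 3.5, and 4.7, matching answer (D).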

Problem 2

The two tables below show the joint probability distributions of two pairs of random variables: X and Y, and A and B.

|       | X = 0 | X = 1 | X = 2 |
|-------|-------|-------|-------|
| Y = 3 | 0.1   | 0.2   | 0.2   |
| Y = 4 | 0.1   | 0.2   | 0.2   |

|       | A = 0 | A = 1 | A = 2 |
|-------|-------|-------|-------|
| B = 3 | 0.1   | 0.2   | 0.2   |
| B = 4 | 0.2   | 0.2   | 0.1   |

Which of the following statements are true?

I. X and Y are independent random variables.
II. A and B are independent random variables.

(A) I only (B) II only (C) I and II (D) Neither statement is true. (E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is (A). The solution requires several computations to test the independence of the random variables. Those computations are shown below. X and Y are independent if P(x|y) = P(x), for all values of X and Y. From the probability distribution table, we know the following:

P(x=0) = 0.2;  P(x=0 | y=3) = 0.2;  P(x=0 | y=4) = 0.2
P(x=1) = 0.4;  P(x=1 | y=3) = 0.4;  P(x=1 | y=4) = 0.4
P(x=2) = 0.4;  P(x=2 | y=3) = 0.4;  P(x=2 | y=4) = 0.4

Thus, P(x|y) = P(x) for all values of X and Y, which means that X and Y are independent. We repeat the same analysis to test the independence of A and B.

P(a=0) = 0.3;  P(a=0 | b=3) = 0.2;  P(a=0 | b=4) = 0.4
P(a=1) = 0.4;  P(a=1 | b=3) = 0.4;  P(a=1 | b=4) = 0.4
P(a=2) = 0.3;  P(a=2 | b=3) = 0.4;  P(a=2 | b=4) = 0.2

Thus, P(a|b) is not equal to P(a) for all values of A and B. For example, P(a=0) = 0.3, but P(a=0 | b=3) = 0.2. This means that A and B are not independent.
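The test above can be sketched as a small Python helper (the function name is hypothetical). It uses the equivalent product form of independence, P(x ∩ y) = P(x) * P(y), checking every cell of the joint table against the product of its marginals:

```python
def is_independent(joint, tol=1e-9):
    # Accumulate the marginal distributions P(x) and P(y) from the joint table.
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    # Independent iff P(x, y) = P(x) * P(y) holds in every cell
    # (tol absorbs floating-point rounding).
    return all(abs(p - p_x[x] * p_y[y]) < tol for (x, y), p in joint.items())

# The two tables from Problem 2, keyed by (x, y) and (a, b) respectively.
xy = {(0, 3): 0.1, (1, 3): 0.2, (2, 3): 0.2,
      (0, 4): 0.1, (1, 4): 0.2, (2, 4): 0.2}
ab = {(0, 3): 0.1, (1, 3): 0.2, (2, 3): 0.2,
      (0, 4): 0.2, (1, 4): 0.2, (2, 4): 0.1}
print(is_independent(xy), is_independent(ab))  # True False
```

Comparing against the product of marginals avoids dividing by P(y) to form the conditionals, so it also behaves sensibly when some marginal probability is zero.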

Problem 3

Suppose X and Y are independent random variables. The variance of X is equal to 16, and the variance of Y is equal to 9. Let Z = X - Y. What is the standard deviation of Z? (A) 2.65 (B) 5.00 (C) 7.00 (D) 25.0 (E) It is not possible to answer this question, based on the information given.

Solution

The correct answer is (B). The solution requires us to recognize that Z is a combination of two independent random variables. As such, the variance of Z is equal to the variance of X plus the variance of Y:

Var(Z) = Var(X) + Var(Y) = 16 + 9 = 25

The standard deviation of Z is equal to the square root of the variance. Therefore, the standard deviation of Z is the square root of 25, which is 5.
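The same arithmetic in a couple of lines of Python, mainly to emphasize that variances add even when the variables are subtracted:

```python
import math

# Z = X - Y with X and Y independent: Var(Z) = Var(X) + Var(Y).
var_X, var_Y = 16, 9
var_Z = var_X + var_Y
sd_Z = math.sqrt(var_Z)
print(var_Z, sd_Z)  # 25 5.0
```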

A Sample Question

The lifetimes of a certain model of light bulb are distributed normally with mean 1100 hours and variance 625. What is the chance that a randomly picked light bulb lasts over 1050 hours?

ANSWER

Let X be the lifetime of a light bulb in hours. We are told X ~ N(1100, 625), so the standard deviation is sqrt(625) = 25. We want P(X > 1050). To solve this, standardize and then look up the probability in the standard normal table:

P(X > 1050) = P(Z > (1050 - 1100)/25) = P(Z > -2) = P(Z < 2)

where the last equality uses the fact that the standard normal pdf is symmetric about zero. Finally, P(Z < 2) = 0.9772.

Given that the random variable X is binomially distributed with n = 100 and p = 0.4, use a normal approximation to find (a) P(X ≥ 50) (b) P(X = 40).

ANSWER

We are told X ~ Bin(100, 0.4). Using the normal approximation, X behaves like Y, a normally distributed random variable with mean np = 100 * 0.4 = 40 and variance np(1-p) = 100 * 0.4 * 0.6 = 24. So Y ~ N(40, 24).

(a) The main step is the normal approximation with a continuity correction: P(X ≥ 50) ≈ P(Y > 49.5). We then calculate the probability by standardizing: P(Y > 49.5) = P(Z > (49.5 - 40)/sqrt(24)) = P(Z > 1.939) = 0.026.

(b) Applying the normal approximation to the point probability: P(X = 40) ≈ P(39.5 < Y < 40.5). Then standardizing: P(39.5 < Y < 40.5) = P(-0.102 < Z < 0.102) = 0.081.
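Both answers can be reproduced with Python's standard-library `statistics.NormalDist`, which replaces the printed standard normal table. Note that `NormalDist` takes the standard deviation, not the variance:

```python
from statistics import NormalDist

# Bulb lifetime: X ~ N(1100, 625), so sigma = sqrt(625) = 25.
bulb = NormalDist(mu=1100, sigma=25)
p_over_1050 = 1 - bulb.cdf(1050)

# Binomial(100, 0.4) approximated by Y ~ N(40, 24), with continuity correction.
approx = NormalDist(mu=40, sigma=24 ** 0.5)
p_at_least_50 = 1 - approx.cdf(49.5)                 # P(X >= 50) ~ P(Y > 49.5)
p_exactly_40 = approx.cdf(40.5) - approx.cdf(39.5)   # P(X = 40) ~ P(39.5 < Y < 40.5)

print(round(p_over_1050, 4))    # 0.9772
print(round(p_at_least_50, 3))  # 0.026
print(round(p_exactly_40, 3))   # 0.081
```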
