TIA Exam 2 Problem 25

Saturday, March 21, 2015

The computer network for a company has two servers. The time until one server fails is uniformly distributed on [0,20], and the time until the other server fails is uniformly distributed on [0,30]. What is the variance of the time until one of the servers fails?



--------



I draw a picture: X runs from 0 to 20 on one axis, Y runs from 0 to 30 on the other, and I draw the line Y = X. Everything below this line means Y is less than X, so Y is the minimum; everything above it means X is less than Y, so X is the minimum. The joint density on the rectangle is 1/(20·30) = 1/600, but those numbers get messy, so I rescale both axes by 10 for now: X runs from 0 to 2, Y runs from 0 to 3, and the density is 1/(2·3) = 1/6. Hence, the expectation of the minimum is:



∫_0^2 ∫_0^x (y/6) dy dx + ∫_0^2 ∫_x^3 (x/6) dy dx
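Working those two pieces out, still in the rescaled units:

∫_0^2 ∫_0^x (y/6) dy dx = ∫_0^2 (x^2/12) dx = 2/9
∫_0^2 ∫_x^3 (x/6) dy dx = ∫_0^2 x(3 - x)/6 dx = 5/9

so the scaled expectation comes out to 7/9.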



Multiplying by the factor of 10 we divided out, the expectation of the minimum is therefore 70/9 ≈ 7.78. Here's where the confusion comes in: this is the expectation given in the solution for this part, but apparently this method is incorrect, because when I then try to get the expectation of the square (the second moment):



∫_0^2 ∫_0^x (y^2/6) dy dx + ∫_0^2 ∫_x^3 (x^2/6) dy dx



I get 2, so multiplying by 10 gives 20. The variance would then be 20 - 7.78^2, which is negative, and the universe implodes.
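For anyone who wants to poke at these integrals directly, here's a small sympy sketch that just evaluates the two double integrals as set up above (sympy assumed; this is only a check, not the exam approach):

from sympy import symbols, integrate, Rational

x, y = symbols('x y')

density = Rational(1, 6)  # joint density on the rescaled box [0, 2] x [0, 3]

# Scaled E[min]: y is the minimum below the line y = x, x is the minimum above it
e_min = (integrate(y * density, (y, 0, x), (x, 0, 2))
         + integrate(x * density, (y, x, 3), (x, 0, 2)))

# Same region split for the scaled second moment of the minimum
e_min_sq = (integrate(y**2 * density, (y, 0, x), (x, 0, 2))
            + integrate(x**2 * density, (y, x, 3), (x, 0, 2)))

print("scaled E[min]   =", e_min)
print("scaled E[min^2] =", e_min_sq)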



So I somehow used a 100% incorrect method and arbitrarily got the CORRECT expectation, but an INCORRECT second moment??
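In the meantime, a quick Monte Carlo sketch to sanity-check whatever the right numbers should be (Python with numpy assumed):

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

x = rng.uniform(0, 20, n)   # time until the first server fails, U[0, 20]
y = rng.uniform(0, 30, n)   # time until the second server fails, U[0, 30]

t = np.minimum(x, y)        # time until one of the servers fails

print("E[min]   ~", t.mean())     # should land near the 7.78 the solution quotes
print("E[min^2] ~", (t**2).mean())
print("Var[min] ~", t.var())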




