The heights of the apple trees in an orchard are normally distributed with a mean of 12.5 feet and a standard
deviation of 1.2 feet. What percentage of the apple trees are between 10.1 and 13.7 feet tall?


Answer:

81.859%

Step-by-step explanation:

We start by calculating the z-score for each height.

Mathematically:

z-score = (x - mean)/SD

for 10.1

z = (10.1-12.5)/1.2 = -2

for 13.7

z = (13.7-12.5)/1.2 = 1.2/1.2 = 1

So we need the probability that a tree's height falls within this range:

P(-2 < Z < 1)

We can find the probabilities from the standard normal distribution table:

P(-2 < Z < 1) = P(Z < 1) - P(Z < -2) = 0.84134 - 0.02275 = 0.81859

Converting this to a percentage, we have 81.859%, so about 81.86% of the apple trees are between 10.1 and 13.7 feet tall.
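The table lookup can be verified numerically. Here is a short sketch using only the Python standard library, where the standard normal CDF is built from the error function (the variable names are my own):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

mean, sd = 12.5, 1.2
lo, hi = 10.1, 13.7

# z-scores for the two heights
z_lo = (lo - mean) / sd   # -2.0
z_hi = (hi - mean) / sd   #  1.0

# probability of a height falling between lo and hi
p = phi(z_hi) - phi(z_lo)
print(round(p, 5))        # ≈ 0.81859, i.e. about 81.86%
```

This agrees with the table values P(Z < 1) = 0.84134 and P(Z < -2) = 0.02275 used above.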