Radio signals travel at a rate of 3 x 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 x 10^6 meters? (Hint: Time is distance divided by speed)

A. 3.2 x 10^2 seconds
B. 3.2 x 10^-2 seconds
C. 3.13 x 10^1 seconds
D. 2.88 x 10^15 seconds

Answer:

Time = Distance / Speed   [as given in the hint]

Substituting the known values:

t = (9.6 × 10⁶) / (3 × 10⁸)

t = 3.2 × 10⁻² s
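As a quick sanity check, the division in scientific notation can be verified with a short script (a minimal sketch; the variable names are mine, not from the problem):

```python
# Time for a radio signal to travel from the satellite to Earth's surface.
speed = 3e8       # signal speed: 3 x 10^8 m/s
distance = 9.6e6  # satellite altitude: 9.6 x 10^6 m

t = distance / speed  # time = distance / speed
print(t)              # 0.032, i.e. 3.2 x 10^-2 seconds
```

Note that dividing the powers of ten directly gives the same result: 10^6 / 10^8 = 10^-2, and 9.6 / 3 = 3.2.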

In short, the answer is Option B.

Hope this helps!

