Answer:
Δt = 1.1 s
Explanation:
Given information:
H = 50.0 m (height of the fall)
g = 9.8 m/s²
v₀ = 13.0 m/s (initial downward speed of the second object; the first object is dropped from rest — this value is implied by vf and t₂ below)
For the first object, which is dropped from rest, the fall is described by:
[tex]H = \frac{1}{2}*g*t_{1}^{2}[/tex]
Solving for t₁, we get:
t₁ = 3.2 s
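Written out with the numbers above:
[tex]t_{1} = \sqrt{\frac{2*H}{g}} = \sqrt{\frac{2*50.0 m}{9.8 m/s^{2}}} = 3.2 s[/tex]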
For the second object, which is thrown downward with initial speed v₀, we can find the final velocity when it hits the ground using the following expression:
[tex]v_{f}^{2} - v_{o}^{2} = 2*g*H[/tex]
Solving for vf, we get:
vf = 33.9 m/s
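Written out, using the given v₀:
[tex]v_{f} = \sqrt{v_{o}^{2} + 2*g*H} = \sqrt{(13.0 m/s)^{2} + 2*9.8 m/s^{2}*50.0 m} = 33.9 m/s[/tex]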
Applying the definition of acceleration, with the acceleration being the one due to gravity (g), we can write the following equation:
[tex]v_{f} = v_{o} + g*t[/tex]
(Assuming the downward direction to be positive).
Solving for t₂, we get:
t₂ = 2.1 s
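Written out:
[tex]t_{2} = \frac{v_{f} - v_{o}}{g} = \frac{33.9 m/s - 13.0 m/s}{9.8 m/s^{2}} = 2.1 s[/tex]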
So the difference in time between the moments when the two objects hit the ground is simply:
Δt = t₁ - t₂ = 3.2 s - 2.1 s = 1.1 s
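As a quick numerical check, here is a minimal Python sketch of the same calculation (it carries over the assumption v₀ = 13.0 m/s for the second object, as stated in the given information):

import math

H = 50.0    # height of the fall, m
g = 9.8     # acceleration due to gravity, m/s^2
v0 = 13.0   # initial downward speed of the second object, m/s (assumed, see above)

t1 = math.sqrt(2 * H / g)          # first object, dropped from rest: H = (1/2) g t^2
vf = math.sqrt(v0**2 + 2 * g * H)  # second object's speed at the ground: vf^2 - v0^2 = 2 g H
t2 = (vf - v0) / g                 # second object's fall time: vf = v0 + g t

print(round(t1, 1), round(vf, 1), round(t2, 1), round(t1 - t2, 1))
# prints: 3.2 33.9 2.1 1.1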