Answer:
[tex]5\:\mathrm {s}[/tex]
Explanation:
We can use the kinematic equation [tex]\Delta y={v_i}t+\frac{1}{2}at^2[/tex] to solve this problem. Since the initial velocity [tex]v_i[/tex] in the vertical direction is [tex]0[/tex], we have:
[tex]\Delta y =\frac{1}{2}at^2[/tex] (freefall equation)
Plugging in values, we get:
[tex]100=\frac{1}{2}\cdot 9.81\cdot t^2,\\\\t^2=\frac{100}{\frac{1}{2}\cdot 9.81}=20.387,\\\\t=\sqrt{20.387}=4.515\approx \fbox{$5\:\mathrm{seconds}$}[/tex] (rounded to one significant figure).
*The horizontal velocity is irrelevant in this question. It affects only the horizontal displacement of the object (where the object lands), not how long the object takes to hit the ground.
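The calculation above can be sketched in a few lines of Python. The function name `fall_time` and the default [tex]g = 9.81\:\mathrm{m/s^2}[/tex] are assumptions for illustration, not from the original question:

```python
import math

def fall_time(height_m: float, g: float = 9.81) -> float:
    """Time for an object dropped from rest to fall height_m metres.

    From the freefall equation Δy = ½·g·t², solving for t gives
    t = √(2·Δy / g).
    """
    return math.sqrt(2 * height_m / g)

t = fall_time(100)
print(round(t, 3))  # 4.515 — about 5 s to one significant figure
```

Note that the horizontal velocity never enters the function: only the drop height and [tex]g[/tex] determine the fall time.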