An object of unknown mass is dropped from rest from a height of 100 m above ground level. How long does it take the object to reach the ground?

Given data
*The height is h = 100 m
*The acceleration due to gravity is g = 9.8 m/s^2
*The initial speed of the object is u = 0 m/s
The time taken by the object to reach the ground follows from the equation of motion
[tex]h=ut+\frac{1}{2}gt^2[/tex]
Substituting the known values into the above expression:
[tex]\begin{gathered} 100=(0)t+\frac{1}{2}(9.8)t^2 \\ 100=4.9t^2 \\ t^2=\frac{100}{4.9}=20.41 \\ t=\sqrt{20.41} \\ =4.52\text{ s} \end{gathered}[/tex]
Hence, the time taken by the object to reach the ground is t = 4.52 s.
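As a quick numerical check, the same result can be reproduced with a short script. This is only an illustrative sketch (the function name fall_time and the use of Python are assumptions, not part of the original solution); it solves h = ut + (1/2)gt^2 for the positive root of t.

import math

def fall_time(h, g=9.8, u=0.0):
    # Solve 0.5*g*t^2 + u*t - h = 0 for the positive root:
    # t = (-u + sqrt(u^2 + 2*g*h)) / g
    return (-u + math.sqrt(u**2 + 2 * g * h)) / g

print(round(fall_time(100), 2))  # prints 4.52

With h = 100 m, g = 9.8 m/s^2, and u = 0, this returns 4.52 s, matching the value found above.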