A rock is dropped from a height of 100 feet. Calculate the time between when the rock is dropped and when it lands. If we choose "down" as positive and ignore air resistance, the relevant equation of motion is given below.

Answer: 2.49 s

Step-by-step explanation:

Given

Rock is dropped from a height of [tex]h=100\ ft[/tex]

Using the equation of motion [tex]h=ut+\frac{1}{2}at^2[/tex]

here, [tex]u=0[/tex] (the rock is dropped from rest)

[tex]a=g=32.2\ ft/s^2[/tex] (acceleration due to gravity)

Substituting the values:

[tex]\Rightarrow 100=0+\dfrac{1}{2}gt^2\\\\\Rightarrow t^2=\dfrac{2h}{g}=\dfrac{2\times 100}{32.2}\\\\\Rightarrow t=\sqrt{6.211}=2.49\ s[/tex]
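As a quick sanity check, the same arithmetic can be reproduced with a short Python snippet (the function name and the default value g = 32.2 ft/s² are illustrative choices, not part of the original problem):

import math

def fall_time(height_ft: float, g: float = 32.2) -> float:
    # Object dropped from rest, no air resistance:
    # h = (1/2) * g * t^2  =>  t = sqrt(2h / g)
    return math.sqrt(2.0 * height_ft / g)

print(f"{fall_time(100):.2f} s")  # prints 2.49 s

Running this confirms the result of roughly 2.49 seconds for a 100 ft drop.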