A baseball is thrown from a height of 5 feet. The height, h, of the ball at time t seconds is modeled by the equation h(t) = -16t^2 + 100t + 5. How long will it take the ball to reach the ground?

 

 Apr 20, 2020
 #1

Well, we just set h(t) = 0, since h(t) represents the height and the ball reaches the ground when the height is 0.

 

-16t^2 + 100t + 5 = 0

 

Using the Quadratic Formula, we get t = 25/8 +/- sqrt(645)/8.
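For completeness, here is the substitution written out, using a = -16, b = 100, c = 5 from the equation above and noting that sqrt(10320) = 4*sqrt(645):

t = \frac{-100 \pm \sqrt{100^2 - 4(-16)(5)}}{2(-16)} = \frac{-100 \pm 4\sqrt{645}}{-32} = \frac{25}{8} \mp \frac{\sqrt{645}}{8}

The \mp only swaps which root goes with which sign, so this is the same pair of values as 25/8 +/- sqrt(645)/8.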

 

The two solutions are approximately -0.05 seconds and 6.30 seconds. Since time can't be negative, we take the positive solution: it takes about 6.30 seconds for the ball to reach the ground.
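As a quick numerical check (a minimal Python sketch, not part of the original solution), solving -16t^2 + 100t + 5 = 0 directly with the quadratic formula:

import math

# Coefficients of h(t) = -16t^2 + 100t + 5
a, b, c = -16, 100, 5

# Quadratic formula: t = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c                      # 10000 + 320 = 10320
t1 = (-b + math.sqrt(disc)) / (2*a)      # about -0.05 s (rejected: negative time)
t2 = (-b - math.sqrt(disc)) / (2*a)      # about 6.30 s (when the ball hits the ground)

print(t1, t2)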

 Apr 20, 2020
