Billy shoots an arrow from 10 feet above the ground. The height of this arrow can be expressed by the equation $h=10-23t-10t^2$, where $t$ is time in seconds. If the center of a target is raised 5 feet off the ground, in how many seconds must the arrow reach the target in order for Billy to hit the bullseye?
Setting $h = 5$ gives $5 = 10 - 23t - 10t^2$, so $0 = 5 - 23t - 10t^2$, which rearranges to
$10t^2 + 23t - 5 = 0$
$(5t - 1)(2t + 5) = 0$
Setting each factor to 0 and solving for $t$, we get
$t = \frac{1}{5} = 0.2$ sec or $t = -\frac{5}{2} = -2.5$ sec
Since time cannot be negative, only the first answer makes physical sense: the arrow must reach the target in $0.2$ seconds.
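As a quick check, the roots can be computed with the quadratic formula; a minimal Python sketch, using the coefficients from the rearranged equation $10t^2 + 23t - 5 = 0$:

```python
import math

# Coefficients of 10t^2 + 23t - 5 = 0
a, b, c = 10, 23, -5

disc = b * b - 4 * a * c      # discriminant: 529 + 200 = 729
root = math.sqrt(disc)        # sqrt(729) = 27

t1 = (-b + root) / (2 * a)    # (-23 + 27) / 20 = 0.2
t2 = (-b - root) / (2 * a)    # (-23 - 27) / 20 = -2.5
print(t1, t2)                 # 0.2 -2.5
```

The positive discriminant (a perfect square, 729) confirms the quadratic factors nicely over the rationals, matching the factoring above.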