I drove to the beach at a rate of 40 miles per hour. If I had driven at a rate of 30 miles per hour instead, then I would have arrived 20 minutes later. How many miles did I drive?
To solve the problem, we can use the formula distance = rate x time. Let's call the distance of the trip d. The time it took to drive at 40 mph can be represented as t (in hours). The time it would have taken to drive at 30 mph can be represented as t + 20 minutes, or t + (20/60) hours.
From the first equation, we have: d = 40t
From the second equation, we have: d = 30(t + (20/60))
We can now set the two equations equal to each other:
40t = 30(t + (20/60))
Expanding the right-hand side: 40t = 30t + (30 * 20/60)
Simplifying: 40t = 30t + 30
Solving for t: 10t = 30, so t = 3.
So the trip took 3 hours. Now we can use the first equation to find the distance:
d = 40t = 40 * 3 = 120 miles
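A quick way to sanity-check an answer like this is to plug it back into the original conditions. Here is a minimal Python sketch (the variable names are my own) that tests whether 120 miles produces the required 20-minute difference:

```python
from fractions import Fraction

d = 120  # miles, the answer claimed above
t_at_40 = Fraction(d, 40)                 # time at 40 mph: 3 hours
t_at_30 = Fraction(d, 30)                 # time at 30 mph: 4 hours
extra_minutes = (t_at_30 - t_at_40) * 60
print(extra_minutes)                      # 60 minutes, not the 20 the problem requires
```

The difference comes out to 60 minutes rather than 20, which suggests something slipped in the simplification.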
You got your arithmetic mixed up: 30 * (20/60) = 10, not 30. Keeping the same equation,
40T = 30(T + 20/60), solve for T:
10T = 10, so T = 1 hour.
From the left-hand side of the equation, you then have
1 hour x 40 miles per hour = 40 miles, the distance to the beach.
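If you want to double-check that algebra by machine, here is a small sympy sketch (assuming sympy is available; the symbol name T is just a label):

```python
from sympy import symbols, Eq, Rational, solve

T = symbols('T', positive=True)
# Same distance at both speeds; the 30 mph trip takes 20 minutes (1/3 hour) longer.
eq = Eq(40*T, 30*(T + Rational(20, 60)))
t_hours = solve(eq, T)[0]    # 1 hour
print(t_hours, 40*t_hours)   # 1 40  -> the trip is 40 miles
```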
Let d be the distance to the beach, in miles. Then the time it took to drive to the beach, at 40 miles per hour, is d/40 (in hours).
If I had driven at 30 miles per hour instead, then it would take me d/30 hours. Note that 1 hour is equivalent to 60 minutes, so 20 minutes is equivalent to 1/3 of an hour. Therefore,
d/40 = d/30 - 1/3
Multiplying both sides by 120 to get rid of the fractions, we get
3d = 4d - 40
so d = 40. As a check, we see that at 40 mph it took me 1 hour to drive 40 miles, and at 30 mph it would have taken me 1 hour and 20 minutes, as required.
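If you'd like to see that same equation handled by machine, here is a short sympy sketch (just a check, under the assumption that sympy is installed):

```python
from sympy import symbols, Eq, Rational, solve

d = symbols('d', positive=True)
# The 40 mph trip takes 1/3 hour less than the 30 mph trip.
eq = Eq(d/40, d/30 - Rational(1, 3))
print(solve(eq, d))   # [40]  -> 40 miles
```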
Hope this helps.