I need help finding the constant for this problem.

Dasan’s mom takes him to the video arcade for his birthday. In the first 10 minutes, he spends $3.50 playing games. If his allowance for the day is $20.00, how long can he keep playing games before his money is gone?

 Oct 19, 2015
 #1

About 5 minutes

 Oct 19, 2015
 #2

 

Hi,

 

This is a bit tricky, isn't it?

 

Cost per minute is $3.50 / 10 minutes = $0.35 per minute     (35 cents)

 

How many times does 35 cents go into $20?

 

20 divided by 0.35 ≈ 57.14

So he can play for a total of 57 and a bit minutes.
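The two steps above (find the per-minute rate, then see how far the allowance stretches) can be sketched in Python; the variable names here are just illustrative:

```python
# Step 1: cost per minute from the first 10 minutes of play
cost_first_10_min = 3.50          # dollars spent in the first 10 minutes
rate = cost_first_10_min / 10     # dollars per minute -> 0.35

# Step 2: total playing time the day's allowance buys
allowance = 20.00                 # dollars available for the day
total_minutes = allowance / rate  # -> about 57.14 minutes

print(rate)                        # 0.35
print(round(total_minutes, 2))     # 57.14
```

Note this gives the *total* playing time, which is where Melody's hint below picks up.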

 

This is not the final answer.

What else do you have to do?

 Oct 19, 2015
edited by Melody  Oct 19, 2015
