A person invests 2000 dollars in a bank. The bank pays 4.75% interest compounded quarterly. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 3500 dollars?
Since the balance grows as FV = PV*(1 + R)^n, where R is the quarterly rate and n the number of quarters, solving for n gives n = log(FV/PV)/log(1 + R). In Python:

import math
PV = 2000; FV = 3500; R = 0.0475 / 4        # quarterly interest rate
n = math.log(FV / PV) / math.log(1 + R)     # number of quarters
print("N =", n / 4)                         # convert quarters to years
N ≈ 11.85, which rounds to 11.9 years.
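As a quick sanity check (a sketch added here, not part of the original solution), compounding $2000 at the quarterly rate for 11.85 years should land very close to $3500:

```python
# Sanity check: grow $2000 at 4.75%/4 per quarter for 11.85 years.
r = 0.0475 / 4                       # quarterly interest rate
balance = 2000 * (1 + r) ** (4 * 11.85)
print(round(balance, 2))             # should be close to 3500
```

The small remaining gap comes from rounding n to two decimal places.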