A person invests 6500 dollars in a bank. The bank pays 7% interest compounded quarterly. To the nearest tenth of a year, how long must the person leave the money in the bank until it reaches 20100 dollars?

A = P(1 + r/n)^(nt)


Answer:

t ≈ 16.3 years

Step-by-step explanation:

A = P(1 + r/n)^(nt)

A=$20100

P=$6500

r=7%=0.07

n=4

t=?

Dividing both sides by P and taking the natural log of both sides gives:

t = ln(A/P) / [n·ln(1 + r/n)]

= ln(20100/6500) / [4·ln(1 + 0.07/4)]

= ln(3.0923) / [4·ln(1.0175)]

= 1.1289 / (4 × 0.017349)

= 1.1289 / 0.069395

≈ 16.27

To the nearest tenth of a year,

t ≈ 16.3 years
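
For anyone who wants to verify the result, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not part of the original solution):

import math

P = 6500.0   # principal in dollars
A = 20100.0  # target amount in dollars
r = 0.07     # annual interest rate (7%)
n = 4        # compounding periods per year (quarterly)

# Solve A = P*(1 + r/n)**(n*t) for t by taking natural logarithms
t = math.log(A / P) / (n * math.log(1 + r / n))

print(round(t, 2))  # about 16.27
print(round(t, 1))  # 16.3, to the nearest tenth of a year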