\(\sum_{k=1}^{\infty}\frac{1}{3^k+\ln^2(10k)}\)

I need help determining whether the series converges or diverges. A step-by-step explanation, including which method to use, would be very helpful. Thank you in advance.

Guest May 27, 2021

#1 +1

I believe if you look at the denominator: \(3^k+\ln^2(10k)>3^k\) for every k, so each term is smaller than the matching term of the geometric series \(\sum_{k=1}^{\infty}\left(\frac{1}{3}\right)^k=\frac{1}{2}\). By the direct comparison test, the series converges.

pick a number k = 3: denominator \(=27+\ln^2(30)\approx 38.57\)

next number in the sequence, k = 4: denominator \(=81+\ln^2(40)\approx 94.61\). SO I think it converges..... to some value below 0.5.
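A quick numerical check of the partial sums backs this up (just a sketch; the cutoffs 25 and 50 are arbitrary, chosen because \(3^k\) makes the tail shrink very fast):

```python
import math

def partial_sum(n):
    """Partial sum of 1/(3^k + ln^2(10k)) for k = 1..n."""
    return sum(1 / (3**k + math.log(10 * k) ** 2) for k in range(1, n + 1))

s25 = partial_sum(25)
s50 = partial_sum(50)
print(s50)         # stabilizes well below the geometric bound of 0.5
print(s50 - s25)   # tail beyond k = 25 is negligible
```

The partial sums settle around 0.22, comfortably under the 0.5 bound from the geometric comparison.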

ElectricPavlov May 27, 2021

#2 0

Sorry if I wasn't thorough enough with my question. It needs a proof of convergence or divergence based on the n-th term, using one of the convergence tests (comparison test, limit comparison test, ratio test / d'Alembert's criterion, root test / Cauchy's criterion). I'm just not sure where to start.

Hope you see this. I'm new to the site and not sure if I'm responding correctly.

Guest May 27, 2021