
Convergence of $a_n=(7/10+7/10i)^n$ I am new to real analysis and want to prove that $$a_n=\left(\frac{7}{10}(1+i)\right)^n$$ converges as $n\rightarrow \infty$. This is what I have done: $$\begin{aligned} a_n &= (7/10+7/10i)^n \\ &= (49/100+2\cdot 49/100\,i-49/100)^n \\ &= (98/100\,i)^n \\ &= \left(\frac{-9604}{10000}\right)^n. \end{aligned}$$ By multiplying out some of the terms, it seems that the sequence converges to $0$, because $\frac{-9604}{10000}$ is less than $1$. Is this a correct approach? I'm not sure whether I am allowed to treat sequences this way; my book only covers sequences very briefly. Thank you.

You have the right idea, but the computation is not correct (you squared the base while keeping the exponent $n$, which changes the sequence). Instead, verify that $$c:=\left|\frac7{10}(1+i)\right|=\frac7{10}\sqrt{1^2+1^2}=\frac{7\sqrt 2}{10}<1,$$ because $\left(\frac{7\sqrt 2}{10}\right)^2=\frac{98}{100}<1$. Then $|a_n|=|a_1^n|=|a_1|^n=c^n\to 0$, and since $|a_n|\to 0$ implies $a_n\to 0$, the sequence converges to $0$.
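If it helps to see the decay concretely, here is a minimal numerical sanity check (not part of the proof; it just uses Python's built-in complex type) showing that $|a_n|=c^n$ with $c=\frac{7\sqrt 2}{10}\approx 0.99$, which shrinks to $0$:

```python
import math

# Base of the sequence: 7/10 + (7/10)i
base = complex(0.7, 0.7)

# Its modulus: c = 7*sqrt(2)/10, which is slightly less than 1
c = 0.7 * math.sqrt(2)

# |a_n| and c**n agree up to floating-point rounding, and both tend to 0
for n in (1, 10, 100, 500, 1000):
    a_n = base ** n
    print(n, abs(a_n), c ** n)
```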
