
Proof that the total length of a finite cover of a real interval $[a,b]$ is greater than $b-a$ whenever $b>a$ ![enter image description here]() In the above proof, I could not understand the inequality in the bottom part. Could anyone show me explicitly why it is true? (I understand that the sum equals $b-a$.) Also, does anyone agree that there is a mistake in the third line of the proof? Each $v(U_i)$ is the sum of certain differences $t_j-t_{j-k}$ where $k\geq 1$.

The intuition of this proof is very simple, and you should be able to understand it from the given text. For a more "formal" approach set $$\epsilon_{ij}:=\begin{cases}1&\bigl(\,[t_{j-1},t_j]\subset U_i\bigr)\,,\\ 0&(\text{otherwise})\,.\end{cases}$$ As the $U_i$ form a cover of $[a,b]$ one has $\sum_i\epsilon_{ij}\geq1$ for all $j$. On the other hand $$\nu(U_i)=\sum_j\epsilon_{ij}(t_j-t_{j-1})\ .$$ It follows that $$\sum_i \nu(U_i)=\sum_{i,j}\epsilon_{ij}(t_j-t_{j-1})\geq \sum_j(t_j-t_{j-1})=b-a\ .$$
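If it helps, the chain of inequalities can be checked numerically on a concrete example. The cover and partition below are hypothetical choices (not from the question): three intervals covering $[0,1]$, with the partition points $t_j$ taken from the interval endpoints so that every $[t_{j-1},t_j]$ lies inside at least one $U_i$.

```python
a, b = 0.0, 1.0
# Hypothetical cover of [0, 1] by three (closed) intervals (left, right).
cover = [(-0.1, 0.4), (0.3, 0.7), (0.6, 1.1)]
# Partition a = t_0 < ... < t_n = b induced by the cover's endpoints in [a, b].
ts = [0.0, 0.3, 0.4, 0.6, 0.7, 1.0]

# epsilon_ij = 1 when [t_{j-1}, t_j] is contained in U_i, else 0.
def eps(i, j):
    left, right = cover[i]
    return 1 if left <= ts[j - 1] and ts[j] <= right else 0

# Every subinterval is covered at least once: sum_i epsilon_ij >= 1.
assert all(sum(eps(i, j) for i in range(len(cover))) >= 1
           for j in range(1, len(ts)))

# sum_j epsilon_ij * (t_j - t_{j-1}) never exceeds nu(U_i),
# since the selected subintervals are disjoint pieces of U_i.
covered = [sum(eps(i, j) * (ts[j] - ts[j - 1]) for j in range(1, len(ts)))
           for i in range(len(cover))]
assert all(c <= r - l for c, (l, r) in zip(covered, cover))

# Hence the total length of the cover dominates b - a.
total = sum(r - l for l, r in cover)
assert sum(covered) >= b - a
assert total >= b - a
print(total, b - a)
```

Refining the partition by the cover's endpoints is what makes each $\epsilon_{ij}$ well defined: a subinterval is then either entirely inside $U_i$ or entirely outside it.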
