
Show that $v(E) = a_1a_2\cdots a_n\,v(B^n)$

I'm generally pretty good at change-of-variable problems, but this one has me stumped. It's on page 264 of *Advanced Calculus of Several Variables* by Edwards.

Theorem 5.1: If $\lambda:\mathbb{R}^n \rightarrow \mathbb{R}^n$ is a linear mapping and $B \subset \mathbb{R}^n$ is contented, then $\lambda(B)$ is also contented, and $v(\lambda(B)) = |\det\lambda|\,v(B)$.

Here is the question: Consider the $n$-dimensional solid ellipsoid $$E = \left\\{x \in \mathbb{R}^n: \sum_{i=1}^n \frac{x_i^2}{a_i^2} \le 1\right\\}.$$ Note that $E$ is the image of the unit ball $B^n$ under the mapping $T: \mathbb{R}^n \rightarrow \mathbb{R}^n$ defined by $$T(x_1,\ldots,x_n) = (a_1x_1,\ldots,a_nx_n).$$ Apply Theorem 5.1 to show that $v(E) = a_1a_2\cdots a_n\,v(B^n)$. Thanks, guys.

You need to show that $|\det T| = a_1a_2\cdots a_n$. You can do this by fixing an orthonormal basis on $\mathbb{R}^n$ and taking the determinant of the matrix of $T$ with respect to that basis. The standard basis will do: the matrix of $T$ is the diagonal matrix with diagonal entries $a_1, a_2, \ldots, a_n$, which has determinant $a_1a_2\cdots a_n$. Theorem 5.1 with $\lambda = T$ and $B = B^n$ then gives $$v(E) = v(T(B^n)) = |\det T|\,v(B^n) = a_1a_2\cdots a_n\,v(B^n),$$ as desired.
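As a numerical sanity check (not part of the proof), here is a small sketch in NumPy for $n = 3$ with arbitrarily chosen semi-axes $a = (2, 3, 5)$: it verifies that $\det T = a_1a_2a_3$ and Monte Carlo-estimates $v(E)$ to compare against $a_1a_2a_3\,v(B^3)$, where $v(B^3) = \frac{4}{3}\pi$.

```python
import numpy as np

# T scales coordinate i by a_i, so its matrix in the standard basis
# is diagonal with entries a_1, ..., a_n.
a = np.array([2.0, 3.0, 5.0])                 # example semi-axes (arbitrary choice)
T = np.diag(a)
assert np.isclose(np.linalg.det(T), a.prod())  # |det T| = a_1 a_2 a_3 = 30

# Monte Carlo sanity check of v(E) = a_1 a_2 a_3 * v(B^3):
rng = np.random.default_rng(0)
n_samples = 200_000
box = rng.uniform(-a, a, size=(n_samples, 3))  # sample the bounding box of E
inside = ((box / a) ** 2).sum(axis=1) <= 1.0   # ellipsoid membership test
v_E_estimate = inside.mean() * np.prod(2 * a)  # fraction inside * box volume
v_B3 = 4.0 / 3.0 * np.pi                       # volume of the unit 3-ball

print(v_E_estimate, a.prod() * v_B3)           # the two should roughly agree
```

The estimate should land within a percent or so of $30 \cdot \frac{4}{3}\pi \approx 125.66$ for this sample size.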
