This is a set of problems you can use to practice linear algebra.
Day 1.
The problems for the first day are related to: Representation of Linear Transformations.
- Explain why every linear transformation between finite dimensional vector spaces can be represented by a matrix. (Hint: Consider bases for the domain and codomain of the transformation.)
- Use matrix multiplication to obtain the formula for \(\sin(\alpha + \beta)\) and \(\cos(\alpha + \beta ).\) (Hint: Consider the transformation of rotation.)
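As a numerical sanity check for the hint, multiplying two rotation matrices and comparing entries recovers both addition formulas (a sketch in numpy, not part of the original exercise):

```python
import numpy as np

def rotation(theta):
    """Matrix of the rotation of the plane by angle theta."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

a, b = 0.7, 1.1  # arbitrary angles
# Composing two rotations equals rotating by the sum of the angles,
# so the entries of rotation(a) @ rotation(b) encode the addition formulas.
product = rotation(a) @ rotation(b)
assert np.allclose(product, rotation(a + b))
# Entry (1, 0): sin(a + b) = sin a cos b + cos a sin b
assert np.isclose(product[1, 0], np.sin(a)*np.cos(b) + np.cos(a)*np.sin(b))
# Entry (0, 0): cos(a + b) = cos a cos b - sin a sin b
assert np.isclose(product[0, 0], np.cos(a)*np.cos(b) - np.sin(a)*np.sin(b))
```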
- Let \(F : \mathbb{R}^2 \rightarrow \mathbb{R}^2\) be a linear transformation defined by \(F(x,\,y) = (2x+3y ,\, 4x-5y).\) Find the matrix representation of \(F\) relative to the basis \(S\) consisting of \(u_1 = ( 1,\,2)\) and \(u_2 = (2,\,5).\)
(See: Schaum's outlines p.196 Example 6.1.)
- Let \(F : \mathbb{R}^2 \rightarrow \mathbb{R}^2\) be a linear transformation defined by \(F(x,\,y) = (2x+3y ,\, 4x-5y).\) Find the matrix representation of \(F\) relative to the basis \(S\) consisting of \(u_1 = ( 1,\,-2)\) and \(u_2 = (2,\,-5).\)
(See: Schaum's outlines p.197 Example 6.3.)
- Consider the following two bases of \(\mathbb{R}^2 :\)
\[S = \left\{ u_1 = (1,\,2) ,\,\, u_2 = (3,\,5)\right\} ,\,\, S ' = \left\{ v_1 = (1,\,-1) ,\,\, v_2 = (1,\,-2) \right\}.\]
Find the transition matrix \(P\) from \(S\) to \(S ' .\)
(See: Schaum's outlines p.200 Example 6.5.)
- Consider the following two bases of \(\mathbb{R}^3 :\) \[\begin{align} E &= \left\{ e_1 = (1,\,0,\,0),\,\, e_2 = (0,\,1,\,0) ,\,\, e_3 = (0,\,0,\,1) \right\} , \\[5pt] S &= \left\{ u_1 = (1,\,0,\,1),\,\, u_2 = (2,\,1,\,2) ,\,\, u_3 = (1,\,2,\,2) \right\}. \end{align}\] Find the transition matrix \(P\) from \(E\) to \(S,\) and find the transition matrix \(Q\) from \(S\) to \(E.\) (See: Schaum's outlines p.200 Example 6.6.)
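Under one common convention, the transition matrix from \(E\) to \(S\) has the coordinate vectors of \(u_1,\) \(u_2,\) \(u_3\) relative to \(E\) as its columns; some texts use the inverse, so check your book's definition. A numpy sketch under that convention:

```python
import numpy as np

# Columns of P are the coordinate vectors of u1, u2, u3 relative to E
# (one common convention for "transition matrix from E to S").
P = np.column_stack([(1, 0, 1), (2, 1, 2), (1, 2, 2)]).astype(float)
Q = np.linalg.inv(P)   # the transition matrix in the other direction

assert np.allclose(P @ Q, np.eye(3))
assert round(np.linalg.det(P)) == 1   # det P = 1, so Q has integer entries
```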
- Read the ‘Remark 1’ and ‘Remark 2’ on p.201 of Schaum's outlines, and explain why you are not so surprised. (Hint: Keep calm and call ‘them’.)
- Let \(V\) be a vector space and \(V^{**}\) be the double dual space of \(V.\) Explain why they say \(V\) and \(V^{**}\) are canonically isomorphic. (Hint: Forget basis.)
- Let \(V\) and \(W\) be finite dimensional vector spaces, \(T : V \rightarrow W\) be a linear transformation, and \(T^*\) be the transpose map of \(T.\) Prove:
(a) If \(T\) is one to one, then \(T^*\) is onto.
(b) If \(T\) is onto, then \(T^*\) is one to one.
(Hint: Keep calm and consider the definitions of ‘transpose’, ‘one to one’ and ‘onto’.)
- Let \(A,\) \(B\) be \(n\times n\) matrices with \(A \ne B.\) Show that \(A\) is similar to \(B\) if and only if there exist a vector space \(V\) and an endomorphism \(T\) of \(V\) such that both \(A\) and \(B\) represent \(T\) with respect to different bases. (Hint: Consider the definition of ‘similar matrices’.)
Day 2.
The problems for the second day are related to Inner Product Spaces.
- Find \(\langle f ,\, g \rangle,\) \(\langle f,\,h\rangle\) and \(\lVert f \rVert\) for \[f(t) = t+2 ,\,\, g(t) =3t-2 ,\,\, h(t) = t^2 -2t-3\] where \[\langle f,\,g\rangle = \int_0^1 f(t) g(t) dt.\] (See: Schaum's outlines p.244 Problem 7.5.)
- Apply the Gram-Schmidt orthogonalization process to find an orthonormal basis for the subspace \(V\) of \(\mathbb{R}^4\) spanned by \[v_1 = (1,\,1,\,1,\,1),\,\, v_2 = (1,\,2,\,4,\,5) ,\,\, v_3 = (1,\,-3,\,-4,\,-2).\] (See: Schaum's outlines p.236 Example 7.10.)
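The process itself is short to implement; the following numpy sketch orthonormalizes the given vectors and checks the result:

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors, one at a time."""
    basis = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in basis:
            w -= np.dot(w, u) * u      # remove the component along u
        basis.append(w / np.linalg.norm(w))
    return basis

us = gram_schmidt([(1, 1, 1, 1), (1, 2, 4, 5), (1, -3, -4, -2)])
# The Gram matrix of an orthonormal set is the identity.
G = np.array([[np.dot(a, b) for b in us] for a in us])
assert np.allclose(G, np.eye(3))
```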
- Let \(V\) be an inner product space and \(u_1,\) \(u_2,\) \(\cdots,\) \(u_n\) be an orthonormal basis. Show that for \(v\in V,\) \[v = \sum_{j=1}^n \langle v\,\vert\, u_j \rangle u_j .\] In this case, the \(\langle v \,\vert\, u_j \rangle\) are called Fourier coefficients. An important step in developing Fourier theory is to show that this formula remains valid for Schauder bases.
- Let \(V\) be a finite dimensional inner product space, and \(W\) be a subspace of \(V.\) Show that \(V = W \oplus W^\bot.\)
- Show that the parallelogram law holds in inner product spaces:
\[\lVert u+v \rVert^2 + \lVert u-v \rVert^2 = 2 \lVert u\rVert^2 + 2\lVert v\rVert^2.\]
(See: Schaum's outlines p.245 Problem 7.7.)
In general, not every normed linear space is an inner product space. But if a normed linear space satisfies the parallelogram law, then the norm arises in the usual way from some inner product, given (in the real case) by \(\langle x \,\vert\, y \rangle = \frac{1}{4} ( \lVert x+y \rVert^2 - \lVert x-y \rVert^2 ).\)
- Suppose that \(S,\) \(S_1\) and \(S_2\) are subsets of \(V.\) Prove that:
(a) \(S\subseteq S^{\bot \bot}.\)
(b) If \(S_1 \subseteq S_2 ,\) then \(S_2 ^{\bot} \subseteq S_1 ^{\bot}.\)
(c) \(S^\bot = (\operatorname{span} (S))^\bot.\)
(See: Schaum's outlines p.247 Problem 7.14.)
- Show that
\[A = \left[\begin{array}{cc} a & b \\ b & d \end{array}\right]\]
is positive definite if and only if
\[a > 0, \,\, d > 0 \,\, \text{and} \,\, ad-b^2 > 0.\]
(See: Schaum's outlines p.255 Problem 7.43.)
This means that a \(2\times 2\) symmetric matrix \(A\) is positive definite if and only if all eigenvalues of \(A\) are positive. This eigenvalue criterion holds for symmetric matrices of every size, not just \(2\times 2.\)
- Verify that the following defines an inner product in \(\mathbb{R}^2.\)
\[\langle x ,\, y \rangle = x_1 y_1 - x_1 y_2 - x_2 y_1 + 3x_2 y_2\]
where \(x = (x_1 ,\, x_2 )\) and \(y = (y_1 ,\, y_2 ).\)
(See: Schaum's outlines p.244 Problem 7.3.)
- Prove: Suppose \(w_1,\) \(w_2,\) \(\cdots,\) \(w_r\) form an orthogonal set of nonzero vectors in \(V.\) Let \(v\) be any vector in \(V\) and let \(c_i\) be the component of \(v\) along \(w_i,\) that is, \(c_i w_i\) is the vector projection of \(v\) onto \(w_i.\)
Then, for any scalars \(a_1,\) \(a_2,\) \(\cdots,\) \(a_r,\) we have
\[\left\lVert v- \sum_{k=1}^r c_k w_k \right\rVert \le \left\lVert v-\sum_{k=1}^r a_k w_k \right\rVert .\]
That is, \(\sum c_i w_i\) is the best approximation to \(v\) as a linear combination of \(w_1,\) \(w_2 ,\) \(\cdots ,\) \(w_r .\)
(See: Schaum's outlines p.251 Problem 7.30. Or see: Inner Product Spaces (내적공간), Corollary 11.)
- Let \(V\) be a vector space with a complex inner product \(\langle \,\vert\, \rangle.\) Explain why the conjugation in the formula \[\langle v\,\vert\,w \rangle = \overline{\langle w\,\vert\,v \rangle}\] is necessary, instead of \[\langle v\,\vert\,w \rangle = \langle w\,\vert\,v \rangle.\] (Hint: Consider the simplest case \(\lVert v \rVert\) for \(v\in V = \mathbb{C}.\))
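For the hint, a two-line check in Python's complex arithmetic shows what goes wrong without the conjugate:

```python
# With the standard inner product on C, <v|w> = v * conj(w), so
# <v|v> = |v|^2 is a nonnegative real number and gives a norm.
# Without the conjugate, "<v|v>" = v * v need not even be real or positive.
v = 1j
assert (v * v.conjugate()).real == 1.0 and (v * v.conjugate()).imag == 0.0
assert v * v == -1   # v^2 is negative: unusable as a squared length
```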
Day 3.
The problems for the third day are related to Determinants.
- Evaluate the determinant of each of the following matrices. \[ A = \left[\begin{array}{ccc} 2&3&4 \\ 5&4&3 \\ 1&2&1 \end{array}\right], \,\,\, B = \left[\begin{array}{crr} 1&3&-5 \\ 3&-1&2 \\ 1&-2&1 \end{array}\right] . \] (See: Schaum's outlines p.276 Problem 8.1.)
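Answers can be checked against numpy; the exact values asserted below were obtained by cofactor expansion, so treat them as a worked check rather than the book's printed answer:

```python
import numpy as np

A = np.array([[2, 3, 4], [5, 4, 3], [1, 2, 1]])
B = np.array([[1, 3, -5], [3, -1, 2], [1, -2, 1]])
# np.linalg.det works in floating point; round to compare with exact values.
assert round(np.linalg.det(A)) == 14
assert round(np.linalg.det(B)) == 25
```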
- Evaluate the determinant of each of the following matrices. \[ A = \left[\begin{array}{rrrr} 2 & 5 & -3 & -2 \\ -2 & -3 & 2 & -5 \\ 1 & 3 & -2 & 2 \\ -1 & -6 & 4 & 3 \end{array}\right], \,\,\, B = \left[\begin{array}{rrrrr} 6 & 2 & 1 & 0 & 5 \\ 2 & 1 & 1 & -2 & 1 \\ 1 & 1 & 2 & -2 & 3 \\ 3 & 0 & 2 & 3 & -1 \\ -1 & -1 & -3 & 4 & 2 \end{array}\right] . \] (See: Schaum's outlines p.277 Problem 8.4.)
- Use the adjoint matrix to find the inverse matrix of \[ B = \left[\begin{array}{ccc} 1 & 1 & 1 \\ 2 & 3 & 4 \\ 5 & 8 & 9 \end{array}\right] . \] (See: Schaum's outlines p.278 Problem 8.6.)
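numpy has no built-in adjugate routine, so a small cofactor-based sketch makes the relationship \(B^{-1} = \operatorname{adj}(B)/\det(B)\) concrete:

```python
import numpy as np

def adjugate(M):
    """Classical adjoint: transpose of the cofactor matrix."""
    n = M.shape[0]
    C = np.zeros_like(M, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(M, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

B = np.array([[1, 1, 1], [2, 3, 4], [5, 8, 9]], dtype=float)
inv = adjugate(B) / np.linalg.det(B)   # B^{-1} = adj(B) / det(B)
assert np.allclose(B @ inv, np.eye(3))
```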
- Consider the system \[\begin{cases} kx + y + z = 1 \\[5pt] x + ky + z = 1 \\[5pt] x + y + kz = 1. \end{cases}\] Use determinants to find those values of \(k\) for which the system has (i) a unique solution; (ii) more than one solution; (iii) no solutions. (See: Schaum's outlines p.280 Problem 8.10.)
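The coefficient determinant factors as \((k-1)^2(k+2),\) which settles when the system has a unique solution; the factorization can be verified numerically:

```python
import numpy as np

def coeff_det(k):
    """Determinant of the coefficient matrix of the system for a given k."""
    return np.linalg.det(np.array([[k, 1, 1], [1, k, 1], [1, 1, k]], float))

# det = (k - 1)^2 (k + 2): unique solution iff k != 1 and k != -2.
for k in (-3.0, -2.0, 0.0, 1.0, 2.0):
    assert np.isclose(coeff_det(k), (k - 1) ** 2 * (k + 2))
assert np.isclose(coeff_det(1.0), 0) and np.isclose(coeff_det(-2.0), 0)
```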
- Find the determinant of the matrix \[M=\left[\begin{array}{ccccc} 3 & 4 & 0 & 0 & 0 \\ 2 & 5 & 0 & 0 & 0 \\ 0 & 9 & 2 & 0 & 0 \\ 0 & 5 & 0 & 6 & 7 \\ 0 & 0 & 4 & 3 & 4 \end{array}\right].\] (See: Schaum's outlines p.280 Problem 8.12.)
- Find the determinant of \(F : \mathbb{R}^3 \rightarrow \mathbb{R}^3\) defined by \[F(x,\,y,\,z) = (x+3y-4z ,\, 2y+7z ,\, x+5y-3z),\] and explain why \(F\) is an isomorphism. (See: Schaum's outlines p.280 Problem 8.13.)
- Solve the system by using Cramer's rule. \[\begin{cases} x+y+z = 5 \\[5pt] x-2y-3z = -1 \\[5pt] 2x+y-z = 3 \end{cases}\] (See: Schaum's outlines p.273 Example 8.12.)
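Cramer's rule is mechanical enough to script; this numpy sketch replaces each column of the coefficient matrix by the constant vector in turn:

```python
import numpy as np

A = np.array([[1, 1, 1], [1, -2, -3], [2, 1, -1]], dtype=float)
b = np.array([5, -1, 3], dtype=float)

d = np.linalg.det(A)
x = np.empty(3)
for i in range(3):
    Ai = A.copy()
    Ai[:, i] = b                   # replace the i-th column by the constants
    x[i] = np.linalg.det(Ai) / d   # Cramer's rule: x_i = det(A_i) / det(A)

assert np.allclose(A @ x, b)                 # it really solves the system
assert np.allclose(x, np.linalg.solve(A, b))
```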
- Let \(A\) be a square matrix. Show that the collection of column vectors of \(A\) is linearly independent if and only if \(\det (A) \ne 0.\)
- A matrix \(A\) is called an integer matrix if all of its entries are integers. Let \(A\) be a square integer matrix. Show that \(A^{-1}\) exists and is an integer matrix if and only if \(\lvert \det(A) \rvert = 1.\) (Hint: Use Cramer's rule.)
- Let \(A\) be a square matrix of the form (an upper triangular block matrix) \[A= \left( \begin{array}{ccccc} P_1 & Q_{12} & Q_{13} & \cdots & Q_{1n} \\ O & P_2 & Q_{23} & \cdots & Q_{2n} \\ O & O & P_3 & \cdots & Q_{3n} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ O & O & O & \cdots & P_n \end{array} \right)\] where all \(P_j\) are square matrices, \(O\) are matrices with zero entries and \(Q_{ij}\) are any matrices of suitable sizes. Show that \[\det (A) = \det (P_1) \, \det(P_2) \, \det(P_3) \cdots \det(P_n) .\]
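The identity is easy to test numerically on a random instance with two diagonal blocks before proving it in general:

```python
import numpy as np

rng = np.random.default_rng(0)
P1 = rng.standard_normal((2, 2))
P2 = rng.standard_normal((3, 3))
Q = rng.standard_normal((2, 3))

# Assemble the upper triangular block matrix [[P1, Q], [O, P2]].
A = np.block([[P1, Q], [np.zeros((3, 2)), P2]])
assert np.isclose(np.linalg.det(A),
                  np.linalg.det(P1) * np.linalg.det(P2))
```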
Day 4.
The problems for the fourth day are related to Eigenvalues and Eigenvectors and Cayley-Hamilton Theorem.
- Let \(f(t) = 2t^2 -t \) and \(g(t) = -t^2 +t -1.\) Find \(f(A)\) and \(g(A)\) for \[A = \left[ \begin{array}{cc} 1 & 2 \\ 3 & 0\end{array} \right].\]
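Evaluating a polynomial at a matrix just means replacing powers of \(t\) by matrix powers and the constant term \(c\) by \(cI\); a numpy check of both values:

```python
import numpy as np

A = np.array([[1, 2], [3, 0]])
I = np.eye(2)

# Substitute A for t; the constant term becomes a multiple of the identity.
fA = 2 * (A @ A) - A        # f(t) = 2t^2 - t
gA = -(A @ A) + A - I       # g(t) = -t^2 + t - 1

assert np.allclose(fA, [[13, 2], [3, 12]])
assert np.allclose(gA, -7 * I)   # g(A) turns out to be a scalar matrix
```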
- Show that if \[A = \left[\begin{array}{cc} a_{11} & a_{12} \\ a_{21} & a_{22} \end{array}\right],\] then the characteristic polynomial of \(A\) is \[p_A (t) = t^2 - (a_{11} + a_{22})t + \det(A) = t^2 - \operatorname{tr}(A)\, t + \det (A).\] Using this formula, find the characteristic polynomial of each of the matrices: \[ A_1 = \left[\begin{array}{rr} 5 & 3 \\ 2 & 10 \end{array}\right],\,\,\, A_2 = \left[\begin{array}{rr} 7 & -1 \\ 6 & 2 \end{array}\right],\,\,\, A_3 = \left[\begin{array}{rr} 5 & -2 \\ 4 & -4 \end{array}\right]. \] (See: Schaum's outlines p.294 for the proof, and p.295 Example 9.3.)
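The same formula combines nicely with the Cayley-Hamilton theorem, this day's other topic: each matrix must annihilate its own characteristic polynomial, which the following check confirms for all three matrices:

```python
import numpy as np

def charpoly_2x2(A):
    """Coefficients (1, -tr(A), det(A)) of p_A(t) = t^2 - tr(A) t + det(A)."""
    return (1.0, -np.trace(A), np.linalg.det(A))

for M in ([[5, 3], [2, 10]], [[7, -1], [6, 2]], [[5, -2], [4, -4]]):
    M = np.array(M, dtype=float)
    c2, c1, c0 = charpoly_2x2(M)
    # Cayley-Hamilton: p_A(A) = A^2 - tr(A) A + det(A) I = O.
    assert np.allclose(c2 * (M @ M) + c1 * M + c0 * np.eye(2), 0)
```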
- Find the characteristic polynomial of \[A= \left[\begin{array}{rrr} 1&1&2 \\ 0&3&2 \\ 1&3&9 \end{array}\right] \] (See: Schaum's outlines p.295 Example 9.4.)
- Let \[A = \left[\begin{array}{cc} 3 & 1 \\ 2 & 2 \end{array}\right]. \] Find all eigenvalues of \(A,\) and corresponding eigenvectors.
- Let \(A\) be the matrix in the previous problem. Show that \(A\) is diagonalizable, and find the diagonalization of \(A,\) that is, find a diagonal matrix \(D\) and an invertible matrix \(P\) satisfying \(D = P^{-1} A P.\)
(See: Schaum's outlines p.297 Example 9.5.)
- Let \[A = \left[\begin{array}{cr} 3 & -4 \\ 2 & -6 \end{array}\right]. \] Find all eigenvalues of \(A,\) and corresponding eigenvectors.
- Let \(A\) be the matrix in the previous problem. Find a diagonal matrix \(D\) and an invertible matrix \(P\) satisfying \(D = P^{-1} AP.\) Furthermore, find \(A^n\) for a natural number \(n.\)
(See: Schaum's outlines p.309 Problem 9.9.)
- Let \[A = \left[\begin{array}{cr} 2 & 2 \\ 1 & 3 \end{array}\right]. \] Find a nonsingular matrix \(P\) such that \(D = P^{-1} AP\) is diagonal. (See: Schaum's outlines p.309 Problem 9.10.)
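For this matrix the characteristic polynomial is \(t^2 - 5t + 4 = (t-1)(t-4),\) with eigenvectors \((2,\,-1)\) and \((1,\,1);\) the resulting \(P\) (one of many valid choices, since eigenvectors can be rescaled) is confirmed by numpy:

```python
import numpy as np

A = np.array([[2, 2], [1, 3]], dtype=float)
# Eigenvectors (2, -1) for eigenvalue 1 and (1, 1) for eigenvalue 4,
# placed as the columns of P in the same order as the entries of D.
P = np.array([[2, 1], [-1, 1]], dtype=float)
D = np.diag([1.0, 4.0])

assert np.allclose(np.linalg.inv(P) @ A @ P, D)
```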
- Let \(A\) be the matrix in the previous problem. Find \(f(A)\) for \(f(t) = t^{12} - t^6 + t^3 .\)
- Let \(\left\{ f_n \right\}\) be the Fibonacci sequence, that is
\[f_1 = f_2 = 1 ,\,\, f_{n+2} = f_n + f_{n+1} \, (n \ge 1).\]
Use matrix diagonalization to find a closed form for the general term of the Fibonacci sequence.
(Hint: Matrix Diagonalization and the Fibonacci Numbers.)
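A numerical version of the hint: the powers of \(M = \left[\begin{smallmatrix}1&1\\1&0\end{smallmatrix}\right]\) contain the Fibonacci numbers, so diagonalizing \(M\) and using \(M^n = P D^n P^{-1}\) reads off \(f_n\) (and, symbolically, yields Binet's formula):

```python
import numpy as np

# M^n = [[f_{n+1}, f_n], [f_n, f_{n-1}]]; the eigenvalues of M are
# (1 ± sqrt 5)/2, and M^n = P D^n P^{-1} gives a closed form for f_n.
M = np.array([[1, 1], [1, 0]], dtype=float)
eigvals, P = np.linalg.eig(M)
Pinv = np.linalg.inv(P)

def fib(n):
    Dn = np.diag(eigvals ** n)
    return (P @ Dn @ Pinv)[0, 1]   # the f_n entry of M^n

seq = [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
assert all(round(fib(n + 1)) == f for n, f in enumerate(seq))
```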
Day 5.
The problems for the fifth day are related to Eigenvalues and Eigenvectors, Cayley-Hamilton Theorem and Jordan Normal Form.
- Let \(T\) be an endomorphism. Show that the scalar \(0\) is an eigenvalue of \(T\) if and only if \(T\) is singular (non-invertible).
(See: Schaum's outlines p.314 Problem 9.18.)
- Let \(T\) be an endomorphism. Show that if \(\lambda\) is an eigenvalue of \(T\) and \(T\) is invertible, then \(\lambda^{-1}\) is an eigenvalue of \(T^{-1}.\) (See: Schaum's outlines p.314 Problem 9.18.)
- Let \(V\) be a finite dimensional inner product space, \(B = \left\{ u_1 ,\, u_2 ,\, \cdots ,\, u_n \right\}\) be an orthonormal basis for \(V\) and \(T\) be an endomorphism of \(V.\) (i) Describe the definitions of a Hermitian transformation and a Hermitian matrix. (ii) Show that \(T\) is a Hermitian transformation if and only if the matrix \(A = \left(a_{ij}\right)\) of \(T\) with respect to \(B\) is a Hermitian matrix.
(See this article, Theorem 7.)
- Let \(U\) be a unitary transformation on \(V,\) and let \(W\) be a subspace invariant under \(U.\) Show that \(W^\bot\) is also invariant under \(U.\)
(See: Schaum's outlines p.387 Problem 13.11.)
- Describe the definition of the multiplicity of an eigenvalue.
- Describe the definition of the characteristic subspace of \(V\) with respect to an endomorphism \(T.\)
- Describe the definitions of nilpotent and index of nilpotency.
- Describe the definitions of diagonal part and nilpotent part.
- Describe the Jordan Normal Form theorem.
- Search for and describe: Primary decomposition of a vector space.