This set of exercises is taken from the fourth chapter of Linear Algebra by Robert J. Valenza. Note that these solutions are not fully elaborated; you will have to fill in some of the details yourself.
Problem 4.1
Let \(v_1 ,\) \(\cdots ,\) \(v_n \) be a linearly independent family in a vector space \(V.\) Show that if \(i\ne j,\) then \(v_i \ne v_j .\) In other words, a linearly independent family can never contain a repeated vector.
Solution. Suppose not; that is, suppose that \(v_i = v_j\) for some \(i \ne j.\) Then \[0v_1 + \cdots + 1 v_i + \cdots + (-1)v_j + \cdots + 0v_n = v_i - v_j = \mathbf{0}.\] Since not all coefficients vanish, this is a nontrivial linear relation, so \(v_1 ,\) \(\cdots ,\) \(v_n \) are not linearly independent, a contradiction.
Problem 4.2
Show that the following vectors are linearly independent in \(\mathbb{R}^3:\)
\[(1,\,1,\,1) \,\, \text{and} \,\, (0,\,2,\,5).\]
Solution. Suppose that \[a (1,\,1,\,1) + b(0,\,2,\,5) = (0,\,0,\,0)\] for scalars \(a\) and \(b.\) This equation yields \[(a,\,a+2b ,\, a+5b) = (0,\,0,\,0)\] or \[\begin{cases} a=0 \\[6pt] a+2b = 0 \\[6pt] a+5b = 0 . \end{cases}\] This system has only the solution \(a=0 ,\) \(b=0,\) so the two vectors are linearly independent.
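As a quick numerical cross-check (not a substitute for the proof above), one can verify with NumPy that the matrix having these two vectors as rows has rank \(2:\)

```python
import numpy as np

# Rows are the two given vectors; rank 2 means they are linearly independent.
A = np.array([[1, 1, 1],
              [0, 2, 5]], dtype=float)
print(np.linalg.matrix_rank(A))  # 2
```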
Problem 4.3
Show that the following functions are linearly independent in \(C^0 ( \mathbb{R} ):\)
\[\sin x \,\, \text{and} \,\, \cos x .\]
Solution. Suppose that \[a \sin x + b \cos x = 0 \tag{*}\] for scalars \(a\) and \(b,\) and for all real numbers \(x.\)
Suppose further that \(a \ne 0\) or \(b\ne 0.\) Then there exists a unique \(\theta\) such that \[\cos \theta = \frac{a}{\sqrt{a^2 + b^2}} ,\,\, \sin \theta = \frac{b}{\sqrt{a^2 + b^2}} ,\,\, 0 \le \theta < 2\pi .\] Dividing (*) by \(\sqrt{a^2 + b^2}\) gives \[\cos\theta \sin x + \sin\theta \cos x = 0\] or \[\sin(x+\theta ) = 0 .\] But this holds only when \(x+\theta\) is an integer multiple of \(\pi,\) not for all \(x.\) Therefore both \(a\) and \(b\) must equal zero.
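Alternatively, evaluating (*) at \(x = 0\) and \(x = \pi /2\) gives \(b = 0\) and \(a = 0\) immediately. A minimal sketch of that shortcut in SymPy:

```python
import sympy as sp

a, b, x = sp.symbols('a b x')
f = a * sp.sin(x) + b * sp.cos(x)
# If f vanishes for all x, it vanishes in particular at x = 0 and x = pi/2.
print(sp.solve([f.subs(x, 0), f.subs(x, sp.pi / 2)], [a, b]))  # {a: 0, b: 0}
```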
Problem 4.4
Give an example of a basis for \(\mathbb{R}^2\) other than the canonical basis.
Solution. Take \((1,\,1)\) and \((1,\,2),\) for example. Neither vector is a scalar multiple of the other, so they are linearly independent, and two linearly independent vectors in \(\mathbb{R}^2\) form a basis.
Problem 4.5
Show that each vector \(x\in\mathbb{R}^3\) can be expressed uniquely as a linear combination of the following vectors:
\[u = (1,\,1,\,1) ,\,\, v= (-1,\,1,\,0),\,\, w=(2,\,0,\,0).\]
Conclude that \(u,\) \(v\) and \(w\) constitute a basis for \(\mathbb{R}^3.\)
Solution. Let \(x = (x_1 ,\, x_2 ,\, x_3 )\) be given. Take \[\begin{align} a &= x_3 , \\[7pt] b &= x_2 - x_3 ,\\[7pt] c &= \frac{1}{2} x_1 + \frac{1}{2} x_2 - x_3 , \end{align}\] then \(x\) is expressed as \[x = au + bv + cw, \tag{*}\] that is, \(u,\) \(v\) and \(w\) span \(\mathbb{R}^3.\)
It remains to prove the uniqueness of the expression (*). Suppose \(a ' ,\) \(b ' \) and \(c ' \) are scalars such that \[x = a ' u + b ' v + c ' w .\tag{**}\] Subtracting (**) from (*) produces \[(a - a ' ) u + (b - b ' ) v + ( c - c ' ) w = \mathbf{0};\] solving this equation as a linear system, we have \[a - a ' = b - b ' = c - c ' = 0,\] which yields the uniqueness of the expression (*).
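For a concrete sanity check, one can solve the linear system numerically and compare against the closed formulas above; a minimal NumPy sketch (the test vector is arbitrary):

```python
import numpy as np

# Columns of M are u, v, w, so M @ (a, b, c) = x.
M = np.array([[1, -1, 2],
              [1,  1, 0],
              [1,  0, 0]], dtype=float)
x = np.array([4.0, 2.0, 1.0])  # an arbitrary test vector
a, b, c = np.linalg.solve(M, x)
# Compare with a = x3, b = x2 - x3, c = x1/2 + x2/2 - x3.
expected = (x[2], x[1] - x[2], 0.5 * x[0] + 0.5 * x[1] - x[2])
print(np.allclose((a, b, c), expected))  # True
```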
Problem 4.6
Let \(V\) be the vector space of real polynomials in the indeterminate \(x\) of degree less than or equal to \(2.\) Given that the polynomials
\[1,\,\, 1+x \,\, \text{and} \,\,1-x^2 \]
constitute a basis \(B\) for \(V,\) find the coordinate of the following polynomial relative to this basis:
\[1 - 2x +5x^2 .\]
In other words, compute \(\gamma_B\) for this polynomial.
Solution. Let \(a,\) \(b\) and \(c\) be scalars satisfying \[1-2x+5x^2 = a\cdot 1 +b(1+x) + c(1-x^2 ).\] From this equation, we have \[1-2x+5x^2 = (a+b+c) + bx + (-c)x^2\] or \[\begin{cases} a+b+c=1 \\[6pt] b=-2 \\[6pt] -c =5, \end{cases}\] that is, \(a=8,\) \(b=-2\) and \(c=-5.\) Hence \(\gamma_B (1-2x+5x^2 ) = (8,\,-2,\,-5).\)
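The coefficient comparison can also be automated; a short SymPy check confirms the coordinates:

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')
# a*1 + b*(1+x) + c*(1-x^2) should equal 1 - 2x + 5x^2 identically in x.
residual = a + b * (1 + x) + c * (1 - x**2) - (1 - 2*x + 5*x**2)
# Setting every coefficient of the residual to zero recovers a, b, c.
print(sp.solve(sp.Poly(residual, x).all_coeffs(), [a, b, c]))
# {a: 8, b: -2, c: -5}
```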
Problem 4.7
Let \(V\) be the subspace of \(C^0 ( \mathbb{R} )\) spanned by the functions \(e^x\) and \(e^{2x} .\) Show that these functions constitute a basis for \(V.\) What is the value of the associated coordinate map for the function \(-2e^x + 5e^{2x} ? \)
Solution. We only have to show that \(e^x\) and \(e^{2x}\) are linearly independent. Suppose \(a\) and \(b\) are scalars satisfying \[ae^x + be^{2x}=0 \tag{*}\] for all real \(x.\) Suppose further that \(a \ne 0\) or \(b \ne 0.\) If one of \(a\) and \(b\) is zero, then so is the other; thus in this case we have \(a \ne 0\) AND \(b \ne 0.\) From (*), \[ae^x = -be^{2x}\] or \[a = -be^x.\tag{**}\] But the LHS of (**) is constant while the RHS is not, which is a contradiction. Therefore \(a = b = 0\) and we conclude that \(e^x\) and \(e^{2x}\) are linearly independent.
The value of the associated coordinate map for the function \(-2e^x + 5e^{2x}\) is trivially \((-2 ,\, 5).\)
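One can also see the independence numerically: evaluating \(ae^x + be^{2x} = 0\) at \(x=0\) and \(x=1\) yields a \(2\times 2\) system whose determinant is nonzero, forcing \(a=b=0.\) A minimal sketch:

```python
import numpy as np

# Rows: the relation a*e^x + b*e^{2x} = 0 evaluated at x = 0 and x = 1.
M = np.array([[1.0, 1.0],
              [np.e, np.e**2]])
print(np.linalg.det(M))  # e^2 - e ~ 4.67, nonzero, so a = b = 0
```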
Problem 4.8
Let \(a_1 ,\) \(a_2 ,\) \(a_3 ,\) \(a_4\) be nonzero real numbers. Show that the following set of vectors constitutes a basis \(B\) for \(\mathbb{R}^4 :\)
\[\begin{gather} v_1 = (a_1 ,\,0,\,0,\,0) ,\,\, v_2 =(a_1 ,\,a_2 ,\,0,\,0) ,\\[6pt]
v_3=(a_1 ,\,a_2,\,a_3,\,0),\,\, v_4=(a_1,\,a_2,\,a_3,\,a_4 ).\end{gather}\]
Find a general formula for the effect of the coordinate map \(\gamma_B\) on a vector \((x_1 ,\,x_2,\,x_3,\,x_4 ) \in \mathbb{R}^4 .\) (This yields a modest example of a lower triangular system of equations: note how easily it is solved.)
Solution. Let \(x=(x_1 ,\,x_2 ,\,x_3,\,x_4)\in\mathbb{R}^4\) be given. Writing \(x = av_1 + bv_2 + cv_3 + dv_4\) and comparing coordinates gives the triangular system \[a_1 (a+b+c+d) = x_1 ,\,\, a_2 (b+c+d) = x_2 ,\,\, a_3 (c+d) = x_3 ,\,\, a_4 d = x_4 ,\] which is solved from the bottom up: \[a = \frac{x_1}{a_1} - \frac{x_2}{a_2} ,\,\, b = \frac{x_2}{a_2} - \frac{x_3}{a_3} ,\,\, c = \frac{x_3}{a_3} - \frac{x_4}{a_4} ,\,\, d = \frac{x_4}{a_4}.\] Therefore \(v_1 ,\) \(v_2 ,\) \(v_3 \) and \(v_4\) span \(\mathbb{R}^4,\) and \[\gamma_B (x) = \left( \frac{x_1}{a_1} - \frac{x_2}{a_2} ,\,\, \frac{x_2}{a_2} - \frac{x_3}{a_3} ,\,\, \frac{x_3}{a_3} - \frac{x_4}{a_4} ,\,\, \frac{x_4}{a_4} \right).\] Linear independence is clear: the matrix with rows \(v_1 ,\) \(\cdots ,\) \(v_4\) is lower triangular with nonzero diagonal entries \(a_1 ,\) \(a_2 ,\) \(a_3 ,\) \(a_4 ,\) hence has nonzero determinant.
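A quick randomized check of the formula for \(\gamma_B\) (the nonzero scalars and the test vector are arbitrary; the seed is incidental):

```python
import numpy as np

rng = np.random.default_rng(0)
a1, a2, a3, a4 = rng.uniform(1, 2, size=4)  # arbitrary nonzero scalars
V = np.array([[a1, 0, 0, 0],
              [a1, a2, 0, 0],
              [a1, a2, a3, 0],
              [a1, a2, a3, a4]])  # rows are v1, ..., v4
x = rng.uniform(-1, 1, size=4)
# Coordinates from the closed formula derived above.
coords = np.array([x[0]/a1 - x[1]/a2, x[1]/a2 - x[2]/a3,
                   x[2]/a3 - x[3]/a4, x[3]/a4])
print(np.allclose(coords @ V, x))  # True: sum of coords[i] * v_i recovers x
```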
Problem 4.9
Extend the following linearly independent set to a basis for \(\mathbb{R}^3:\)
\[v_1=(1,\,0,\,1) \,\, \text{and} \,\, v_2=(1,\,1,\,0).\]
Be sure to establish that your answer indeed constitutes a basis.
Solution. Take \(v_3 = (0,\,1,\,1).\) The matrix with rows \(v_1 ,\) \(v_2 ,\) \(v_3\) has determinant \(2 \ne 0,\) so these vectors are linearly independent. Let \(x=(x_1 ,\,x_2,\,x_3) \in \mathbb{R}^3\) be given. Take \[\begin{gather} a = \frac{1}{2} (x_1 - x_2 + x_3 ), \\[5pt] b = \frac{1}{2} (x_1 + x_2 - x_3 ), \\[5pt] c = \frac{1}{2} (-x_1 + x_2 + x_3 ), \end{gather}\] then we have \[x= av_1 + bv_2 + cv_3,\] that is, \(v_1,\) \(v_2\) and \(v_3\) span \(\mathbb{R}^3.\)
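The determinant claim is easy to confirm numerically; a one-line NumPy check:

```python
import numpy as np

B = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1]], dtype=float)  # rows v1, v2, v3
print(np.linalg.det(B))  # 2.0 (nonzero), so v1, v2, v3 form a basis of R^3
```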
Problem 4.10
Give two examples of a real vector space of dimension \(4.\)
Solution. The first example is the space consisting of real polynomials of degree less than or equal to \(3.\) This space has the basis \(\left\{ 1,\, x ,\, x^2 ,\, x^3 \right\}.\)
The second is the space spanned by \(e^x ,\) \(e^{2x},\) \(e^{3x},\) \(e^{4x};\) these four functions are linearly independent, by an argument similar to that of Problem 4.7.
Problem 4.11
Give an example of an infinite-dimensional real vector space.
Solution. Let \(S\) be the space of real sequences with all but finitely many terms equal to zero. Define the sequence \(b^{(n)} = \left\{ b^{(n)}_k \right\} \) by \(b^{(n)}_k = \delta_k^n,\) the Kronecker delta. Then \[b^{(1)},\, b^{(2)},\, b^{(3)},\,\cdots\] constitute a basis for \(S.\)
Problem 4.12
Given that the solution space to the differential equation
\[y ' ' - 2 y ' + y = 0\]
is a subspace of \(C^2 (\mathbb{R})\) of dimension \(2,\) show that the functions
\[e^x \,\,\text{and}\,\, xe^x\]
constitute a basis for this subspace. What, then, is the general solution to this equation?
Solution. The given equation is a second-order linear homogeneous differential equation. Take \(y=e^{\lambda x} ;\) then \[y ' = \lambda e^{\lambda x} \,\,\text{and}\,\, y ' ' = \lambda^2 e^{\lambda x}.\] Substituting these into the differential equation, we obtain \[\lambda^2 e^{\lambda x} - 2\lambda e^{\lambda x} + e^{\lambda x} = 0.\] Factoring gives \[e^{\lambda x} (\lambda -1 )^2 = 0,\] that is, \(\lambda = 1\) is a double root. One checks directly that \(e^x\) and \(xe^x\) both satisfy the equation, and they are linearly independent: if \(c_1 e^x + c_2 xe^x = 0\) for all \(x,\) then dividing by \(e^x\) gives \(c_1 + c_2 x = 0,\) so \(c_1 = c_2 = 0.\) Since the solution space has dimension \(2,\) these two functions constitute a basis, and the general solution is \[y = c_1 e^x +c_2 xe^x\] for constants \(c_1\) and \(c_2 .\)
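For what it's worth, a computer algebra system reproduces the same general solution; a minimal SymPy sketch:

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')
# Solve y'' - 2y' + y = 0 symbolically.
sol = sp.dsolve(y(x).diff(x, 2) - 2 * y(x).diff(x) + y(x), y(x))
print(sol)  # Eq(y(x), (C1 + C2*x)*exp(x))
```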
Problem 4.13
Let \(T:V \rightarrow V ' \) be a linear transformation of real vector spaces. Show that the solution set to the equation \(T(v)=\mathbf{0}\) consists of either a single element or infinitely many elements, according to the dimension of the kernel of \(T.\)
Solution. The solution set is exactly the kernel of \(T.\) If the dimension of the kernel of \(T\) equals zero, then the set consists of the zero vector only. If the dimension of the kernel of \(T\) is positive, then there is a nonzero vector \(v\) satisfying \(T(v) = \mathbf{0},\) and every vector of the form \(\lambda v,\) where \(\lambda\) is a scalar, belongs to the kernel; since the scalars range over the infinite field \(\mathbb{R},\) the kernel consists of infinitely many vectors.
Problem 4.14
Let \(T:V \rightarrow V ' \) be a linear transformation of real vector spaces and let \(v ' \) be an arbitrary element of \(V ' .\) Show that the solution set to the equation \(T(v) = v ' \) is either empty, or consists of a single element, or consists of infinitely many elements.
Solution. If \(v ' \) is not in the image of \(T,\) then the solution set is empty.
Now suppose that \(T(v) = v ' \) has at least one solution, say \(u.\) If \(\operatorname{dim}(\operatorname{Ker}(T))=0,\) then \(u\) is the only solution of the equation. If \(\operatorname{dim}(\operatorname{Ker}(T)) > 0,\) then every vector of the form \(u+w,\) where \(w\) belongs to the kernel of \(T,\) is a solution; since the kernel is infinite (Problem 4.13), \(T(v)=v ' \) has infinitely many solutions.
Problem 4.15
Let \(T:V \rightarrow V '\) be a surjective linear transformation. Show that if \(v_1 ,\) \(\cdots,\) \(v_n\) span \(V,\) then \(T(v_1),\) \(\cdots,\) \(T(v_n )\) span \(V ' .\)
Solution. Let \(v ' \in V ' \) be given. Since \(T\) is surjective, there exists \(v\in V\) such that \(T(v) = v ' .\) Since \(v_1 ,\) \(\cdots,\) \(v_n\) span \(V,\) we have \[v = a_1 v_1 + \cdots + a_n v_n\] for some scalars \(a_1,\) \(\cdots,\) \(a_n.\) Hence, from the linearity of \(T,\) we have \[v ' = T(v) = T(a_1 v_1 + \cdots + a_n v_n) = a_1 T(v_1) + \cdots + a_n T(v_n),\] that is, \(v '\) belongs to the span of \(T(v_1),\) \(\cdots ,\) \(T(v_n).\)
Problem 4.16
Let \(T:V \rightarrow V '\) be an injective linear transformation. Show that if \(v_1 ,\) \(\cdots,\) \(v_n\) is a linearly independent family in \(V,\) then \(T(v_1),\) \(\cdots,\) \(T(v_n)\) is a linearly independent family in \(V ' .\)
Solution. Suppose \(a_1,\) \(\cdots ,\) \(a_n\) are scalars satisfying \[a_1 T(v_1) + \cdots + a_n T(v_n ) = \mathbf{0}.\] Since \(T\) is linear, we have \[T(a_1 v_1 + \cdots + a_n v_n) = a_1 T(v_1) + \cdots + a_n T(v_n ) = \mathbf{0} = T(\mathbf{0}),\] and since \(T\) is injective, we have \[a_1 v_1 + \cdots + a_n v_n = \mathbf{0}.\] But since \(v_1,\) \(\cdots,\) \(v_n\) are linearly independent, this equality holds only if \[a_1 = \cdots = a_n = 0.\]
Problem 4.17
Let \(T : V \rightarrow V ' \) be an isomorphism of vector spaces. Show that if \(v_1,\) \(\cdots,\) \(v_n\) is a basis for \(V,\) then \(T(v_1),\) \(\cdots,\) \(T(v_n)\) is a basis for \(V ' .\)
Solution. An isomorphism is both injective and surjective. By Problem 4.15 the images \(T(v_1),\) \(\cdots,\) \(T(v_n)\) span \(V ' ,\) and by Problem 4.16 they are linearly independent; hence they constitute a basis.
Problem 4.18
Let \(V\) be a finitely-generated vector space over \(K\) and let \(S\) be a (possibly infinite) spanning set for \(V.\) Show that there exists a finite subset of \(S\) that also spans \(V.\)
Solution. If \(S\) is finite, there is nothing to prove. Hence assume that \(S\) is infinite. Take a nonzero vector \(v_1\) in \(S.\) If \(v_1\) spans \(V,\) the proof is complete. If not, then since \(S\) spans \(V,\) some element of \(S\) lies outside the span of \(v_1 ;\) take such a vector \(v_2 \in S,\) so that \(v_1,\) \(v_2\) are linearly independent. If \(v_1,\) \(v_2\) span \(V,\) the proof is complete. If not, take \(v_3\in S\) outside the span of \(v_1 ,\) \(v_2 ,\) so that \(v_1,\) \(v_2,\) \(v_3\) are linearly independent, and continue in this fashion. The process must terminate after finitely many steps, for otherwise we would obtain arbitrarily long linearly independent families \(v_1 ,\) \(v_2 ,\) \(\cdots ,\) contradicting the fact that in a finitely-generated vector space the size of a linearly independent family cannot exceed the size of a finite spanning set. When the process stops, the chosen vectors form a finite subset of \(S\) that spans \(V.\)
Solution (alternative). Since \(V\) is finitely-generated, there are finitely many vectors \(w_1,\) \(\cdots,\) \(w_n\) that span \(V.\) Since \(S\) spans \(V,\) each \(w_j\) can be expressed as a linear combination of finitely many elements of \(S;\) let \(W_j \subseteq S\) be this finite set of vectors. Take \(W = W_1 \cup \cdots \cup W_n ;\) then \(W\) is a finite subset of \(S\) and every \(w_j\) lies in the span of \(W.\) Hence \(W\) spans \(V.\)
Problem 4.19
Suppose that \(T\) is a linear transformation from a vector space \(V\) of dimension \(3\) to a vector space \(V ' \) of dimension \(2.\) Use the Rank-Nullity Theorem to show that \(T\) is not injective.
Solution. If \(T\) were injective, then \(\operatorname{dim}(\operatorname{Ker}(T))=0.\) Thus, by the Rank-Nullity Theorem, we would have \[\begin{align} 2 &= \operatorname{dim}(V ' ) \\[7pt] &\ge \operatorname{dim}(\operatorname{Im}(T)) \\[7pt] &= \operatorname{dim}(V) - \operatorname{dim}(\operatorname{Ker}(T)) = 3, \end{align}\] which is impossible. Therefore \(T\) cannot be injective.
Problem 4.20
Suppose that \(T\) is a linear transformation from a vector space \(V\) of dimension \(3\) to a vector space \(V ' \) of dimension \(4.\) Use the Rank-Nullity Theorem to show that \(T\) is not surjective.
Solution. If \(T\) were surjective, then by the Rank-Nullity Theorem we would have \[\begin{align} 4 &= \operatorname{dim}(V ' ) \\[7pt] &= \operatorname{dim}(\operatorname{Im}(T)) \\[7pt] &= \operatorname{dim}(V) - \operatorname{dim}(\operatorname{Ker}(T)) \\[7pt] &\le \operatorname{dim}(V) = 3, \end{align}\] which is impossible.
Problem 4.21
Let \(V\) be the vector space of real polynomials of degree less than or equal to \(2\) in the indeterminate \(x,\) and consider the linear transformation
\[T : V \rightarrow V ,\quad p \mapsto \frac{d^2 p}{dx^2} - p .\]
Show that the kernel of \(T\) is trivial. Deduce from this that the map \(T\) is an isomorphism.
Solution. Let \(p = ax^2 + bx +c \in V;\) then a direct calculation shows that \[T(p) = -ax^2 - bx + 2a-c.\] Suppose \[T(p) = -ax^2 - bx +2a-c = 0;\] then we have \[\begin{cases} -a =0 \\[5pt] -b =0 \\[5pt] 2a-c = 0 \end{cases}\] or, equivalently, \(a=b=c=0,\) that is, \(p=0.\) Therefore the kernel of \(T\) is trivial and \(T\) is injective. Since \(T\) maps the finite-dimensional space \(V\) to itself, the Rank-Nullity Theorem gives \(\operatorname{dim}(\operatorname{Im}(T)) = \operatorname{dim}(V) - 0 = \operatorname{dim}(V),\) so \(T\) is also surjective, hence an isomorphism.
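The computation of \(T(p)\) and the triviality of the kernel can be double-checked symbolically; a sketch in SymPy:

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')
p = a * x**2 + b * x + c
Tp = sp.expand(sp.diff(p, x, 2) - p)   # T(p) = p'' - p
print(Tp)                              # -a*x**2 - b*x + 2*a - c
# T(p) = 0 as a polynomial forces a = b = c = 0, i.e. the kernel is trivial.
print(sp.solve(sp.Poly(Tp, x).all_coeffs(), [a, b, c]))  # {a: 0, b: 0, c: 0}
```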
Problem 4.22
Let \(T:V \rightarrow W\) be a linear transformation and assume that \(\operatorname{dim}(V) = 6\) while \(\operatorname{dim}(W) =4.\) What are the possible dimensions for \(\operatorname{Ker}(T)?\) Can \(T\) be injective? Why or why not?
Solution. By the Rank-Nullity Theorem, \(\operatorname{dim}(\operatorname{Ker}(T)) = 6 - \operatorname{dim}(\operatorname{Im}(T)),\) and \(\operatorname{dim}(\operatorname{Im}(T)) \le \operatorname{dim}(W) = 4;\) hence \(2 \le \operatorname{dim}(\operatorname{Ker}(T)) \le 6.\) In particular \(T\) cannot be injective, since injectivity would force the nullity to be zero.
Problem 4.23
Let \(V\) be the vector space of real polynomials of degree less than or equal to \(2.\) Define \(T:V \rightarrow \mathbb{R}\) by
\[T(p) = \int_{-1}^{1} p(x) dx.\]
Show that \(T\) is linear. What is the dimension of the kernel of \(T?\)
Solution. Let \(p = ax^2 +bx +c \in V,\) then a direct calculation shows that \[T(p) = \frac{2}{3} a + 2c.\tag{*}\] Let \(p_1 = a_1 x^2 + b_1 x + c_1\) and \(p_2 = a_2 x^2 + b_2 x + c_2\) be vectors in \(V\) and \(\lambda\) be a scalar. Then \[\begin{align} T(p_1 + \lambda p_2) &= T((a_1 + \lambda a_2)x^2 + (b_1 + \lambda b_2 )x + (c_1 + \lambda c_2)) \\[6pt] &= \frac{2}{3}(a_1 + \lambda a_2) + 2(c_1 + \lambda c_2) \\[6pt] &= \left( \frac{2}{3} a_1 + 2 c_1 \right) + \lambda \left( \frac{2}{3} a_2 + 2c_2 \right) \\[6pt] &= T(p_1 ) + \lambda T(p_2). \end{align}\] Therefore \(T\) is linear.
From (*), we see that every element of the kernel of \(T\) is a function of the form \[p(x) = -3cx^2 + bx +c .\] Hence the kernel is generated by the two vectors \[p_1 (x) = -3x^2 +1 ,\quad p_2 (x) = x .\] These vectors are linearly independent; therefore the dimension of the kernel is \(2.\)
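Both the formula (*) and the kernel basis can be verified by direct integration; a short SymPy check:

```python
import sympy as sp

x, a, b, c = sp.symbols('x a b c')
# T(p) for a general quadratic: should be 2a/3 + 2c.
print(sp.integrate(a * x**2 + b * x + c, (x, -1, 1)))  # 2*a/3 + 2*c
# The claimed kernel generators integrate to zero.
for p in (-3 * x**2 + 1, x):
    print(sp.integrate(p, (x, -1, 1)))  # 0, then 0
```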
Problem 4.24
Let \(V\) and \(V ' \) be finite-dimensional vector spaces over a common field and suppose that \(\operatorname{dim}(V) \ge \operatorname{dim}(V ' ).\) Show that there exists a surjective linear transformation from \(V\) to \(V ' .\)
Solution. Let the dimensions of \(V\) and \(V ' \) be \(m\) and \(n\) respectively, so that \(m \ge n.\) Let \[v_1 ,\, \cdots ,\, v_m\] be a basis for \(V\) and \(w_1 ,\) \(\cdots ,\) \(w_n\) a basis for \(V ' .\) Define a linear transformation \(T : V \rightarrow V ' \) by \[T(v) = a_1 w_1 + a_2 w_2 + \cdots + a_n w_n\] for \(v = a_1 v_1 + \cdots + a_m v_m\) in \(V;\) that is, \(T\) discards the last \(m-n\) coordinates. Then \(T\) is linear, and it is surjective since every basis vector \(w_j = T(v_j )\) lies in its image.
Problem 4.25
Let \(V\) be a vector space with finite-dimensional subspaces \(W_0\) and \(W_1\) such that \(V = W_0 \oplus W_1 .\) Suppose further that \(u_1,\) \(\cdots,\) \(u_n\) is a basis for \(W_0\) and \(w_1,\) \(\cdots,\) \(w_m\) is a basis for \(W_1.\) Show that \(V\) is likewise finite-dimensional and that \(u_1,\) \(\cdots,\) \(u_n,\) \(w_1,\) \(\cdots,\) \(w_m\) is a basis for \(V.\)
Solution. If \(v\in V,\) then \(v= u+w\) for some \(u\in W_0\) and \(w\in W_1.\) Since \(u\) is a linear combination of \(u_1 ,\) \(\cdots,\) \(u_n\) and \(w\) is a linear combination of \(w_1,\) \(\cdots,\) \(w_m,\) \(v=u+w\) is a linear combination of \(u_1,\) \(\cdots,\) \(u_n,\) \(w_1,\) \(\cdots,\) \(w_m.\)
It remains to show that \(u_1,\) \(\cdots,\) \(u_n,\) \(w_1,\) \(\cdots,\) \(w_m\) are linearly independent. Suppose \[a_1 u_1 + \cdots + a_n u_n + a_{n+1}w_1 + \cdots + a_{n+m}w_m = 0.\] Since \[0+0=0 ,\quad 0\in W_0 ,\quad 0\in W_1,\] by the uniqueness of the expression in a direct sum, we have \[a_1 u_1 + \cdots + a_n u_n = 0\tag{*}\] and \[a_{n+1}w_1 + \cdots + a_{n+m}w_m = 0.\tag{**}\] Since \(u_1,\) \(\cdots,\) \(u_n\) are linearly independent, from (*) we have \[a_1 = \cdots = a_n = 0;\] likewise, since \(w_1,\) \(\cdots,\) \(w_m\) are linearly independent, from (**) we have \[a_{n+1} = \cdots = a_{n+m} = 0.\] Hence \(u_1,\) \(\cdots,\) \(u_n,\) \(w_1,\) \(\cdots,\) \(w_m\) are linearly independent.
Problem 4.26
Let \(V\) be a finite-dimensional vector space over a field \(K\) and assume that \(V\) has basis \(v_1,\) \(\cdots ,\) \(v_n .\) Show that
\[V = Kv_1 \oplus Kv_2 \oplus \cdots \oplus Kv_n\]
where \(Kv_j\) denotes the subspace spanned by the single vector \(v_j\) in \(V.\)
Solution. Let \(v\in V ;\) then \[v = a_1 v_1 + \cdots + a_n v_n \tag{*}\] for some scalars \(a_1,\) \(\cdots,\) \(a_n,\) and since \(a_j v_j \in Kv_j\) for each \(j,\) this shows \[V = Kv_1 + Kv_2 + \cdots + Kv_n.\] It remains to check that the sum is direct, that is, that the expression (*) is unique. If also \(v = b_1 v_1 + \cdots + b_n v_n ,\) then \[(a_1 - b_1 ) v_1 + \cdots + (a_n - b_n ) v_n = \mathbf{0},\] and the linear independence of \(v_1,\) \(\cdots,\) \(v_n\) forces \(a_j = b_j\) for every \(j.\)
Problem 4.27
Let \(V\) be a vector space of dimension \(1\) over a field \(K\) and choose a fixed nonzero element \(v_0 \in V,\) which is therefore a basis. Let \(W\) be any vector space over \(K\) and let \(w_0 \in W\) be an arbitrary vector. Show that there is a unique linear transformation \(T : V \rightarrow W\) such that \(T(v_0) = w_0.\)
Solution. Every \(v\in V\) can be written as \(kv_0\) for a unique scalar \(k,\) so we may define \(T(kv_0 ) = kw_0 ;\) this \(T\) is linear and sends \(v_0\) to \(w_0 .\) Uniqueness is immediate: any linear transformation \(T ' \) with \(T ' (v_0 ) = w_0\) satisfies \(T ' (kv_0 ) = kT ' (v_0 ) = kw_0 = T(kv_0 ).\)
Problem 4.28
Let \(V\) be a finite-dimensional vector space over \(K\) and let \(W\) be any vector space over \(K.\) Suppose that \(v_1,\) \(\cdots ,\) \(v_n\) is a basis for \(V\) and that \(w_1,\) \(\cdots ,\) \(w_n\) is an arbitrary family in \(W.\) Use the two previous problems and the universal property of internal direct sums to show that there exists a unique linear transformation \(T:V \rightarrow W\) such that \(T(v_j ) = w_j\) for \(j =1,\,\cdots,\,n.\)
Solution. By Problem 4.26 we have \(V = Kv_1 \oplus \cdots \oplus Kv_n ,\) and by Problem 4.27 there is, for each \(j,\) a unique linear transformation \(T_j : Kv_j \rightarrow W\) with \(T_j (v_j ) = w_j .\) The universal property of internal direct sums then yields a unique linear transformation \(T : V \rightarrow W\) restricting to \(T_j\) on each \(Kv_j ;\) explicitly, \[T(a_1 v_1 + \cdots + a_n v_n ) = a_1 w_1 + \cdots + a_n w_n\] for any scalars \(a_1,\) \(\cdots,\) \(a_n .\)
Problem 4.29
Let \(V\) be a finite-dimensional vector space and assume that \(W_0\) is a subspace of \(V.\) Show that there exists a complementary subspace \(W_1\) of \(V\) such that \(V = W_0 \oplus W_1 .\)
Solution. If \(W_0 = \left\{ \mathbf{0} \right\}\) take \(W_1 = V,\) and if \(W_0 = V\) take \(W_1 = \left\{ \mathbf{0} \right\} .\) Otherwise, let \(n=\operatorname{dim}(V)\) and \(m=\operatorname{dim}(W_0 ),\) and let \[u_1 ,\, \cdots ,\, u_m\] be a basis for \(W_0 .\) Extend this family to a basis \[u_1 ,\, \cdots ,\, u_m ,\, v_{m+1} ,\, \cdots ,\, v_n\] for \(V.\) (Prove that such an extension exists!) Take \(W_1\) to be the span of \(v_{m+1},\) \(\cdots,\) \(v_n .\) Then \(W_0 + W_1 = V,\) and the linear independence of the full family gives \(W_0 \cap W_1 = \left\{ \mathbf{0} \right\} ,\) so \(V = W_0 \oplus W_1 .\)
Problem 4.30
Let \(T : V \rightarrow V '\) be a surjective linear transformation of finite-dimensional vector spaces, where \[\operatorname{dim}(V) > \operatorname{dim}(V ' ) \ge 1.\] Show that there exists a subspace \(W\) of \(V\) such that
\[V = \operatorname{Ker}(T) \oplus W \quad \text{with} \quad W \cong V ' .\]
Solution. Let \(n = \operatorname{dim}(V)\) and \(m=\operatorname{dim}(V ' );\) then by the Rank-Nullity Theorem we have \[\operatorname{dim}(\operatorname{Ker}(T)) = n-m.\] Choose a basis \(v_{m+1},\) \(\cdots,\) \(v_n\) for \(\operatorname{Ker}(T)\) and extend it to a basis \(v_1 ,\) \(\cdots,\) \(v_n\) for \(V,\) as in Problem 4.29. Let \(W\) be the span of \(v_1,\) \(\cdots,\) \(v_m ;\) then we have \[V = \operatorname{Ker}(T) \oplus W.\] Let \(u_1 ,\) \(\cdots,\) \(u_m\) be a basis for \(V ' ,\) and define a linear transformation \(L : W \rightarrow V '\) by \[L(a_1 v_1 + \cdots + a_m v_m) = a_1 u_1 + \cdots + a_m u_m\] for scalars \(a_1,\) \(\cdots,\) \(a_m ;\) since \(L\) carries the basis \(v_1 ,\) \(\cdots,\) \(v_m\) of \(W\) onto the basis \(u_1 ,\) \(\cdots ,\) \(u_m\) of \(V ' ,\) it is an isomorphism, that is, \(W \cong V ' .\) (In general, two vector spaces over a common field with the same finite dimension are always isomorphic.)
Problem 4.31
Give an example of an infinite-dimensional vector space \(V\) and a linear transformation \(T:V \rightarrow V\) such that both \(\operatorname{Im}(T)\) and \(\operatorname{Ker}(T)\) are also infinite dimensional.
Solution. Take \(V = \mathbb{R}[x]\) and define \(T\) by \[T(a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4 + \cdots) = a_0 + a_2 x^2 + a_4 x^4 + \cdots\] where \(a_j = 0\) for all but finitely many \(j.\) Then \(\operatorname{Im}(T)\) is the space of even polynomials, spanned by \(1,\) \(x^2 ,\) \(x^4 ,\) \(\cdots ,\) and \(\operatorname{Ker}(T)\) is the space of odd polynomials, spanned by \(x,\) \(x^3 ,\) \(x^5 ,\) \(\cdots ;\) both are infinite-dimensional.
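A small sketch of this \(T\) in SymPy (the helper name even_part is made up for illustration):

```python
import sympy as sp

x = sp.symbols('x')

def even_part(p):
    """Apply T: keep only the even-degree terms of the polynomial p."""
    poly = sp.Poly(p, x)
    return sum(coef * x**exp for (exp,), coef in poly.terms() if exp % 2 == 0)

print(even_part(3 + 5*x - 2*x**2 + 7*x**3))  # -2*x**2 + 3 (the even part)
print(even_part(x + x**3))                   # 0: odd polynomials lie in Ker(T)
```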