This set of exercises is taken from Chapter 7 of Linear Algebra by Robert J. Valenza. Note that these solutions are not fully elaborated; you will need to fill in the details yourself.
Problem 7.1
In \(\mathbb{R}^3,\) compute the inner product of \((1,\,2,\,-1)\) and \((2,\,1,\,4).\) What is the length of each vector? What is the angle between these vectors?
Solution. The lengths of the given vectors are \[\begin{aligned} \lVert (1,\,2,\,-1)\rVert &= \sqrt{1^2 + 2^2 + (-1)^2} = \sqrt{6} ,\\[4pt] \lVert (2,\,1,\,4)\rVert &= \sqrt{2^2 +1^2 +4^2} = \sqrt{21} . \end{aligned}\] Let \(\theta\) be the angle between the two vectors. Since \[\langle (1,\,2,\,-1) \,\vert\, (2,\,1,\,4)\rangle = 2+2-4 = 0,\] we have \[\cos\theta = \frac{0}{\sqrt{6} \sqrt{21}} = 0;\] that is, the two vectors are orthogonal.
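As a quick numerical sanity check, the computation above can be reproduced in a few lines (a sketch; the helper name `dot` is ours, not from the text):

```python
import math

def dot(u, v):
    # Canonical inner product on R^n
    return sum(a * b for a, b in zip(u, v))

u, v = (1, 2, -1), (2, 1, 4)
ip = dot(u, v)                 # 2 + 2 - 4 = 0
len_u = math.sqrt(dot(u, u))   # sqrt(6)
len_v = math.sqrt(dot(v, v))   # sqrt(21)
```

Since the inner product vanishes, the cosine of the angle is 0 regardless of the lengths.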
Problem 7.2
What is the angle between the vectors \((1,\,2,\,4)\) and \((2,\,5,\,1)\) in \(\mathbb{R}^3?\) You may leave your answer in terms of the inverse cosine function.
Solution. Let \(\theta\) be the angle between the two vectors. \[\begin{aligned} \cos\theta &= \frac{\langle (1,\,2,\,4) \,\vert\, (2,\,5,\,1)\rangle}{\lVert (1,\,2,\,4)\rVert \lVert (2,\,5,\,1)\rVert} \\[4pt] &= \frac{16}{\sqrt{21}\sqrt{30}} = \frac{16}{3\sqrt{70}}. \end{aligned}\] Hence \[\theta = \cos^{-1} \frac{16}{3\sqrt{70}} \approx 0.8796 \text{ rad} \,(\approx 50.4 ^\circ ).\]
Problem 7.3
Find all vectors in \(\mathbb{R}^3\) which are orthogonal to both of the following vectors:
\[(1,\,2,\,0) \,\,\text{and}\,\, (1,\,0,\,1).\]
This amounts to a homogeneous system of two equations in three unknowns.
Solution. Since \[(1,\,2,\,0)\times (1,\,0,\,1) = (2,\,-1,\,-2),\] the desired vectors are \[k(2,\,-1,\,-2),\] where \(k\) is any real number.
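The cross-product shortcut can be verified directly (a sketch; `cross` and `dot` are our own helpers):

```python
def cross(u, v):
    # Cross product in R^3; the result is orthogonal to both arguments
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

n = cross((1, 2, 0), (1, 0, 1))   # (2, -1, -2)
```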
Problem 7.4
Compute the inner product \(\langle f\,\vert\, g\rangle\) in \(C^0 ( [ -\pi ,\, \pi ] )\) for the following functions:
\[f(x)=2x \,\,\,\text{and}\,\,\, g(x)=\sin x.\]
Solution. \[\langle f\,\vert\,g\rangle = \int_{-\pi}^{\pi} 2x \sin x\, dx = 4\pi .\]
Problem 7.5
In the context of the previous problem, find the length of the functions \(f\) and \(g.\) What is the angle between these functions? Interpret the Cauchy-Schwarz Inequality in this special case.
Solution. \[\begin{aligned} \langle f\,\vert\,f\rangle &= \int_{-\pi}^{\pi} 2x\cdot 2x \,dx = \frac{8}{3} \pi^3 , \\[4pt] \lVert f\rVert &= \sqrt{\frac{8}{3}\pi^3} ,\\[4pt] \langle g\,\vert\,g\rangle &= \int_{-\pi}^{\pi} \sin x \cdot \sin x \,dx = \pi ,\\[4pt] \lVert g \rVert &= \sqrt{\pi} . \end{aligned}\] Let \(\theta\) be the angle between \(f\) and \(g.\) \[\cos\theta = \frac{\langle f\,\vert\,g\rangle}{\lVert f \rVert \lVert g \rVert} = \frac{4\pi}{\sqrt{\frac{8}{3}\pi^3}\sqrt{\pi}} = \frac{\sqrt{6}}{\pi},\] so \(\theta = \cos^{-1}(\sqrt{6}/\pi).\) Moreover, \[\langle f\,\vert\,g\rangle = 4\pi \approx 12.57 \le 16.12 \approx \sqrt{\frac{8}{3}}\pi^2 = \lVert f \rVert \lVert g\rVert,\] and we see that the Cauchy-Schwarz inequality holds.
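Both integrals and the resulting angle can be checked numerically (a sketch; `inner` is our own midpoint-rule approximation, not an exact integral):

```python
import math

def inner(f, g, a=-math.pi, b=math.pi, n=100_000):
    # <f|g> on C^0([a, b]), approximated by a midpoint Riemann sum
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: 2 * x
g = math.sin
fg = inner(f, g)   # ~ 4*pi
cos_theta = fg / (math.sqrt(inner(f, f)) * math.sqrt(inner(g, g)))  # ~ sqrt(6)/pi
```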
Problem 7.6
In the inner product space \(C^0 ( [-1 ,\, +1]),\) for which \(n\) are the monomials \(x^n\) orthogonal to the constant function \(1?\)
Solution. Since \[\langle 1\,\vert\,x^n \rangle = \int_{-1}^{1} 1\cdot x^n \,dx = \frac{1+(-1)^n}{n+1},\] \(1\) and \(x^n\) are orthogonal to each other if and only if \(n\) is an odd number.
Problem 7.7
In the context of the previous problem, what is the length of each of the monomials \(x^n ?\)
Solution. \[\lVert x^n \rVert = \sqrt{\langle x^n \,\vert\, x^n \rangle}= \sqrt{\frac{2}{2n+1}}.\]
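Both facts (orthogonality to \(1\) for odd \(n,\) and the norm formula) are easy to spot-check numerically (a sketch; `integrate` is our own midpoint-rule helper, shown here for \(n=3\)):

```python
import math

def integrate(f, a, b, n=100_000):
    # Midpoint Riemann sum approximation of the definite integral
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

deg = 3                                                       # an odd exponent
ip = integrate(lambda x: x ** deg, -1, 1)                     # ~ 0: x^3 is orthogonal to 1
norm = math.sqrt(integrate(lambda x: x ** (2 * deg), -1, 1))  # ~ sqrt(2/(2*3+1))
```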
Problem 7.8
Use the Cauchy-Schwarz Inequality on the appropriate inner product space to bound the definite integral
\[\int_{0}^{\pi/2} \sqrt{x\sin x} \,dx .\]
Solution. Take \(\sqrt{x}\) and \(\sqrt{\sin x}\) in \(C^0 ( [ 0 ,\, \frac{\pi}{2} ] ) .\) Then, by Cauchy-Schwarz inequality, we have \[\begin{aligned} \int_{0}^{\pi /2} \sqrt{x}\sqrt{\sin x} \,dx &= \left\langle \sqrt{x} \,\vert\, \sqrt{\sin x}\right\rangle \\[4pt] &\le \left\lVert \sqrt{x} \right\rVert \left\lVert \sqrt{\sin x} \right\rVert \\[4pt] &= \sqrt{\int_{0}^{\pi /2} x \,dx} \cdot \sqrt{\int_0^{\pi /2} \sin x\,dx} \\[4pt] &= \sqrt{\frac{\pi^2}{8}} \cdot \sqrt{1} = \frac{\pi}{2\sqrt{2}}. \end{aligned}\] In fact, \[\begin{aligned} \int_{0}^{\pi/2} \sqrt{x\sin x}\,dx &\approx 1.10839 ,\\[4pt] \frac{\pi}{2\sqrt{2}} & \approx 1.11072 . \end{aligned}\]
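The bound and the two numerical values quoted above can be confirmed directly (a sketch; the midpoint-rule helper is our own):

```python
import math

def integrate(f, a, b, n=100_000):
    # Midpoint Riemann sum
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

lhs = integrate(lambda x: math.sqrt(x * math.sin(x)), 0, math.pi / 2)
bound = math.pi / (2 * math.sqrt(2))   # the Cauchy-Schwarz bound
```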
Problem 7.9
Prove that in an \(n\)-dimensional inner product space an orthogonal family may contain at most \(n\) vectors.
Solution. An orthogonal family (of nonzero vectors) is linearly independent: if \(\sum_j \lambda_j v_j = 0,\) then taking the inner product with \(v_k\) gives \(\lambda_k \lVert v_k \rVert^2 = 0,\) so \(\lambda_k = 0\) for every \(k.\) Since an \(n\)-dimensional space cannot contain a linearly independent family of more than \(n\) vectors, an orthogonal family may contain at most \(n\) vectors.
Problem 7.10
Carry out the Gram-Schmidt orthonormalization process on the following pair of vectors in \(\mathbb{R}^2\) to obtain an orthonormal basis:
\[v_1=(2,\,1) \,\,\text{and}\,\,v_2=(-1,\,3).\]
Solution. First, normalize \(v_1 :\) \[\begin{aligned} \lVert v_1 \rVert &= \sqrt{2^2 +1^2} = \sqrt{5}, \\[4pt] u_1 &= \frac{1}{\sqrt{5}} (2,\,1) = \left(\frac{2}{\sqrt{5}} ,\, \frac{1}{\sqrt{5}}\right). \end{aligned}\] Second, find \(u_2 :\) \[\begin{aligned} v_2 - \operatorname{proj}_{v_1} v_2 &= (-1,\,3) - \frac{(2,\,1) \cdot (-1,\,3)}{(2,\,1)\cdot (2,\,1)} (2,\,1) \\[4pt] &= (-1,\,3) - \frac{1}{5} (2,\,1) \\[4pt] &= \left( - \frac{7}{5} ,\, \frac{14}{5}\right), \\[4pt] \left\lVert\left(-\frac{7}{5} ,\, \frac{14}{5}\right)\right\rVert &= \frac{7}{\sqrt{5}} ,\\[4pt] u_2 &= \frac{\sqrt{5}}{7}\left(-\frac{7}{5} ,\, \frac{14}{5}\right) \\[4pt] &= \left(-\frac{1}{\sqrt{5}},\, \frac{2}{\sqrt{5}}\right). \end{aligned}\] Now \(u_1,\) \(u_2\) constitute an orthonormal basis for \(\mathbb{R}^2.\)
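The two steps above translate directly into code (a sketch; `normalize` and `dot` are our own names):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    # Divide a vector by its length
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

v1, v2 = (2, 1), (-1, 3)
u1 = normalize(v1)
c = dot(v1, v2) / dot(v1, v1)   # projection coefficient: 1/5
u2 = normalize(tuple(b - c * a for a, b in zip(v1, v2)))
```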
Problem 7.11
Apply the Gram-Schmidt orthonormalization process to the vectors
\[v_1=(3,\,4,\,5) \,\,\, \text{and} \,\,\, v_2=(1,\,0,\,1)\]
to obtain an orthonormal pair of vectors with the same span.
Solution. First, normalize \(v_1\): \[\begin{aligned} \lVert v_1 \rVert &= \sqrt{9+16+25} = 5\sqrt{2} ,\\[4pt] u_1 &= \frac{1}{5\sqrt{2}}v_1 = \frac{1}{5\sqrt{2}}(3,\,4,\,5)\\[4pt] &= \left( \frac{3}{5\sqrt{2}} ,\, \frac{4}{5\sqrt{2}} ,\, \frac{1}{\sqrt{2}}\right). \end{aligned}\] Next, find \(u_2\): \[\begin{aligned} v_2 - \operatorname{proj}_{v_1} v_2 &= (1,\,0,\,1) - \frac{8}{50} (3,\,4,\,5) \\[4pt] &= (1,\,0,\,1) - \left( \frac{24}{50} ,\, \frac{32}{50} ,\, \frac{40}{50}\right) \\[4pt] &= \left( \frac{13}{25} ,\, -\frac{16}{25} ,\, \frac{1}{5}\right), \\[4pt] \left\lVert \left( \frac{13}{25} ,\, -\frac{16}{25} ,\, \frac{1}{5}\right) \right\rVert &= \frac{1}{25}\sqrt{169+256+25}\\[4pt] &= \frac{\sqrt{450}}{25} = \frac{3\sqrt{2}}{5},\\[4pt] u_2 &= \frac{5}{3\sqrt{2}} \left(\frac{13}{25} ,\, -\frac{16}{25} ,\, \frac{1}{5}\right) \\[4pt] &= \left( \frac{13}{15\sqrt{2}} ,\, -\frac{16}{15\sqrt{2}} ,\, \frac{1}{3\sqrt{2}} \right). \end{aligned}\] Check: \[u_1 \cdot u_2 = \frac{39}{150} - \frac{64}{150} + \frac{25}{150} = 0.\]
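A quick check that the resulting pair really is orthonormal (values copied from the computation above):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

s = math.sqrt(2)
u1 = (3 / (5 * s), 4 / (5 * s), 1 / s)
u2 = (13 / (15 * s), -16 / (15 * s), 1 / (3 * s))
```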
Problem 7.12
Let \(V=\mathbb{R}^3\) and let \(W\) be the subspace of \(V\) spanned by the vectors \((1,\,0,\,1)\) and \((0,\,1,\,0).\) What point of \(W\) is closest to the vector \((6,\,2,\,5)?\)
Solution. Let \[\begin{aligned} w_1 &= (1,\,0,\,1), \\[4pt] w_2 &= (0,\,1,\,0), \\[4pt] W &= \operatorname{Span}(w_1 ,\,w_2 ),\\[4pt] v &= (6,\,2,\,5). \end{aligned}\] Since \(w_1 \bot w_2,\) the projection of \(v\) onto \(W\) is as follows. \[\begin{aligned} \operatorname{proj}_{w_1} v &= \frac{w_1 \cdot v}{w_1 \cdot w_1} w_1 = \frac{11}{2} w_1 = \left( \frac{11}{2} ,\, 0 ,\, \frac{11}{2}\right) ,\\[4pt] \operatorname{proj}_{w_2} v &= \frac{w_2 \cdot v}{w_2 \cdot w_2} w_2 = \frac{2}{1} w_2 = (0,\,2,\,0), \\[4pt] \operatorname{proj}_W v &= \operatorname{proj}_{w_1} v + \operatorname{proj}_{w_2} v =\left( \frac{11}{2} ,\,2,\, \frac{11}{2}\right). \end{aligned}\] Hence \((\frac{11}{2} ,\,2,\,\frac{11}{2})\) is the point of \(W\) closest to \(v.\)
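Because \(w_1 \bot w_2,\) the projection is just the sum of two line projections; the residual \(v - \operatorname{proj}_W v\) must then be orthogonal to both spanning vectors (a sketch; `proj_onto` is our helper):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_onto(w, v):
    # Projection of v onto the line spanned by w
    c = dot(w, v) / dot(w, w)
    return tuple(c * x for x in w)

w1, w2, v = (1, 0, 1), (0, 1, 0), (6, 2, 5)
p = tuple(a + b for a, b in zip(proj_onto(w1, v), proj_onto(w2, v)))
r = tuple(a - b for a, b in zip(v, p))   # residual, should lie in W-perp
```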
Problem 7.13
In \(\mathbb{R}^3,\) let \(W\) be the subspace spanned by the vectors \((1,\,1,\,2)\) and \((1,\,1,\,-1).\) What point of \(W\) is closest to the vector \((4,\,5,\,-2)?\)
Solution. Let \[\begin{aligned} w_1 &= (1,\,1,\,2),\\[4pt] w_2 &= (1,\,1,\,-1),\\[4pt] W &= \operatorname{Span}(w_1,\,w_2),\\[4pt] v &= (4,\,5,\,-2). \end{aligned}\] Since \(w_1 \bot w_2,\) the projection of \(v\) onto \(W\) is as follows. \[\begin{aligned} \operatorname{proj}_{w_1}v &= \frac{w_1 \cdot v}{w_1 \cdot w_1} w_1 = \frac{5}{6}(1,\,1,\,2) = \left( \frac{5}{6} ,\, \frac{5}{6} ,\, \frac{10}{6}\right),\\[4pt] \operatorname{proj}_{w_2}v &= \frac{w_2 \cdot v}{w_2 \cdot w_2} w_2 = \frac{11}{3} (1,\,1,\,-1) = \left(\frac{11}{3} ,\, \frac{11}{3} ,\, -\frac{11}{3}\right),\\[4pt] \operatorname{proj}_{W}v &= \operatorname{proj}_{w_1}v+\operatorname{proj}_{w_2}v \\[4pt] &= \left(\frac{27}{6} ,\, \frac{27}{6} ,\, -\frac{12}{6}\right) = \left( \frac{9}{2} ,\,\frac{9}{2} ,\, -2 \right). \end{aligned}\] Hence \((\frac{9}{2} ,\, \frac{9}{2} ,\, -2)\) is the point of \(W\) closest to \(v.\)
Problem 7.14
In \(\mathbb{R}^3,\) find the orthogonal projection of \((2,\,2,\,5)\) on the subspace spanned by the vectors \((2,\,1,\,1)\) and \((0,\,2,\,1).\)
Solution. Let \[\begin{aligned} v_1 &= (2,\,1,\,1),\\[4pt] v_2 &= (0,\,2,\,1),\\[4pt] v &= (2,\,2,\,5),\\[4pt] W &= \operatorname{Span}(v_1 ,\,v_2 ). \end{aligned}\] Since \(v_1\) and \(v_2\) are not orthogonal to each other, we have to find an orthogonal basis for \(W.\) The Gram-Schmidt orthogonalization gives us the following: \[\begin{aligned} \lVert v_1 \rVert &= \sqrt{6} ,\\[4pt] u_1 &= \frac{1}{\sqrt{6}}(2,\,1,\,1), \\[4pt] v_2 - \operatorname{proj}_{v_1}v_2 &= (0,\,2,\,1) - \left( 1,\,\frac{1}{2} ,\, \frac{1}{2}\right) = \left( -1 ,\, \frac{3}{2} ,\, \frac{1}{2}\right) ,\\[4pt] \left\lVert \left( -1 ,\, \frac{3}{2} ,\, \frac{1}{2}\right) \right\rVert &= \sqrt{1+\frac{9}{4} + \frac{1}{4}} = \sqrt{\frac{14}{4}} = \frac{\sqrt{14}}{2},\\[4pt] u_2 &= \frac{2}{\sqrt{14}}\left(-1,\,\frac{3}{2} ,\,\frac{1}{2}\right) = \left(-\frac{2}{\sqrt{14}} ,\, \frac{3}{\sqrt{14}} ,\, \frac{1}{\sqrt{14}}\right). \end{aligned}\] Hence \(u_1,\) \(u_2\) constitute an orthonormal basis for \(W.\) Now we find the projection of \(v\) onto \(W\): \[\begin{aligned} \operatorname{proj}_{u_1}v &= (u_1 \cdot v)u_1 \\[4pt] &= \frac{11}{6} (2,\,1,\,1) = \left(\frac{11}{3} ,\, \frac{11}{6} ,\, \frac{11}{6}\right) ,\\[4pt] \operatorname{proj}_{u_2}v &= (u_2 \cdot v)u_2 \\[4pt] &= 1\left(-1,\, \frac{3}{2} ,\, \frac{1}{2}\right) = \left(-1,\, \frac{3}{2} ,\, \frac{1}{2}\right),\\[4pt] \operatorname{proj}_{W}v &= \operatorname{proj}_{u_1}v + \operatorname{proj}_{u_2}v \\[4pt] &= \left(\frac{11}{3} - 1 ,\, \frac{11}{6} + \frac{3}{2} ,\, \frac{11}{6} + \frac{1}{2}\right)\\[4pt] &=\left( \frac{8}{3},\, \frac{10}{3} ,\, \frac{7}{3}\right). \end{aligned}\] Therefore \(\left( \frac{8}{3},\, \frac{10}{3} ,\, \frac{7}{3}\right)\) is the desired orthogonal projection.
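Since \(v_1\) and \(v_2\) are not orthogonal, the code analogue orthogonalizes first and then projects; the end result can be checked in a few lines (a sketch with our own helper names):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_line(w, v):
    # Projection of v onto the line spanned by w
    c = dot(w, v) / dot(w, w)
    return tuple(c * x for x in w)

v1, v2, v = (2, 1, 1), (0, 2, 1), (2, 2, 5)
w2 = tuple(b - a for a, b in zip(proj_line(v1, v2), v2))   # v2 minus its v1-component
p = tuple(a + b for a, b in zip(proj_line(v1, v), proj_line(w2, v)))
```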
Problem 7.15
Granting that the functions
\[1,\,\, \cos x,\,\, \sin x,\,\, \cos 2x ,\,\, \sin 2x ,\,\, \cdots\]
constitute an orthogonal family in \(C^0 ([-\pi ,\, +\pi ]),\) modify each function by a scalar to convert this to an orthonormal family.
Solution. Keep calm and call Calculus. \[\begin{aligned} \int_{-\pi}^{\pi} 1\,dx &= 2\pi, \\[4pt] \lVert 1 \rVert &= \sqrt{2\pi} ,\\[4pt] \int_{-\pi}^{\pi} (\cos nx)^2 \,dx &= \pi ,\\[4pt] \lVert \cos nx \rVert &= \sqrt{\pi} ,\\[4pt] \int_{-\pi}^{\pi} (\sin nx)^2 \,dx &= \pi ,\\[4pt] \lVert \sin nx \rVert &= \sqrt{\pi} ,\\[4pt] \int_{-\pi}^{\pi} \sin(mx) \cdot \cos(nx) \,dx &=0 ,\\[4pt] \int_{-\pi}^{\pi} \sin(mx) \cdot \sin(nx) \,dx &=0 ,\\[4pt] \int_{-\pi}^{\pi} \cos(mx) \cdot \cos(nx) \,dx &=0, \end{aligned}\] where \(m\) and \(n\) are distinct nonnegative integers. (The first of the three orthogonality identities in fact holds for all \(m\) and \(n,\) since its integrand is odd.) Hence the desired orthonormal family is \[\frac{1}{\sqrt{2\pi}},\,\, \frac{1}{\sqrt{\pi}}\cos x ,\,\, \frac{1}{\sqrt{\pi}}\sin x ,\,\, \frac{1}{\sqrt{\pi}}\cos 2x ,\,\, \frac{1}{\sqrt{\pi}}\sin 2x ,\,\, \cdots .\]
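A few of these integrals, spot-checked numerically (midpoint-rule sketch; `integrate` is our own helper):

```python
import math

def integrate(f, n=100_000):
    # Midpoint Riemann sum over [-pi, pi]
    a, b = -math.pi, math.pi
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

norm_one = math.sqrt(integrate(lambda x: 1.0))                   # sqrt(2*pi)
norm_cos = math.sqrt(integrate(lambda x: math.cos(2 * x) ** 2))  # sqrt(pi)
mixed = integrate(lambda x: math.sin(x) * math.cos(2 * x))       # 0 (odd integrand)
```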
Problem 7.16
Let \(f\in V = C^0 ([-\pi ,\, \pi ]).\) Give a formula for the orthogonal projection of \(f\) onto the subspace of \(V\) spanned by the \(2n+1\) functions \(1,\) \(\cos x,\) \(\sin x,\) \(\cos 2x ,\) \(\sin 2x,\) \(\cdots ,\) \(\cos nx,\) \(\sin nx.\) Expand this in terms of the appropriate definite integrals, which you need not compute. (Thus begins the development of Fourier series.)
Solution. Define \[\begin{aligned} a_0 &= \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) dx , \\[4pt] a_k &= \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\cos(kx)\,dx ,\\[4pt] b_k &= \frac{1}{\pi} \int_{-\pi}^{\pi} f(x)\sin(kx)\,dx \end{aligned}\] for \(k=1,\,2,\,\cdots,\,n.\) Then \[\begin{aligned} \operatorname{proj}_W(f) &= \frac{a_0}{2} + a_1 \cos x + b_1 \sin x + a_2 \cos 2x + b_2 \sin 2x + \cdots \\[4pt] &\quad + a_n \cos nx + b_n \sin nx . \end{aligned}\]
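For a concrete instance of these formulas, take \(f(x)=x\): integration by parts gives \(b_1 = 2\) and \(a_1 = 0,\) which a numerical sketch confirms (our own midpoint-rule helper):

```python
import math

def fourier_coeff(f, g, n=100_000):
    # (1/pi) * integral over [-pi, pi] of f(x) g(x) dx, midpoint rule
    a, b = -math.pi, math.pi
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
               for k in range(n)) * h / math.pi

f = lambda x: x
b1 = fourier_coeff(f, math.sin)   # ~ 2
a1 = fourier_coeff(f, math.cos)   # ~ 0
```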
Problem 7.17
Let \(v\) and \(w\) be vectors in the inner product space \(V,\) with \(w\) nonzero. Write down a formula for the orthogonal projection of \(v\) onto the subspace spanned by \(w.\) This is often called simply the projection of \(v\) onto \(w.\)
Solution. \[\operatorname{proj}_w v = \frac{\langle w\,\vert\,v\rangle}{\langle w\,\vert\,w \rangle} w.\]
Problem 7.18
Let \(W\) be the subspace spanned by \((1,\,1,\,1)\) in \(\mathbb{R}^3.\) Find a basis for \(W^\bot,\) the orthogonal complement of \(W.\)
Solution. We will find an orthogonal basis \(v_1,\) \(v_2,\) \(v_3\) for \(V=\mathbb{R}^3,\) with \(\operatorname{Span}(v_1)=W.\)
Since \(W=\operatorname{Span}((1,\,1,\,1)),\) let \(v_1 = (1,\,1,\,1)\) and take any \(v_2,\) \(v_3\) so that \(v_1,\) \(v_2,\) \(v_3\) constitute a linearly independent family. Take \[v_1 = (1,\,1,\,1),\,\, v_2 = (1,\,0,\,0),\,\, v_3 =(0,\,1,\,0).\] It is easily shown that \(v_1,\) \(v_2,\) \(v_3\) is linearly independent. (If this family were not linearly independent, we could replace \(v_3\) by any vector \(v_3 '\) outside \(\operatorname{Span}(v_1 ,\, v_2 ).\)) Hence \(v_1,\) \(v_2,\) \(v_3\) constitutes a basis for \(V.\)
Apply Gram-Schmidt orthogonalization: \[\begin{aligned} u_1 &= \frac{1}{\sqrt{3}}v_1 = \left(\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}}\right),\\[4pt] \operatorname{proj}_{u_1}v_2 &= \frac{1}{\sqrt{3}}\left(\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}}\right) = \left(\frac{1}{3},\, \frac{1}{3},\, \frac{1}{3}\right),\\[4pt] v_2 - \operatorname{proj}_{u_1}v_2 &= \left(\frac{2}{3} ,\, -\frac{1}{3} ,\, -\frac{1}{3}\right),\\[4pt] \left\lVert v_2 - \operatorname{proj}_{u_1}v_2\right\rVert &= \frac{\sqrt{6}}{3},\\[4pt] u_2 &= \frac{3}{\sqrt{6}}\left(\frac{2}{3},\, -\frac{1}{3} ,\, -\frac{1}{3}\right) = \left(\frac{2}{\sqrt{6}} ,\, -\frac{1}{\sqrt{6}} ,\, -\frac{1}{\sqrt{6}}\right),\\[4pt] \operatorname{proj}_{u_1} v_3 &= \frac{1}{\sqrt{3}}\left(\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}},\,\frac{1}{\sqrt{3}}\right) = \left(\frac{1}{3} ,\, \frac{1}{3} ,\, \frac{1}{3}\right),\\[4pt] \operatorname{proj}_{u_2}v_3 &= -\frac{1}{\sqrt{6}}\left(\frac{2}{\sqrt{6}} ,\, -\frac{1}{\sqrt{6}} ,\, -\frac{1}{\sqrt{6}}\right) = \left(-\frac{1}{3} ,\, \frac{1}{6} ,\, \frac{1}{6}\right) ,\\[4pt] v_3 - \operatorname{proj}_{u_1}v_3 - \operatorname{proj}_{u_2}v_3 &= (0,\,1,\,0) - \left(\frac{1}{3},\,\frac{1}{3},\,\frac{1}{3}\right) - \left(-\frac{1}{3},\,\frac{1}{6},\,\frac{1}{6}\right) \\[4pt] &= \left(0,\,\frac{1}{2},\,-\frac{1}{2}\right),\\[4pt] \left\lVert \left(0,\,\frac{1}{2},\,-\frac{1}{2}\right)\right\rVert &= \frac{1}{\sqrt{2}},\\[4pt] u_3 &= \sqrt{2}\left(0,\,\frac{1}{2},\,-\frac{1}{2}\right) = \left(0 ,\, \frac{1}{\sqrt{2}} ,\, -\frac{1}{\sqrt{2}}\right). \end{aligned}\] Since \(u_1,\) \(u_2,\) \(u_3\) is an orthonormal basis for \(V\) and \(\operatorname{Span}(u_1 ) = \operatorname{Span}(v_1 )=W,\) we have \(W^\bot = \operatorname{Span}(u_2 ,\,u_3),\) that is, \(u_2,\) \(u_3\) constitute a basis for \(W^\bot.\)
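The whole construction can be checked by running Gram-Schmidt programmatically and verifying that \(u_2\) and \(u_3\) are unit vectors orthogonal to each other and to \((1,\,1,\,1)\) (a sketch; `gram_schmidt` is our own helper):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    # Orthonormalize a linearly independent family, in order
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            c = dot(u, v)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        n = math.sqrt(dot(w, w))
        basis.append([wi / n for wi in w])
    return basis

u1, u2, u3 = gram_schmidt([(1, 1, 1), (1, 0, 0), (0, 1, 0)])
```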
Solution(Alt). Take \[v_1 = (1,\,1,\,1),\,\, v_2 =(a,\,-a,\,0),\,\, v_3 = (0,\, -b ,\,b)\] where \(a\) and \(b\) are nonzero real numbers. (The choice of such vectors \(v_2,\) \(v_3\) depends on your sense of mathematics.) It is easily shown that \(v_1,\) \(v_2,\) \(v_3\) is a linearly independent family and \[v_1 \cdot v_2 =0 ,\,\, v_1 \cdot v_3 =0,\] that is, \[v_1 \bot v_2 \,\,\text{and}\,\, v_1 \bot v_3 .\] Hence \(v_2,\) \(v_3\) constitutes a basis for \(W^\bot.\)
Problem 7.19
Let \(W\) be a finite-dimensional subspace of the inner product space \(V.\) Show that the projection map \(\operatorname{proj}_W : V \rightarrow V\) is a linear transformation. What are the kernel and image of this map?
Solution. If \(W=V,\) then \(\operatorname{proj}_W\) is the identity map; if \(W=\left\{ 0 \right\},\) then \(\operatorname{proj}_W (v)=0\) for every \(v\in V.\) In these cases, \(\operatorname{proj}_W\) is obviously a linear transformation.
We now consider the cases for \(W\ne V\) and \(W\ne\left\{ 0 \right\}.\)
Let \(u,\,v\in V\) and \(k\in K,\) where \(K\) is the field over which \(V\) is defined. Since \(V = W \oplus W^\bot,\) \(u\) and \(v\) are expressible in the unique forms \[\begin{aligned} u &= u_1 + u_2 ,\,\, u_1 = \operatorname{proj}_W u \in W ,\,\, u_2 = u-u_1 \in W^\bot ,\\[4pt] v &= v_1 + v_2 ,\,\, v_1 = \operatorname{proj}_W v \in W ,\,\, v_2 = v-v_1 \in W^\bot . \end{aligned}\] Now \[u+v = (u_1 + v_1) + (u_2 + v_2);\] since \(u_1 +v_1 \in W\) and \(u_2 + v_2 \in W^\bot,\) this is the unique decomposition of \(u+v,\) and hence \[\operatorname{proj}_W(u+v) = u_1 +v_1 = \operatorname{proj}_W u + \operatorname{proj}_W v.\] In the same manner, since \[ku = ku_1 + ku_2 ,\,\, ku_1 \in W ,\,\, ku_2 \in W^\bot,\] we have \[\operatorname{proj}_W (ku) = ku_1 = k\operatorname{proj}_W u.\] Therefore \(\operatorname{proj}_W\) is a linear transformation.
Next we find the image of \(\operatorname{proj}_W.\) For any \(v\in V,\) \(\operatorname{proj}_W v\in W,\) and we have \(\operatorname{Im}(\operatorname{proj}_W)\subseteq W.\) Conversely, for any \(w\in W,\) take \(v=w\) then \(\operatorname{proj}_W v = w,\) and we have \(W\subseteq \operatorname{Im}(\operatorname{proj}_W ).\) Hence \(\operatorname{Im}(\operatorname{proj}_W) = W.\)
In the same manner, \(\operatorname{Ker}(\operatorname{proj}_W) = W^\bot :\) writing \(v = v_1 + v_2\) with \(v_1 \in W\) and \(v_2 \in W^\bot,\) we have \(\operatorname{proj}_W v = v_1 = 0\) if and only if \(v = v_2 \in W^\bot.\)
Problem 7.20
In the context of the previous problem, what is \(\operatorname{proj}_W \circ \operatorname{proj}_W?\)
Solution. \[\operatorname{proj}_W \circ \operatorname{proj}_W = \operatorname{proj}_W ,\] since \(\operatorname{proj}_W v \in W\) for every \(v\in V\) and \(\operatorname{proj}_W w = w\) for every \(w\in W;\) a projection is idempotent.
Problem 7.21
Let \(W\) be a nontrivial, proper subspace of the finite-dimensional inner product space \(V\) and again consider the projection map \(\operatorname{proj}_W : V \rightarrow V.\) Assume that the dimension of \(W\) is \(m.\) Show that there is a basis \(B\) for \(V\) such that the matrix of \(\operatorname{proj}_W\) with respect to \(B\) takes the form
\[
\left(
\begin{array}{c|c}
I_m & 0 \\
\hline
0 & 0
\end{array}
\right)
\]
where \(I_m\) is the \(m\times m\) identity matrix, and the zeros represent zero matrices of appropriate sizes.
Solution. Let \(\dim(V) = n\) and \(\dim(W)=m.\) Since \(W\) is finite-dimensional, we can construct an orthonormal basis for \(W\): \[v_1 ,\, v_2 ,\, \cdots ,\, v_m.\] Since \(V\) is finite-dimensional, we can extend this family to an orthonormal basis \(B\) for \(V\): \[v_1 ,\, v_2 ,\, \cdots ,\, v_m ,\, \cdots ,\, v_n.\] With this basis, we have \[M_{B,B}(\operatorname{proj}_W) = \left( \begin{array}{c|c} I_m & 0 \\ \hline 0 & 0 \end{array} \right).\]
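A small numerical illustration in \(\mathbb{R}^3\) (our own example, not from the text): take \(W\) to be the line spanned by \((1,\,1,\,0)/\sqrt{2}\) and complete it to an orthonormal basis; the matrix of \(\operatorname{proj}_W\) with respect to that basis then has the stated block form with \(m=1.\)

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

s = math.sqrt(2)
# Orthonormal basis B adapted to W = span(u1); u2, u3 span W-perp
u1, u2, u3 = (1/s, 1/s, 0.0), (1/s, -1/s, 0.0), (0.0, 0.0, 1.0)
B = [u1, u2, u3]

def proj_W(v):
    c = dot(u1, v)
    return tuple(c * x for x in u1)

# Column j of the matrix holds the B-coordinates of proj_W(B[j]);
# since B is orthonormal, those coordinates are inner products with B[i]
M = [[round(dot(B[i], proj_W(B[j])), 9) for j in range(3)] for i in range(3)]
```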
Problem 7.22
Show that complex conjugation is a bijective map from \(\mathbb{C}\) to itself.
Solution. Let \(\phi : \mathbb{C} \rightarrow \mathbb{C}\) be given by \[\phi(z) = \bar{z}\] for \(z\in \mathbb{C}.\) If \(\phi(z) = \phi(w),\) then \(\bar{z} = \bar{w};\) conjugating both sides gives \(z=w,\) so \(\phi\) is one-to-one. For any \(z\in \mathbb{C},\) we have \[\phi(\bar{z}) = \bar{\bar{z}} = z,\] so \(\phi\) is onto. Hence \(\phi\) is a bijection.
Problem 7.23
Show that \(\mathbb{C}^n\) is isomorphic to \(\mathbb{R}^{2n}\) as real vector spaces.
Solution. Define \(\phi : \mathbb{C}^{n} \rightarrow \mathbb{R}^{2n}\) by \[\phi(z_1 ,\, z_2 ,\, \cdots ,\,z_n ) = (\operatorname{Re}(z_1),\, \operatorname{Im}(z_1) ,\, \operatorname{Re}(z_2),\,\operatorname{Im}(z_2),\,\cdots,\,\operatorname{Re}(z_n),\,\operatorname{Im}(z_n)).\] Keep calm and show that \(\phi\) is an isomorphism.
Problem 7.24
Find the length of the following vector in \(\mathbb{C}^2 :\)
\[(2+5i ,\, 1-4i ).\]
Remember that the canonical inner product on \(\mathbb{C}^2\) requires conjugation.
Solution. \[\begin{aligned} \langle (2+5i ,\, 1-4i) \,\vert\, (2+5i ,\,1-4i)\rangle &= (2+5i)\overline{(2+5i)} + (1-4i)\overline{(1-4i)} \\[4pt] &= (2+5i)(2-5i) + (1-4i)(1+4i) \\[4pt] &= 4 + 25 + 1 + 16 = 46 , \\[4pt] \lVert (2+5i ,\, 1-4i ) \rVert &= \sqrt{46}. \end{aligned} \]
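Python's built-in complex type makes this a one-liner to verify (a sketch; `cdot` is our helper):

```python
import math

def cdot(x, y):
    # Canonical inner product on C^n: sum of x_j * conjugate(y_j)
    return sum(a * b.conjugate() for a, b in zip(x, y))

v = (2 + 5j, 1 - 4j)
length = math.sqrt(cdot(v, v).real)   # sqrt(46)
```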
Problem 7.25
Let \(V\) be the complex vector space of continuous complex-valued functions on the interval \([-\pi ,\, \pi ].\) Consider the function \(f\in V\) defined by \(f(x)=e^{ix}.\) Find the length of \(f\) in \(V.\)
Solution. Recall that \[f(x) = e^{ix} = \cos x + i\sin x\] for \(x\in\mathbb{R}.\) Hence we have \[\begin{aligned} \langle f\,\vert\,f \rangle &= \int_{-\pi}^{\pi} f(x)\overline{f(x)}\, dx = \int_{-\pi}^{\pi} (\cos x + i \sin x)(\cos x - i\sin x)\, dx \\[4pt] &= \int_{-\pi}^{\pi} (\cos^2 x + \sin^2 x)\, dx \\[4pt] &= \int_{-\pi}^{\pi} 1\,dx = 2\pi ,\\[4pt] \lVert f \rVert &= \sqrt{2\pi}. \end{aligned}\]
Problem 7.26
In \(\mathbb{C}^n\) we defined the canonical inner product by
\[\langle \mathbf{x} \vert \mathbf{y} \rangle = \sum _ {j=1} ^n x_j \, \overline{y_j}.\]
Why not define it instead more simply as
\[\langle \mathbf{x} \vert \mathbf{y} \rangle = \sum _ {j=1} ^n x_j \,y_j\]
without conjugation of the \(y_j ?\) What essential feature would be lost?
Solution. Without conjugation, positive definiteness would be lost: \(\langle \mathbf{x} \,\vert\, \mathbf{x}\rangle\) could be non-real, or zero for a nonzero vector. For example, \(\mathbf{x} = (1,\,i) \in \mathbb{C}^2\) would give \(\langle \mathbf{x}\,\vert\,\mathbf{x}\rangle = 1 + i^2 = 0.\) Lengths would then fail to be well-defined nonnegative real numbers, and we could not measure distances between vectors.
Problem 7.27
Show that in a complex inner product space we have
\[\lVert av \rVert = \lvert a \rvert \cdot \lVert v \rVert\]
for all \(v\in V,\) \(a\in \mathbb{C}.\) (This generalizes another familiar property of real inner products and is needed to extend the proof of Proposition 7.6 to the complex case.)
Solution. \[\begin{aligned} \lVert av \rVert &= \sqrt{\langle av \,\vert\, av \rangle} \\[4pt] &= \sqrt{a\langle v\,\vert\, av \rangle} \\[4pt] &= \sqrt{a\overline{a} \langle v\,\vert\, v \rangle} \\[4pt] &= \sqrt{\lvert a \rvert^2 \langle v\,\vert\, v \rangle} \\[4pt] &= \lvert a \rvert \sqrt{ \langle v\,\vert\, v \rangle} \\[4pt] &= \lvert a \rvert \cdot \lVert v \rVert. \end{aligned}\]
Problem 7.28
Let \(V\) be a complex vector space with basis \(\left\{ v_j \,\vert\, j\in J \right\}\) where \(J\) is some index set, possibly infinite. Then show that \(V\) as a real vector space has the basis \(\left\{ v_j ,\, iv_j \,\,\vert\,\, j\in J \right\}.\) Hence if \(V\) has dimension \(n\) over \(\mathbb{C},\) it has dimension \(2n\) over \(\mathbb{R}.\)
Solution. Any \(v\in V\) is expressed in the unique form \[v = \lambda_1 v_1 + \lambda_2 v_2 + \cdots\] where the \(\lambda_j\) are complex numbers, all but finitely many of which are zero. Each \(\lambda_j\) is expressed in the unique form \[\lambda_j = a_j + ib_j\] where \(a_j\) and \(b_j\) are real numbers. Hence \[\begin{aligned} v &= (a_1 + ib_1)v_1 + (a_2 + ib_2)v_2 + \cdots \\[4pt] &= a_1 v_1 + b_1 iv_1 + a_2 v_2 + b_2 iv_2 + \cdots . \end{aligned}\] This expression is unique; hence \(\left\{v_j ,\, iv_j \,\vert\,j\in J\right\}\) constitutes a basis for \(V\) over \(\mathbb{R}.\)
Problem 7.29
Let \(V\) be a complex inner product space and let \(v\) be any nonzero vector in \(V.\) Show that \(\langle v \,\vert\, iv \rangle \ne 0\) but nevertheless the angle between \(v\) and \(iv\) is \(\pi /2.\) Hence in a complex inner product space, vectors forming a right angle need not be formally orthogonal. How, then, can one reconcile these notions? See the following problem.
Solution. Let \(\theta\) be the angle between \(v\) and \(iv.\) \[\begin{gathered} \langle v\,\vert\, iv \rangle = -i \langle v\,\vert\, v \rangle = -i \lVert v \rVert^2 \ne 0,\\[4pt] \cos\theta = \frac{\operatorname{Re}(\langle v\,\vert\, iv\rangle)}{\lVert v \rVert \lVert iv \rVert} = \frac{0}{\lVert v \rVert \lVert iv \rVert} =0. \end{gathered}\] Hence \(\theta = \pi /2\) even though \(\langle v\,\vert\, iv \rangle \ne 0.\)
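A concrete example (our own choice of \(v\)) showing that \(\langle v\,\vert\,iv\rangle\) is nonzero yet purely imaginary, so the real part that defines the angle vanishes:

```python
def cdot(x, y):
    # Canonical inner product on C^n
    return sum(a * b.conjugate() for a, b in zip(x, y))

v = (1 + 2j, 3 - 1j)                     # ||v||^2 = 1 + 4 + 9 + 1 = 15
ip = cdot(v, tuple(1j * z for z in v))   # <v | iv> = -i * ||v||^2
```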
Problem 7.30
Show that for two complex numbers \(z\) and \(w,\) the product \(z \overline{w}\) is purely imaginary if and only if \(z\) and \(w\) are orthogonal as points of \(\mathbb{R}^2.\) This at least reconciles perpendicularity and orthogonality in \(\mathbb{C}.\) Now generalize this to higher dimensions.
Solution. Let \(z=a+bi,\) \(w=c+di\) where \(a,\) \(b,\) \(c,\) \(d\) are real numbers. First, observe that \[z\overline{w} = (a+bi)(c-di) = (ac+bd) + (bc-ad)i,\] hence \(z\overline{w}\) is purely imaginary if and only if \(ac+bd=0.\)
Next, in \(\mathbb{R}^2,\) since \([z]=(a,\,b)\) and \([w]=(c,\,d),\) we have \[\langle z\,\vert\,w\rangle = \langle (a,\,b) \,\vert\,(c,\,d)\rangle = ac+bd.\] Hence \(z\bot w\) if and only if \(ac+bd=0.\)
Therefore \(z\bot w\) in \(\mathbb{R}^2\) if and only if \(z\overline{w}\) is purely imaginary.
We now generalize this to higher dimensions, considering only the finite-dimensional case. Let \(z,\,w\in \mathbb{C}^n ;\) then \(z\) and \(w\) are expressible as \[\begin{aligned} z &= (z_1 ,\, z_2 ,\, \cdots ,\, z_n ),\\[4pt] w &= (w_1 ,\, w_2 ,\, \cdots ,\, w_n ),\\[4pt] z_j &= a_j + ib_j ,\\[4pt] w_j &= c_j + id_j \end{aligned}\] where \(a_j,\) \(b_j,\) \(c_j,\) \(d_j\) are real numbers.
First, observe that \[\begin{aligned} \langle z\,\vert\, w\rangle &= z_1 \overline{w_1} + \cdots + z_n \overline{w_n} \\[4pt] &= (a_1 + ib_1)(c_1 - id_1) + \cdots + (a_n +ib_n)(c_n-id_n) \\[4pt] &= [ (a_1 c_1 + b_1 d_1) + \cdots + (a_n c_n + b_n d_n )] + [ (b_1 c_1 - a_1 d_1 ) + \cdots + (b_n c_n -a_n d_n)]i. \end{aligned}\] Hence \(\langle z\,\vert\,w\rangle\) is purely imaginary if and only if \[(a_1 c_1 + b_1 d_1) + \cdots + (a_n c_n + b_n d_n )=0.\] Next, in \(\mathbb{R}^{2n},\) let \[\begin{aligned} \mathbf{x} &= (a_1 ,\, b_1 ,\, a_2 ,\, b_2 ,\, \cdots ,\, a_n ,\, b_n ),\\[4pt] \mathbf{y} &= (c_1 ,\, d_1 ,\, c_2 ,\, d_2 ,\, \cdots ,\, c_n ,\, d_n ). \end{aligned}\] Then \[\mathbf{x}\cdot\mathbf{y} = (a_1 c_1 + b_ 1 d_1) + \cdots + (a_n c_n + b_n d_n ).\] Hence \(\mathbf{x} \bot \mathbf{y}\) if and only if \[(a_1 c_1 + b_1 d_1) + \cdots + (a_n c_n + b_n d_n )=0.\] Since \([z] = \mathbf{x}\) and \([w] = \mathbf{y}\) under the identification \(\mathbb{C}^n \cong \mathbb{R}^{2n},\) we conclude that \(\langle z\,\vert\,w\rangle\) is purely imaginary if and only if \([z] \bot [w]\) in \(\mathbb{R}^{2n},\) which is the desired generalization.
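The identity underlying this argument, \(\operatorname{Re}\langle z\,\vert\,w\rangle = [z]\cdot[w],\) is easy to test on a concrete pair (our own example vectors):

```python
def cdot(x, y):
    # Canonical inner product on C^n
    return sum(a * b.conjugate() for a, b in zip(x, y))

def realify(z):
    # C^n -> R^2n: (a1 + i*b1, ...) -> (a1, b1, a2, b2, ...)
    out = []
    for w in z:
        out.extend([w.real, w.imag])
    return out

def rdot(x, y):
    return sum(a * b for a, b in zip(x, y))

z = (1 + 2j, 3 - 1j)
w = (2 - 1j, 1j)
re_part = cdot(z, w).real                  # Re<z|w>
r2n_dot = rdot(realify(z), realify(w))     # dot product in R^4
```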