# Topics in Algebra, Chapter 4.2

### Topics covered: 4.2

Throughout, $V$ is a vector space over a field $F$.

• Definition: Let $v_1, \dots, v_n \in V$ and $\alpha_1, \dots, \alpha_n \in F$. Any element of the form $\sum_i \alpha_i v_i$ is a linear combination over $F$ of the $\{v_i\}$.

• Definition: If $S \subset V$ is a subset of $V$, then $\mathrm{span}(S)$ is the set of all linear combinations of finite sets of elements of $S$.
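Over $F = \mathbb{R}$, asking whether a vector lies in a span is a finite computation: $v \in \mathrm{span}(S)$ exactly when appending $v$ to a matrix whose rows are the elements of $S$ does not increase the rank. A minimal numerical sketch (the helper name `in_span` is mine, not Herstein's):

```python
import numpy as np

def in_span(S, v, tol=1e-10):
    """Return True iff v is a real linear combination of the rows of S."""
    A = np.asarray(S, dtype=float)
    v = np.asarray(v, dtype=float)
    # v lies in span(S) iff stacking it onto S leaves the rank unchanged
    return np.linalg.matrix_rank(np.vstack([A, v]), tol=tol) == \
           np.linalg.matrix_rank(A, tol=tol)

S = [(1, 0, 0), (0, 1, 0)]
print(in_span(S, (2, -3, 0)))  # True: 2*(1,0,0) - 3*(0,1,0)
print(in_span(S, (0, 0, 1)))   # False: third coordinate is unreachable
```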

• Lemma 4.2.1: If $S \subset V$, then $\mathrm{span}(S)$ is a subspace of $V$.

• Lemma 4.2.2: If $S, T \subset V$, then

1. $S \subset T$ implies $\mathrm{span}(S) \subset \mathrm{span}(T)$.
2. $\mathrm{span}(S \cup T) = \mathrm{span}(S) + \mathrm{span}(T)$.
3. $\mathrm{span}(\mathrm{span}(S)) = \mathrm{span}(S)$.

• Definition: $V$ is finite-dimensional if there exists a finite subset $S \subset V$ such that $V = \mathrm{span}(S)$.

• Definition: The elements of the set $\{v_1, \dots, v_n\} \subset V$ are linearly dependent over $F$ if there exist $\alpha_1, \dots, \alpha_n \in F$, not all zero, such that $\sum_i \alpha_i v_i = 0$. Otherwise, the elements of the set are linearly independent.
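Over $\mathbb{R}$ this definition can be tested mechanically: the vectors are independent exactly when the matrix having them as rows has rank equal to the number of vectors. A small sketch (real coefficients only; floating-point rank is an assumption of this illustration, not part of the definition):

```python
import numpy as np

def independent(vectors, tol=1e-10):
    """True iff no nontrivial real linear combination of the vectors is zero."""
    A = np.asarray(vectors, dtype=float)
    return np.linalg.matrix_rank(A, tol=tol) == len(vectors)

print(independent([(1, 1, 0), (0, 1, 1)]))  # True
print(independent([(1, 1, 0), (2, 2, 0)]))  # False: second = 2 * first
```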

• Lemma 4.2.3: If $v_1, \dots, v_n \in V$ are linearly independent, then any element $u$ in their span has exactly one representation $u = \sum_i \alpha_i v_i$ with $\alpha_i \in F$.

• Theorem 4.2.1: If $v_1, \dots, v_n \in V$, then either (1) they are linearly independent or (2) some $v_k$ is a linear combination of the preceding $v_1, \dots, v_{k-1}$.

• Corollary 1: Let $S = \{v_1, \dots, v_n\} \subset V$ and $W = \mathrm{span}(S)$. If $v_1, \dots, v_k$ are linearly independent, then we can construct a linearly independent subset $\{v_1, \dots, v_k, v_{i_1}, \dots, v_{i_r}\} \subset S$ whose span is also $W$. In other words, the generating set of $W$ can be pared down to a linearly independent generating set.

• Corollary 2: If $V$ is finite-dimensional, then it contains a finite set $S = \{v_1, \dots, v_n\}$ of linearly independent elements with $V = \mathrm{span}(S)$.

• Definition: $S \subset V$ is a basis of $V$ if the elements of $S$ are linearly independent and $V = \mathrm{span}(S)$. The definition of basis that I give is the classic one (a linearly independent spanning set); Herstein’s definition appears to allow for sets which merely contain a conventional basis. I may just be misreading the text.

• Corollary 3: If $S = \{v_1, \dots, v_n\} \subset V$ and $V = \mathrm{span}(S)$, then $S$ contains a basis of $V$. In particular, $V$ (which is finite-dimensional) has a basis with finitely many elements.

• Lemma 4.2.4: If $v_1, \dots, v_n \in V$ are a basis of $V$ and if $w_1, \dots, w_m \in V$ are linearly independent, then $m \le n$. The proof of this result is subtle and, in my view, inelegant. Is there a nicer proof?

• Corollary 1: If $V$ is finite-dimensional, then any two bases of $V$ have the same number of elements.

• Corollary 2: $F^n \cong F^m$ if and only if $m = n$. This relies on an exercise showing that an isomorphism maps one basis onto another.

• Corollary 3: If $V$ is finite-dimensional, then $V \cong F^n$ for a unique $n \in \mathbb{N}$. Any basis of $V$ has $n$ elements.

• Definition: The integer $n$ in Corollary 3 is the dimension of $V$ over $F$, denoted $n = \dim V$.
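For $V \subset \mathbb{R}^n$, $\dim V$ is the rank of any matrix whose rows span $V$: paring a spanning set down to a basis (Corollary 1 above) leaves exactly $\operatorname{rank}$ many vectors. A numerical sketch, assuming real entries:

```python
import numpy as np

def dim_span(vectors):
    """Dimension of the subspace of R^n spanned by the given vectors."""
    return int(np.linalg.matrix_rank(np.asarray(vectors, dtype=float)))

# Three generators, but the third is the sum of the first two,
# so the spanned subspace has dimension 2, not 3:
print(dim_span([(1, 0, 0), (0, 1, 0), (1, 1, 0)]))  # 2
```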

• Corollary 4: Any two finite-dimensional vector spaces $V$ and $W$ over $F$ of the same dimension are isomorphic.

• Lemma 4.2.5: If $V$ is finite-dimensional and $\{v_1, \dots, v_n\} \subset V$ are linearly independent, then we can find vectors $v_{n+1}, \dots, v_{n+r} \in V$ such that $\{v_1, \dots, v_n, v_{n+1}, \dots, v_{n+r}\}$ is a basis of $V$. That is, any linearly independent set can be extended to a basis of the space.
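The proof of Lemma 4.2.5 is effectively an algorithm: run through any known basis (say the standard one for $\mathbb{R}^n$) and keep each vector that is independent of what has been collected so far. A greedy sketch of that extension over $\mathbb{R}^n$ (the function name `extend_to_basis` is my own):

```python
import numpy as np

def extend_to_basis(indep, n):
    """Extend a linearly independent list of vectors in R^n to a basis of R^n."""
    basis = [np.asarray(v, dtype=float) for v in indep]
    for e in np.eye(n):                # candidate standard basis vectors
        trial = basis + [e]
        if np.linalg.matrix_rank(np.vstack(trial)) == len(trial):
            basis = trial              # e is independent of the current list
        if len(basis) == n:
            break
    return basis

B = extend_to_basis([(1, 1, 0)], 3)
print(len(B))  # 3
```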

• Lemma 4.2.6: If $V$ is finite-dimensional and if $W \subset V$ is a subspace, then $W$ is finite-dimensional, $\dim W \le \dim V$, and $\dim V/W = \dim V - \dim W$.

• Corollary: If $A, B$ are finite-dimensional subspaces of $V$ (not necessarily finite-dimensional), then $A + B$ is finite-dimensional and $\dim(A + B) = \dim A + \dim B - \dim(A \cap B)$.
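The formula can be checked on a concrete example where the intersection is known by construction: with $A = \mathrm{span}(e_1, e_2)$ and $B = \mathrm{span}(e_2, e_3)$ in $\mathbb{R}^4$, $A \cap B = \mathrm{span}(e_2)$ has dimension $1$, while $\dim(A+B)$ is the rank of the stacked generators. A NumPy sketch:

```python
import numpy as np

e1, e2, e3 = np.eye(4)[0], np.eye(4)[1], np.eye(4)[2]
A = np.vstack([e1, e2])   # dim A = 2
B = np.vstack([e2, e3])   # dim B = 2, and A ∩ B = span(e2) has dim 1

dim_A = np.linalg.matrix_rank(A)
dim_B = np.linalg.matrix_rank(B)
dim_sum = np.linalg.matrix_rank(np.vstack([A, B]))  # dim(A + B)

print(dim_sum == dim_A + dim_B - 1)  # True: 3 = 2 + 2 - 1
```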

The problems below are paraphrased from/inspired by those given in Topics in Algebra by Herstein. The solutions are my own unless otherwise noted. I will generally try, in my solutions, to stick to the development in the text. This means that problems will not be solved using ideas and theorems presented further on in the book.

### Herstein 4.2.1

#### Let $S, T \subset V$. (a) Show that $S \subset T$ implies $\mathrm{span}(S) \subset \mathrm{span}(T)$. (b) Show that $\mathrm{span}(S \cup T) = \mathrm{span}(S) + \mathrm{span}(T)$. (c) Show that $\mathrm{span}(\mathrm{span}(S)) = \mathrm{span}(S)$.

(a) A linear combination in $\mathrm{span}(S)$ is of the form $s = \sum_i \alpha_i s_i$ where $\alpha_i \in F$ and $s_i \in S$. Because each $s_i \in T$ as well, we have $s \in \mathrm{span}(T)$ and hence $\mathrm{span}(S) \subset \mathrm{span}(T)$.

(b) An element $u \in \mathrm{span}(S \cup T)$ is of the form $u = \sum_{i=1}^{n} \alpha_i u_i$ where each $u_i \in S$ or $u_i \in T$. Let $\mathcal{A}_S$ be the set of indices $i$ such that $u_i \in S$, and let $\mathcal{A}_T = \{1, \dots, n\} \setminus \mathcal{A}_S$ be the leftovers. Then we have $u = \sum_{j \in \mathcal{A}_S} \alpha_j u_j + \sum_{j \in \mathcal{A}_T} \alpha_j u_j$.

The first sum belongs to $\mathrm{span}(S)$ and the second to $\mathrm{span}(T)$. Hence $\mathrm{span}(S \cup T) \subset \mathrm{span}(S) + \mathrm{span}(T)$.

The opposite inclusion follows by reversing the argument: a sum of an element of $\mathrm{span}(S)$ and an element of $\mathrm{span}(T)$ is itself a linear combination of elements of $S \cup T$.

(c) By definition, $\mathrm{span}(S) \subset \mathrm{span}(\mathrm{span}(S))$.

Elements of $\mathrm{span}(S)$ are of the form $\sum_i \alpha_i s_i$. An arbitrary element $z \in \mathrm{span}(\mathrm{span}(S))$ is a linear combination of such elements, i.e. of the form $z = \sum_{i=1}^{n} \beta_i \sum_{j=1}^{m_i} \alpha_{ij} s_{ij}$

with $\beta_i, \alpha_{ij} \in F$ and $s_{ij} \in S$. Of course, this may be rewritten as $z = \sum_{i=1}^{n} \sum_{j=1}^{m_i} (\beta_i \alpha_{ij}) s_{ij}$,

from which we see that $z \in \mathrm{span}(S)$. Hence $\mathrm{span}(\mathrm{span}(S)) \subset \mathrm{span}(S)$ and the result is proven.

### Herstein 4.2.2

#### (a) Show that the vectors $(1,1,0,0)$, $(0,1,-1,0)$, $(0,0,0,3)$ in $F^4$ are linearly independent when $F$ is the field of real numbers. (b) What conditions on the characteristic of $F$ would make the vectors linearly dependent?

(a) A linear combination of these vectors is of the form $a(1,1,0,0) + b(0,1,-1,0) + c(0,0,0,3) = (a, a+b, -b, 3c)$

with $a, b, c \in \mathbb{R}$. If this is to equal the zero vector, then the first, third and fourth components fix each of $a, b, c$ to be zero. Therefore, the set is linearly independent.

(b) If the field is of characteristic $3$, then $(0,0,0,3)$ is the zero vector, and the set of vectors is linearly dependent.

Otherwise, $3c = 0$ implies $c = 0$ because fields have no zero divisors. Then $(a, a+b, -b, 3c) = 0$ forces all three coefficients to be zero, and the vectors are linearly independent.
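Since $\mathrm{GF}(p)$ is finite, the characteristic-$p$ case can be checked by brute force: search all coefficient triples mod $p$ for a nontrivial combination that vanishes. A sketch (the helper name `dependent_mod` is mine):

```python
from itertools import product

vectors = [(1, 1, 0, 0), (0, 1, -1, 0), (0, 0, 0, 3)]

def dependent_mod(p):
    """True iff some nontrivial combination of the vectors is zero over GF(p)."""
    for coeffs in product(range(p), repeat=3):
        if any(coeffs):
            combo = [sum(c * v[k] for c, v in zip(coeffs, vectors)) % p
                     for k in range(4)]
            if all(x == 0 for x in combo):
                return True
    return False

print(dependent_mod(3))  # True: (0,0,0,3) is the zero vector mod 3
print(dependent_mod(5))  # False: still independent in characteristic 5
```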