Topics in Algebra, Chapter 4.1

2013-09-23 math algebra topics-in-algebra

This page covers section 4.1 (“Elementary Basic Concepts” [of vector spaces and modules]).

Topics covered: 4.1

  • Definition: Let V be a non-empty set, let F be a field, and let + : V×V → V and ⋅ : F×V → V be binary operations such that (V, +) is an abelian group and

    1. α(v+w) = αv + αw for all α ∈ F and v, w ∈ V.
    2. (α+β)v = αv + βv for all α, β ∈ F and v ∈ V.
    3. α(βv) = (αβ)v for all α, β ∈ F and v ∈ V.
    4. 1v = v for all v ∈ V, where 1 is the multiplicative unit of F.

    Then V is said to be a vector space over F. The dot for multiplication will generally be omitted in what follows.

  • Example: If F ⊆ K are both fields, then K may be viewed as a vector space over F.

  • Example: If F is a field, then Fⁿ = {(α₁, …, αₙ) ∣ αᵢ ∈ F}, with the obvious componentwise operations, is a vector space over F. (A concrete sketch of this example appears just after this list.)

  • Example: If F is a field, then F[x] is a vector space over F.

  • Example: If F is a field, then Pₙ(F) ⊆ F[x] is a vector space over F, where Pₙ(F) is the set of polynomials over F of degree less than n.

  • Definition: If V is a vector space over F and W ⊆ V forms a vector space under the same operations as V, then W is a subspace of V. This is equivalent to the condition that αw + α′w′ ∈ W for all w, w′ ∈ W and α, α′ ∈ F.

  • Definition: Let V, W be vector spaces over F. A homomorphism of vector spaces is a map ϕ : W → V such that ϕ(w + w′) = ϕ(w) + ϕ(w′) and ϕ(αw) = αϕ(w) for all w, w′ ∈ W and α ∈ F.

  • The set of all homomorphisms from a vector space V to a vector space W will be denoted Hom(V, W).

  • Lemma 4.1.1: Let V be a vector space over F. Then

    1. α0_V = 0_V for all α ∈ F.
    2. 0_F v = 0_V for all v ∈ V.
    3. (−α)v = −(αv) for all α ∈ F, v ∈ V.
    4. αv = 0_V implies α = 0_F or v = 0_V.
  • Lemma 4.1.2: Let V be a vector space over F and let W ⊆ V be a subspace. Then V/W = {v + W ∣ v ∈ V} is a vector space over F, called the quotient space of V by W.

  • Theorem 4.1.1: Let V, W be vector spaces and let ϕ : V → W be a surjective homomorphism with kernel K. Then W ≅ V/K. Conversely, if V is a vector space and W ⊆ V a subspace, then there exists a surjective homomorphism ψ : V → V/W with kernel W.

  • Definition: Let V be a vector space over F and let W₁, …, Wₙ ⊆ V be subspaces. If every v ∈ V admits a unique representation v = w₁ + ⋯ + wₙ with wᵢ ∈ Wᵢ for each i, then V is the internal direct sum of the {Wᵢ}.

  • Definition: Let V₁, …, Vₙ be vector spaces over F. The external direct sum of the {Vᵢ} is the set {(v₁, …, vₙ) ∣ vᵢ ∈ Vᵢ}, with componentwise operations.

  • Theorem 4.1.2: If V is the internal direct sum of subspaces V₁, …, Vₙ, then V is isomorphic to the external direct sum of V₁, …, Vₙ. Hence we may speak simply of the direct sum, which has both of the above descriptions.
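
As a concrete illustration of the Fⁿ example above, here is a minimal Python sketch of F³ with F taken to be ℚ; the helper names add and scale and the choice of field are assumptions made for illustration, not anything from the text.

```python
from fractions import Fraction as Q  # model the field F by the rationals

# Componentwise operations on F^3, matching the F^n example above.
def add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def scale(alpha, v):
    return tuple(alpha * a for a in v)

v = (Q(1), Q(2), Q(3))
w = (Q(0), Q(-1), Q(5))
alpha, beta = Q(2), Q(1, 3)

# Spot-check the four scalar-multiplication axioms listed above.
assert scale(alpha, add(v, w)) == add(scale(alpha, v), scale(alpha, w))  # axiom 1
assert scale(alpha + beta, v) == add(scale(alpha, v), scale(beta, v))    # axiom 2
assert scale(alpha, scale(beta, v)) == scale(alpha * beta, v)            # axiom 3
assert scale(Q(1), v) == v                                               # axiom 4
```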

The problems below are paraphrased from/inspired by those given in Topics in Algebra by Herstein. The solutions are my own unless otherwise noted. I will generally try, in my solutions, to stick to the development in the text. This means that problems will not be solved using ideas and theorems presented further on in the book.


Herstein 4.1.1

Let V be a vector space over the field F. Further, let α ∈ F and v, w ∈ V. Show that α(v − w) = αv − αw in V.

We have α(v − w) = α(v + (−w)) = αv + α(−w) = αv − αw

by Lemma 4.1.1.


Herstein 4.1.2

Let F be a field and n a positive integer. Let V = Fⁿ and let W ⊆ F[x] be the vector space of polynomials over F of degree less than n. Prove that V ≅ W.

The map ϕ : V → W given by ϕ((α₀, …, αₙ₋₁)) = αₙ₋₁xⁿ⁻¹ + ⋯ + α₁x + α₀ is an isomorphism.
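
For concreteness, here is a small Python sketch of this map, with F taken to be ℚ and a polynomial of degree less than n encoded by its coefficient list; the encoding and helper names are assumptions made for this sketch, not Herstein's notation.

```python
from fractions import Fraction as Q

# Encode an element of W (polynomials over F = Q of degree < n) by its list of
# coefficients [a_0, a_1, ..., a_{n-1}], lowest degree first.
def phi(v):
    """phi: F^n -> W, sending (a_0, ..., a_{n-1}) to a_0 + a_1 x + ... + a_{n-1} x^(n-1)."""
    return list(v)  # under this encoding, phi is simply a relabelling

def vec_add(v, w):
    return tuple(a + b for a, b in zip(v, w))

def poly_add(p, q):
    return [a + b for a, b in zip(p, q)]

v, w, alpha = (Q(1), Q(0), Q(4)), (Q(2), Q(-1), Q(7)), Q(3)
assert phi(vec_add(v, w)) == poly_add(phi(v), phi(w))                   # phi respects addition
assert phi(tuple(alpha * a for a in v)) == [alpha * a for a in phi(v)]  # and scalar multiples
```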


Herstein 4.1.3

Prove that the kernel of a homomorphism is a subspace.

For ϕ : V → W a homomorphism between vector spaces V, W, ker ϕ = {v ∈ V ∣ ϕ(v) = 0}. Let v, v′ ∈ ker ϕ and α ∈ F, with F the base field. We have ϕ(v + αv′) = ϕ(v) + αϕ(v′) = 0 because ϕ is a homomorphism, and so v + αv′ ∈ ker ϕ. Thus ker ϕ is a subspace of V.


Herstein 4.1.4

(a) Show that the set V of continuous functions [0, 1] → ℝ is a vector space over ℝ.

(b) For a positive integer n, show that the set of functions [0, 1] → ℝ whose first n derivatives exist forms a subspace of the vector space from (a).

(a) Let f, g ∈ V and α ∈ ℝ. We have f + αg ∈ V because sums and scalar multiples of continuous functions are again continuous. The function 0 : x ↦ 0 is the additive identity in this vector space. The remaining axioms can be checked pointwise and follow from the field properties of ℝ.

(b) The set of n-times differentiable functions is a subset of the continuous functions, so it is only necessary to check that the set is closed under linear combinations. Indeed, sums and scalar multiples of n-times differentiable functions are again n-times differentiable, so the set in question is a subspace of V.


Herstein 4.1.5

(a) Let V = {(a₁, a₂, …) ∣ aᵢ ∈ ℝ}, with all operations defined componentwise. Show that V is a vector space over ℝ.

(b) Let W = {(a₁, …, aₙ, …) ∈ V ∣ aₙ → 0 as n → ∞}. Prove that W is a subspace of V.

(c)* Let U = {(a₁, …, aₙ, …) ∈ V ∣ ∑ᵢ aᵢ² < ∞}. Prove that U is a subspace of V and that U is contained in W.

(a) Because ℝ is closed under addition and multiplication, and the operations on V are defined componentwise, each of the vector space axioms for V follows componentwise from the corresponding property of ℝ.

(b) If (aᵢ) and (bᵢ) are two elements of W and α ∈ ℝ, then (aᵢ + αbᵢ) ∈ W because limᵢ(aᵢ + αbᵢ) = limᵢ aᵢ + α limᵢ bᵢ = 0.

(c) Let (aᵢ), (bᵢ) ∈ U and let α ∈ ℝ. We have that ∑ᵢ(aᵢ + αbᵢ)² = ∑ᵢ aᵢ² + α² ∑ᵢ bᵢ² + 2α ∑ᵢ aᵢbᵢ.

The first two terms are finite by assumption. The third term can be bounded: for real numbers x, y, rearranging (|x| − |y|)² ≥ 0 gives |xy| ≤ ½x² + ½y²,

so that ∑ᵢ |aᵢbᵢ| ≤ ½∑ᵢ aᵢ² + ½∑ᵢ bᵢ² < ∞, and in particular ∑ᵢ aᵢbᵢ converges.

Hence (aᵢ + αbᵢ) ∈ U and U is a subspace of V.

To show that U is contained in W, we must show that ∑ᵢ aᵢ² < ∞ implies limᵢ aᵢ = 0. Define the partial sums sₙ = a₁² + ⋯ + aₙ²; we have that limₙ sₙ = limₙ sₙ₋₁ = L for some L < ∞. Therefore, 0 = limₙ(sₙ − sₙ₋₁) = limₙ aₙ².

By the definition of this limit, for any ϵ > 0 there exists N ∈ ℕ such that n > N implies aₙ² < ϵ. Let ϵ > 0 be given: applying this with ϵ², there exists N ∈ ℕ so that n > N implies aₙ² < ϵ², and thus |aₙ| < ϵ. This proves that limₙ aₙ = 0, i.e. that (aᵢ) ∈ W.


Herstein 4.1.6

Let U, V be vector spaces over the field F. Define operations on Hom(U, V) to make it into a vector space over F.

Hom(U, V) is the set of homomorphisms U → V. Given ϕ, ψ ∈ Hom(U, V) and α ∈ F, we can define a third homomorphism pointwise, i.e. by (ϕ + αψ)(u) = ϕ(u) + αψ(u).

It is straightforward to see that ϕ+αψ is again a homomorphism. Hom(U,V) is a vector space under this pointwise addition and scalar multiplication.
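
A minimal sketch of these pointwise operations, assuming for illustration that U = V = ℝ² and that homomorphisms are represented as plain Python functions (the helper name hom_combine is likewise an assumption of the sketch):

```python
# Pointwise operations on Hom(U, V), sketched with U = V = R^2 and homomorphisms
# represented as plain Python functions.
def phi(u):
    x, y = u
    return (2 * x, x + y)

def psi(u):
    x, y = u
    return (y, -x)

def hom_combine(f, g, alpha):
    """Return the homomorphism f + alpha*g, defined pointwise as above."""
    return lambda u: tuple(a + alpha * b for a, b in zip(f(u), g(u)))

chi = hom_combine(phi, psi, 3.0)
print(chi((1.0, 2.0)))  # phi((1, 2)) + 3*psi((1, 2)) = (2, 3) + 3*(2, -1) = (8.0, 0.0)
```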


Herstein 4.1.7*

With F a field, prove that Hom(Fⁿ, Fᵐ) is isomorphic to Fᵐⁿ as vector spaces.

Let eᵢ⁽ⁿ⁾ ∈ Fⁿ be the vector with a 1 in the i-th index and zeroes elsewhere, and let eⱼ⁽ᵐ⁾ ∈ Fᵐ be the analogous vector in Fᵐ. Given ϕ ∈ Hom(Fⁿ, Fᵐ), we have ϕ(eᵢ⁽ⁿ⁾) = ∑ⱼ αᵢⱼ(ϕ) eⱼ⁽ᵐ⁾, defining a matrix of coefficients αᵢⱼ(ϕ) ∈ F for each ϕ. Now define f : Hom(Fⁿ, Fᵐ) → Fᵐⁿ by f(ϕ) = (α₁₁(ϕ), …, α₁ₘ(ϕ), α₂₁(ϕ), …, αₙₘ(ϕ)).

That f respects linear combinations is a rote computation. The kernel of f is trivial, so f is injective; it is also surjective, because any choice of coefficients αᵢⱼ ∈ F defines a homomorphism by ϕ(eᵢ⁽ⁿ⁾) = ∑ⱼ αᵢⱼ eⱼ⁽ᵐ⁾, extended linearly. Hence f is an isomorphism.
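
The correspondence can also be sketched numerically; the snippet below assumes F = ℝ and lets NumPy arrays stand in for the coefficient arrays αᵢⱼ(ϕ), an illustrative choice rather than part of the exercise.

```python
import numpy as np

# A linear map phi: F^n -> F^m is pinned down by its values on the basis vectors,
# i.e. by an m-by-n array of coefficients, which flattens to an element of F^(mn).
n, m = 3, 2
rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(m, n)).astype(float)  # the coefficients alpha_ij of phi
B = rng.integers(-5, 5, size=(m, n)).astype(float)  # the coefficients of a second map psi

phi = lambda x: A @ x
psi = lambda x: B @ x
f_of_phi = A.flatten()  # the image of phi under f, an element of F^(mn)

# f respects linear combinations: the coefficient array of phi + c*psi is A + c*B.
c = 2.0
x = rng.standard_normal(n)
assert np.allclose(phi(x) + c * psi(x), (A + c * B) @ x)
```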


Herstein 4.1.8

Let F be a field and n > m be positive integers. Exhibit a surjective homomorphism Fⁿ → Fᵐ and show that its kernel is isomorphic to Fⁿ⁻ᵐ.

Define ϕ : Fⁿ → Fᵐ by (a₁, …, aₘ, aₘ₊₁, …, aₙ) ↦ (a₁, …, aₘ). This is a surjective homomorphism. The kernel of ϕ is the set {(0, …, 0, aₘ₊₁, …, aₙ) ∈ Fⁿ}; a similar projection mapping establishes the isomorphism ker ϕ ≅ Fⁿ⁻ᵐ.
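
A small sketch of this projection and its kernel, with F taken to be ℚ and the particular values n = 5, m = 3 chosen only for illustration:

```python
from fractions import Fraction as Q

n, m = 5, 3  # illustrative choices with n > m

def proj(a):
    """The surjective homomorphism F^n -> F^m from above: keep the first m entries."""
    return a[:m]

def kernel_iso(a):
    """ker(proj) -> F^(n-m): drop the leading zeroes and keep the last n - m entries."""
    assert proj(a) == (Q(0),) * m
    return a[m:]

a = (Q(0), Q(0), Q(0), Q(4), Q(-1))    # an element of the kernel
assert proj(a) == (Q(0), Q(0), Q(0))   # it maps to zero in F^m
assert kernel_iso(a) == (Q(4), Q(-1))  # and corresponds to (4, -1) in F^(n-m)
```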


Herstein 4.1.9

Fix a nonzero v ∈ Fⁿ. Show there exists ϕ ∈ Hom(Fⁿ, F) with ϕ(v) ≠ 0.

Let m be the index of the first non-zero entry of v, and let ϕ be the projection of Fⁿ onto its m-th entry. Then ϕ ∈ Hom(Fⁿ, F) and ϕ(v) ≠ 0.


Herstein 4.1.10

With F a field and n a positive integer, prove that Fⁿ ≅ Hom(Hom(Fⁿ, F), F).

This is (a special case of) the result that a finite-dimensional vector space is isomorphic to its double dual.

With the result of 4.1.7 (taking m = 1 gives Hom(Fⁿ, F) ≅ Fⁿ), we have Hom(Hom(Fⁿ, F), F) ≅ Hom(Fⁿ, F) ≅ Fⁿ.


Herstein 4.1.11

With U, W subspaces of a vector space V, all over the field F, prove that U + W = {u + w ∣ u ∈ U, w ∈ W} is a subspace of V.

Given u, u′ ∈ U, w, w′ ∈ W and α ∈ F, we have that (u + w) + α(u′ + w′) = (u + αu′) + (w + αw′) ∈ U + W,

where the last step is justified because U and W are each subspaces. Therefore U+W is a subspace of V.


Herstein 4.1.12

Prove that the intersection of two subspaces of V is again a subspace of V.

Let U, W be subspaces of V over the field F. If v, v′ ∈ U ∩ W and α ∈ F, then v + αv′ ∈ U because U is a subspace and v + αv′ ∈ W because W is a subspace. Hence v + αv′ ∈ U ∩ W, and U ∩ W is a subspace.


Herstein 4.1.13

With U, W subspaces of a vector space V, all over the field F, prove that (U + W)/W ≅ U/(U ∩ W).

This is the second isomorphism theorem.

The elements of (U + W)/W look like u + w + W = u + W where u ∈ U and w ∈ W. The elements of U/(U ∩ W) look like u + (U ∩ W) with u ∈ U. In both cases, elements of U ∩ W get sent to the zero coset.

Define the map ϕ : (U + W)/W → U/(U ∩ W) by ϕ(u + W) = u + (U ∩ W). To see that this is well-defined, consider u + w, u′ + w′ ∈ U + W that belong to the same coset: u + w + W = u′ + w′ + W, so that the difference u − u′ lies in W; since u − u′ also lies in U, this implies that u − u′ ∈ U ∩ W. We have ϕ(u + w + W) − ϕ(u′ + w′ + W) = (u − u′) + (U ∩ W) = U ∩ W; thus any representative of a coset in the domain gets mapped to the same coset in the codomain.

ϕ is a homomorphism: for u, u′ ∈ U and w, w′ ∈ W, we have ϕ(u + w + α(u′ + w′) + W) = u + αu′ + (U ∩ W), while ϕ(u + w + W) + αϕ(u′ + w′ + W) = (u + (U ∩ W)) + α(u′ + (U ∩ W)) = u + αu′ + (U ∩ W).

The kernel of ϕ consists of those elements which map to the zero coset in the codomain, i.e. those cosets u + W where the U component u belongs to U ∩ W. We have that {u + w ∣ u ∈ U ∩ W, w ∈ W} ⊆ W, so ker ϕ = {W}, i.e. the kernel contains only the zero coset and ϕ is injective.

ϕ is surjective because, given u + (U ∩ W) ∈ U/(U ∩ W), we have ϕ(u + W) = u + (U ∩ W).

Therefore ϕ : (U + W)/W → U/(U ∩ W) is an isomorphism.


Herstein 4.1.14

Let U, V be vector spaces and let ϕ : U → V be a surjective homomorphism. Show there is a one-to-one correspondence between A, the subspaces of V, and B, the subspaces of U which contain ker ϕ.

This is the fourth (“lattice”) isomorphism theorem.

There are a couple of natural-looking ways to map the objects in question (I tried W ↦ ϕ(W), W ↦ U/W, etc.). However, the first isomorphism theorem (Theorem 4.1.1) states that V ≅ U/ker ϕ, so the subspaces of V should look like W/ker ϕ where W is a subspace of U (identifying V with U/ker ϕ). Naturally, W/ker ϕ only makes sense if W contains ker ϕ. Therefore, the map we define is f : B → A given by f(W) = W/ker ϕ, and it makes sense because of the way we have chosen B (i.e. only considering subspaces that contain ker ϕ).

The map f is injective: let W, W′ ∈ B be mapped the same by f, i.e. W/ker ϕ = W′/ker ϕ. We would like to show that this implies W = W′. If w ∈ W, then w + ker ϕ ∈ W/ker ϕ = W′/ker ϕ, so there exists w′ ∈ W′ with w + ker ϕ = w′ + ker ϕ. This implies that w − w′ ∈ ker ϕ ⊆ W′, so that w = (w − w′) + w′ ∈ W′.

This proves that W ⊆ W′. The same argument with the roles of W and W′ reversed gives W′ ⊆ W, so W = W′ and f is injective.

f is also surjective: if X is a subspace of V, then we can realize it as the image under f of a subspace of U. Consider Y = ϕ⁻¹(X) = {u ∈ U ∣ ϕ(u) ∈ X}. It remains to show that ker ϕ ⊆ Y, that Y is a subspace of U, and that f(Y) = X. Because X is a subspace, 0 ∈ X and ϕ(ker ϕ) = {0} ⊆ X, so that ker ϕ ⊆ Y. If y, y′ ∈ Y and α is a scalar, then ϕ(y + αy′) = ϕ(y) + αϕ(y′) ∈ X, so Y is a subspace of U. Finally, f(Y) = Y/ker ϕ = {u + ker ϕ ∣ u ∈ U, ϕ(u) ∈ X}; under the identification V ≅ U/ker ϕ of Theorem 4.1.1, the coset u + ker ϕ corresponds to ϕ(u), so f(Y) corresponds exactly to {ϕ(u) ∣ u ∈ U, ϕ(u) ∈ X} = X, the last equality holding because ϕ is surjective.

Therefore, f is a bijection between A and B.


Herstein 4.1.15

Let V be a vector space and let V₁, …, Vₙ be subspaces of V such that V = V₁ + ⋯ + Vₙ and Vᵢ ∩ ∑_{j≠i} Vⱼ = {0} for every i. Prove that V is the internal direct sum of V₁, …, Vₙ.

To say that V is the internal direct sum of the Vᵢ is to say that every v ∈ V has exactly one expression v = v₁ + ⋯ + vₙ with each vᵢ ∈ Vᵢ.

Because V = V₁ + ⋯ + Vₙ, we have that every v ∈ V has at least one such expression. It remains to show that this expression is unique. Therefore, suppose that v = v₁ + ⋯ + vₙ = v₁′ + ⋯ + vₙ′ with vᵢ, vᵢ′ ∈ Vᵢ for each i. Then we have (v₁ − v₁′) + ⋯ + (vₙ − vₙ′) = 0,

and, rearranging, vᵢ − vᵢ′ = ∑_{j≠i} (vⱼ′ − vⱼ).

The left hand side belongs to Vᵢ while the right hand side belongs to ∑_{j≠i} Vⱼ. By assumption, those two spaces intersect trivially, so that vᵢ − vᵢ′ = 0 for each i. Hence the two representations are identical, and we are done.


Herstein 4.1.16

Let V₁, …, Vₙ be vector spaces over F and let V be their external direct sum. Prove there are subspaces V̄ᵢ ⊆ V isomorphic to Vᵢ with V the internal direct sum of the V̄ᵢ.

V is the external direct sum of the Vᵢ, so it looks like V = {(v₁, …, vₙ) ∣ vᵢ ∈ Vᵢ}.

The subspaces V̄ᵢ = {(0, …, 0, vᵢ, 0, …, 0) ∣ vᵢ ∈ Vᵢ}, which allow the i-th entry to range over Vᵢ while fixing the other entries at zero, are the desired subspaces of V isomorphic to Vᵢ. The conditions of exercise 4.1.15 are easily verified, so V is the internal direct sum of the V̄ᵢ.


Herstein 4.1.17

Let F be a field and let T : F² → F² be defined by T(x₁, x₂) = (αx₁ + βx₂, γx₁ + δx₂) for fixed α, β, γ, δ ∈ F.

(a) Prove that T is a homomorphism.

(b) Find necessary and sufficient conditions on α,β,γ,δ so that T is an isomorphism.

(a) That T is a homomorphism is a straightforward computation.

(b) In the language of matrices, this is the familiar question of when a matrix is invertible; the answer is “when the determinant is non-zero”. How does that come about from direct computation?

Let y₁, y₂ ∈ F and consider the simultaneous equations αx₁ + βx₂ = y₁,

γx₁ + δx₂ = y₂.

Multiplying the first equation by δ and the second by β, and then subtracting the second from the first, we find (αδ − βγ)x₁ = δy₁ − βy₂.

Performing a similar computation, we also find (αδ − βγ)x₂ = αy₂ − γy₁.

In order for T to be injective, it must have a trivial kernel. If (x₁, x₂) ∈ ker T, then (αδ − βγ)x₁ = (αδ − βγ)x₂ = 0.

If αδ − βγ ≠ 0, these equations force x₁ = x₂ = 0, so the kernel is trivial and T is injective. Conversely, if αδ − βγ = 0, then T is not injective: if all of α, β, γ, δ are zero then T = 0, and otherwise one of (δ, −γ) or (β, −α) is a non-zero element of ker T. Thus a necessary and sufficient condition for T to be injective is that αδ − βγ ≠ 0; in particular, this is also a necessary condition for T to be an isomorphism.

The same condition is also sufficient for T to be surjective: when αδ − βγ ≠ 0, the equations (αδ − βγ)x₁ = δy₁ − βy₂ and (αδ − βγ)x₂ = αy₂ − γy₁ can be solved for x₁, x₂ given any y₁, y₂ by dividing by αδ − βγ.

Therefore the necessary and sufficient condition for T to be an isomorphism is that αδ − βγ be non-zero.
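
A quick numerical check of this criterion, taking F = ℚ and reusing the explicit formulas derived above; the helper names T and T_inverse are assumptions made for this sketch:

```python
from fractions import Fraction as Q  # work over the field of rationals

def T(a, b, c, d, x1, x2):
    """T(x1, x2) = (a*x1 + b*x2, c*x1 + d*x2), as in the problem statement."""
    return (a * x1 + b * x2, c * x1 + d * x2)

def T_inverse(a, b, c, d, y1, y2):
    """Invert T using the formulas derived above; requires a*d - b*c != 0."""
    det = a * d - b * c
    assert det != 0, "T is not an isomorphism when a*d - b*c = 0"
    return ((d * y1 - b * y2) / det, (a * y2 - c * y1) / det)

a, b, c, d = Q(2), Q(3), Q(1), Q(4)       # a*d - b*c = 5, non-zero
y1, y2 = Q(7), Q(-1)
x1, x2 = T_inverse(a, b, c, d, y1, y2)
assert T(a, b, c, d, x1, x2) == (y1, y2)  # every (y1, y2) is hit, so T is surjective
```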


Herstein 4.1.18

The same exercise as 4.1.17, but on F³.

I haven’t done this exercise, but I would be surprised if it is different from 4.1.17 in a meaningful way.


Herstein 4.1.19

Let V, W be vector spaces over F and let T : V → W be a homomorphism. Use T to define a homomorphism T* : Hom(W, F) → Hom(V, F).

Put another way, the exercise is to show that a homomorphism between vector spaces V and W induces a natural homomorphism (in the opposite direction) between their dual spaces V* = Hom(V, F) and W* = Hom(W, F).

A diagram helps:

    V → F
     ↘  ↑
        W

Here, the map V → W is provided by T, the map W → F is some element w ∈ Hom(W, F), and the desired map V → F can be made in a natural way by composition. That is, we define the map T* : Hom(W, F) → Hom(V, F) by T*(w) = w ∘ T.

It is easy to check that (1) the resulting v = w ∘ T is indeed an element of Hom(V, F), and (2) that T* is a homomorphism.
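
A minimal sketch of this construction, assuming for illustration that F = ℝ, V = ℝ³, W = ℝ², and that homomorphisms are represented as plain Python functions:

```python
def T(v):
    """An illustrative homomorphism V = R^3 -> W = R^2."""
    x, y, z = v
    return (x + 2 * y, 3 * z)

def w(vec):
    """An illustrative element of Hom(W, F): a linear functional on R^2."""
    a, b = vec
    return 5 * a - b

def T_star(functional):
    """T*: Hom(W, F) -> Hom(V, F), defined by pre-composition with T."""
    return lambda v: functional(T(v))

v_functional = T_star(w)        # an element of Hom(V, F)
print(v_functional((1, 1, 1)))  # w(T((1, 1, 1))) = w((3, 3)) = 15 - 3 = 12
```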


Herstein 4.1.20

Let F be a field.

(a) Prove that F is not isomorphic to Fⁿ for any integer n > 1.

(b) Prove that F² is not isomorphic to F³.

(a) Looking slightly ahead, the intuition here is that the image of F under a homomorphism will be too low-dimensional. Therefore, consider a supposed isomorphism ϕ : F → Fⁿ. Because it is surjective, there are f₁, f₂ ∈ F with ϕ(f₁) = (1, 0, …, 0) and ϕ(f₂) = (0, 1, 0, …, 0). Now, f₁ ≠ 0 (because ϕ(0) = 0), so there exists α ∈ F such that f₂ = αf₁, and we must have (0, 1, 0, …, 0) = ϕ(f₂) = ϕ(αf₁) = αϕ(f₁) = (α, 0, …, 0).

This is a contradiction, so we conclude that no such ϕ exists.

(b) Suppose ϕ : F² → F³ is an isomorphism, with ϕ((1, 0)) = v₁ ∈ F³ and ϕ((0, 1)) = v₂ ∈ F³. Then we have ϕ((α, β)) = αv₁ + βv₂ for any α, β ∈ F. Because ϕ is surjective, there must exist αᵢ, βᵢ such that α₁v₁ + β₁v₂ = (1, 0, 0),

α₂v₁ + β₂v₂ = (0, 1, 0),

α₃v₁ + β₃v₂ = (0, 0, 1).

Taking the first and second equations, and eliminating the v₂ terms, we find that (α₁β₂ − β₁α₂)v₁ = (β₂, −β₁, 0).

However, taking the first and third equations, and eliminating the v₂ terms, we also find that (α₁β₃ − β₁α₃)v₁ = (β₃, 0, −β₁).

These two results are inconsistent unless β₁ = 0. In that case, the first equation reads α₁v₁ = (1, 0, 0), so we can explicitly solve for v₁ = (α₁⁻¹, 0, 0) and derive a contradiction: β₂v₂ = (−α₂α₁⁻¹, 1, 0) while β₃v₂ = (−α₃α₁⁻¹, 0, 1), and no single v₂ can satisfy both.

Thus the supposed map ϕ cannot actually be surjective, and therefore F² is not isomorphic to F³.

The laborious arguments above make one appreciate (1) the elegance of doing linear algebra without explicit coordinates/choice of basis, and (2) the simplicity and utility of the concepts of linear independence, basis and dimension, which we eschew here because they are not introduced until the next section of the book.


Herstein 4.1.21

Let V be a vector space over the infinite field F. Prove that V is not realizable as the set-theoretic union of a finite number of its proper subspaces.

Let V₁, …, Vₙ be proper subspaces of V such that ∪ᵢ Vᵢ = V. We can assume that each Vᵢ brings something of value to this union, i.e. that Vᵢ ⊄ ∪_{j≠i} Vⱼ.

In other words, for each i, there exists some vᵢ which belongs to Vᵢ and to none of the other subspaces. If this is not the case, then we can omit Vᵢ: all of its elements are included elsewhere. In this sense, we can assume our collection of subspaces to be minimal.

Because the subspaces are proper, we know that n ≥ 2. Consider elements v₁ ∈ V₁ with v₁ ∉ ∪_{i≠1} Vᵢ and v₂ ∈ V₂ with v₂ ∉ ∪_{i≠2} Vᵢ. Let α, β ∈ F be distinct. The elements x = v₁ + αv₂ and y = v₁ + βv₂ belong to V = ∪ᵢ Vᵢ, so each belongs to some Vᵢ. Suppose x, y both belong to the same Vᵢ; then so does their difference x − y = (α − β)v₂ ∈ Vᵢ, and hence v₂ = (α − β)⁻¹(x − y) ∈ Vᵢ. By assumption, v₂ only belongs to V₂, so that Vᵢ = V₂. Taking a step back, we see that this would force v₁ = x − αv₂ to also lie in V₂, a contradiction. Thus x and y are forced to belong to different subspaces.

Now, we enumerate some infinite subset {α₁, α₂, …} of F and construct the elements xᵢ = v₁ + αᵢv₂ ∈ V. Considering the {xᵢ} pairwise, we see that each must live in a different subspace from every other: no finite number of subspaces will suffice. We conclude that no vector space over an infinite field can be realized as the union of finitely many of its proper subspaces.
