### Topics in Algebra, Chapter 3.6

###### 2012-12-27

This page covers section 3.6 (“The Field of Quotients of an Integral Domain”). Throughout, $R$ is an integral domain and $F$ is its field of fractions, defined in “topics covered”.

### Topics covered: 3.6

• The section focuses on generalizing the relationship between $\mathbb{Z}$ and $\mathbb{Q}$ (field of fractions) to integral domains in general.

• Definition: Ring $S$ can be imbedded in ring ${S}^{\mathrm{\prime }}$ if there exists an injective ring homomorphism $S\to {S}^{\mathrm{\prime }}$. If both $S$ and ${S}^{\mathrm{\prime }}$ are unital then we also require the imbedding to map $1$ to ${1}^{\mathrm{\prime }}$. If $S$ imbeds in ${S}^{\mathrm{\prime }}$ then ${S}^{\mathrm{\prime }}$ is an extension of $S$.

• Theorem 3.6.1: Every integral domain may be imbedded in a field.

• The theorem is proven by constructing the field of fractions $F=R×\left(R\setminus 0\right)\mathrm{/}\sim$ where $\left[a,b\right]\sim \left[c,d\right]$ if and only if $ad=bc$; here $a,b,c,d\in R$ and $b,d\mathrm{\ne }0$. This is motivated by equivalence of fractions $\frac{a}{b}=\frac{c}{d}$ if and only if $ad=bc$. $R$ then imbeds in $F$ in the natural way, $r↦\left[rx,x\right]$ for any non-zero $x\in R$. The notation $\left[a,b\right]$ is shorthand for $\left[\left(a,b\right)\right]$, the equivalence class under $\sim$ of the element $\left(a,b\right)\in R×\left(R\setminus 0\right)$.

The problems below are paraphrased from/inspired by those given in Topics in Algebra by Herstein. The solutions are my own unless otherwise noted. I will generally try, in my solutions, to stick to the development in the text. This means that problems will not be solved using ideas and theorems presented further on in the book.

### Herstein 3.6.1: Show that multiplication in $F$ is well-defined. (See “topics covered” for definition of notation)

Let $a,a',b,b',c,c',d,d'\in R$ with $b,b',d,d'\ne 0$ and such that $[a,b]=[a',b']$ and $[c,d]=[c',d']$ in $F$. In the rational numbers, this is the familiar situation where, say, $a/b=1/2$ and $a'/b'=4/8$ are two names for the same fraction.

We have $[a,b][c,d]=[ac,bd]$ and $[a',b'][c',d']=[a'c',b'd']$. The two products are the same class if $(ac)(b'd')=(bd)(a'c')$. Of course, we have that $ab'=ba'$ and $cd'=dc'$, so that $acb'd'=(ab')(cd')=(ba')(dc')=a'c'bd$ and the multiplication is well-defined (recall that $R$ is commutative because it is an integral domain).
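As a quick sanity check (a brute-force spot check over small integers, not a proof), this computation can be verified in the concrete integral domain $R=\mathbb{Z}$:

```python
# Brute-force check over R = Z that the product [a,b][c,d] = [ac,bd]
# does not depend on the chosen representatives.  Small ranges only:
# a spot check of the algebra above, not a proof.
from itertools import product

def sim(p, q):
    """[a,b] ~ [c,d] iff a*d == b*c (second components non-zero)."""
    (a, b), (c, d) = p, q
    return a * d == b * c

def mul(p, q):
    (a, b), (c, d) = p, q
    return (a * c, b * d)

vals = range(-2, 3)
pairs = [(a, b) for a in vals for b in vals if b != 0]
for p, p2, q, q2 in product(pairs, repeat=4):
    if sim(p, p2) and sim(q, q2):
        assert sim(mul(p, q), mul(p2, q2))
print("product classes agree for all equivalent representatives")
```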

### Herstein 3.6.2: Show that the distributive law holds in $F$.

Let $a,b,c,d,e,f\in R$. We have $\left[a,b\right]\cdot \left(\left[c,d\right]+\left[e,f\right]\right)=\left[a,b\right]\left[cf+de,df\right]=\left[acf+ade,bdf\right]$

and $[a,b][c,d]+[a,b][e,f]=[ac,bd]+[ae,bf]=[acbf+aebd,b^{2}df].$

The two forms are equal if $\left(acf+ade\right){b}^{2}df$ and $bdf\left(acbf+aebd\right)$ are equal. Multiplying each out, we see that they are both $acd{b}^{2}{f}^{2}+aef{b}^{2}{d}^{2}$

so that the distributive law holds.
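The expansion can also be spot-checked numerically over small integer tuples (the integers standing in for a generic commutative ring; a check of the identity, not a proof):

```python
# Spot-check the polynomial identity used above:
#   (acf + ade) * b^2 * d * f  ==  b*d*f * (acbf + aebd)
# over all small integer tuples.
from itertools import product

for a, b, c, d, e, f in product(range(-2, 3), repeat=6):
    lhs = (a*c*f + a*d*e) * b*b * d * f
    rhs = b*d*f * (a*c*b*f + a*e*b*d)
    assert lhs == rhs
print("identity holds on all sampled tuples")
```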

### Herstein 3.6.3: Show that $\varphi :R\to F$ given by $\varphi \left(r\right)=\left[r,1\right]$ is an injective ring homomorphism.

We do not require that $R$ be unital, but rather we define $\left[r,1\right]=\left[rx,x\right]$ for any non-zero $x\in R$. This is well-defined because if $y$ is another non-zero element of $R$, then $\left[rx,x\right]=\left[ry,y\right]$ because $\left(rx\right)y=rxy=\left(ry\right)x$.

Let $a,b,x\in R$ with $x\mathrm{\ne }0$. The map respects ring addition because $\varphi \left(a\right)+\varphi \left(b\right)=\left[ax,x\right]+\left[bx,x\right]=\left[axx+bxx,xx\right]=\left[a+b,1\right]=\varphi \left(a+b\right)$

where we see by the comments above that $\left[rxx,xx\right]=\left[r,1\right]$. It also respects ring multiplication because $\varphi \left(a\right)\varphi \left(b\right)=\left[ax,x\right]\left[bx,x\right]=\left[abxx,xx\right]=\left[ab,1\right]=\varphi \left(ab\right)\mathrm{.}$

Finally, the map is injective: $\varphi \left(a\right)=\varphi \left(b\right)$ means that $\left[ax,x\right]=\left[bx,x\right]$, or $axx=bxx$. This can be rearranged as $\left(a-b\right)xx=0$. Now, $xx\mathrm{\ne }0$ so we must conclude that $a=b$ because $R$ is an integral domain.

### Herstein 3.6.4: Prove that if $K$ is a field containing $R$, then $K$ contains a sub-field isomorphic to $F$.

In $K$, every non-zero element $r\in R$ has a multiplicative inverse $r^{-1}\in K$, which may or may not exist in $R$. We consider the map $\varphi:F\to K$ given by $\varphi([r,s])=rs^{-1}.$

We must show that this map is well-defined. If $\left[r,s\right]=\left[t,u\right]$ then $ru=st$. Then we see that $s\left(r{s}^{-1}-t{u}^{-1}\right)=r-st{u}^{-1}=r-ru{u}^{-1}=0$

which has us conclude that $rs^{-1}=tu^{-1}$, which is the desired statement. The map is also a homomorphism, because $\varphi([r,s]+[t,u])=(ru+ts)(su)^{-1}=rs^{-1}+tu^{-1}=\varphi([r,s])+\varphi([t,u])$

and $\varphi([r,s][t,u])=(rt)(su)^{-1}=rs^{-1}tu^{-1}=\varphi([r,s])\varphi([t,u]).$

By now we have exhibited a ring homomorphism from the field of fractions $F$ into $K$. Last of all, the map $\varphi$ is injective because $\varphi([r,s])=\varphi([t,u])$ means $rs^{-1}=tu^{-1}$. Rewriting this as $ru=st$, we see that it is exactly the condition that $[r,s]$ and $[t,u]$ are the same equivalence class. The image $\varphi(F)$ is therefore a sub-field of $K$ isomorphic to $F$.

Therefore any field containing $R$ also contains the field of fractions of $R$. In that sense, $F$ is the smallest field containing $R$.
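With $R=\mathbb{Z}$ and $K=\mathbb{Q}$, the map is the usual interpretation of a fraction, and the homomorphism property can be checked directly (Python's `Fraction` playing the role of $rs^{-1}$; a spot check over small values):

```python
# Illustration of 3.6.4 with R = Z, K = Q: phi([r,s]) = r * s^{-1}
# (here Fraction(r, s)) respects the class operations
#   [r,s] + [t,u] = [ru + ts, su]   and   [r,s][t,u] = [rt, su].
from fractions import Fraction
from itertools import product

nonzero = [x for x in range(-3, 4) if x != 0]
for r, t in product(range(-3, 4), repeat=2):
    for s, u in product(nonzero, repeat=2):
        assert Fraction(r*u + t*s, s*u) == Fraction(r, s) + Fraction(t, u)
        assert Fraction(r*t, s*u) == Fraction(r, s) * Fraction(t, u)
print("phi respects addition and multiplication")
```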

### Herstein 3.6.5*: Let $R$ be a commutative, unital ring and let $S\subset R$ be non-empty. $S$ is a multiplicative system if $0\notin S$ and $S$ is closed under multiplication. Define a relation $\sim$ on $R×S$ such that $\left(r,s\right)\sim \left({r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right)$ if there exists ${s}^{\mathrm{\prime }\mathrm{\prime }}\in S$ such that

${s}^{\mathrm{\prime }\mathrm{\prime }}\left(r{s}^{\mathrm{\prime }}-s{r}^{\mathrm{\prime }}\right)=0.$

### (a) Prove that $\sim$ is an equivalence relation on $R\times S$. (b) Writing $[r,s]$ for the class of $(r,s)$ and ${R}_{S}$ for the set of classes, define addition and multiplication on ${R}_{S}$ and prove that ${R}_{S}$ is a ring. (c) Can $R$ be imbedded in ${R}_{S}$? (d) Prove that $\varphi(r)=[rs,s]$, with $s\in S$ fixed, is a homomorphism of $R$ into ${R}_{S}$ and find its kernel. (e) Prove that $\ker\varphi\cap S$ is empty. (f) Prove that every element in ${R}_{S}$ of the form $[s_1,s_2]$ with $s_1,s_2\in S$ is invertible in ${R}_{S}$.

(a) Let ${r}_{1},{r}_{2},{r}_{3}\in R$ and ${s}_{1},{s}_{2},{s}_{3}\in S$.

1. $\sim$ is reflexive: We have $\left({r}_{1},{s}_{1}\right)\sim \left({r}_{1},{s}_{1}\right)$ because any element ${s}^{\mathrm{\prime }}\in S$ satisfies ${s}^{\mathrm{\prime }}\left({r}_{1}{s}_{1}-{s}_{1}{r}_{1}\right)=0$.

2. $\sim$ is symmetric: Suppose $\left({r}_{1},{s}_{1}\right)\sim \left({r}_{2},{s}_{2}\right)$ so there exists ${s}^{\mathrm{\prime }}\in S$ with ${s}^{\mathrm{\prime }}\left({r}_{1}{s}_{2}-{s}_{1}{r}_{2}\right)=0$. As $R$ is commutative, we also have ${s}^{\mathrm{\prime }}\left({r}_{2}{s}_{1}-{s}_{2}{r}_{1}\right)=0$, which shows that $\left({r}_{2},{s}_{2}\right)\sim \left({r}_{1},{s}_{1}\right)$.

3. $\sim$ is transitive: Suppose $\left({r}_{1},{s}_{1}\right)\sim \left({r}_{2},{s}_{2}\right)$ and $\left({r}_{2},{s}_{2}\right)\sim \left({r}_{3},{s}_{3}\right)$. Then there exist ${s}^{\mathrm{\prime }},{s}^{\mathrm{\prime }\mathrm{\prime }}\in S$ with ${s}^{\mathrm{\prime }}\left({r}_{1}{s}_{2}-{s}_{1}{r}_{2}\right)=0$ and ${s}^{\mathrm{\prime }\mathrm{\prime }}\left({r}_{2}{s}_{3}-{s}_{2}{r}_{3}\right)=0$. This is slightly tricky. We want to combine the two equations in such a way that we get terms ${r}_{1}{s}_{3}$ and ${s}_{1}{r}_{3}$, and the ${r}_{2}$ dependence cancels out. One way to do that is to multiply the first equation by ${s}_{3}{s}^{\mathrm{\prime }\mathrm{\prime }}$ and the second equation by ${s}_{1}{s}^{\mathrm{\prime }}$ and then add the two equations together. This gives ${s}^{\mathrm{\prime }}{s}^{\mathrm{\prime }\mathrm{\prime }}{s}_{2}\left({r}_{1}{s}_{3}-{s}_{1}{r}_{3}\right)=0$

which shows that $\left({r}_{1},{s}_{1}\right)\sim \left({r}_{3},{s}_{3}\right)$.
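The three properties can also be verified by brute force in the concrete example $R=\mathbb{Z}/6\mathbb{Z}$, $S=\{2,4\}$ that is examined in part (c) below (a spot check of one ring, not a proof):

```python
# Concrete check of part (a): R = Z/6Z with multiplicative system S = {2,4}.
# Verify exhaustively that
#   (r,s) ~ (r',s')  iff  s''*(r*s' - s*r') == 0 (mod 6) for some s'' in S
# is reflexive, symmetric, and transitive.
N, S = 6, (2, 4)

def sim(p, q):
    (r, s), (r2, s2) = p, q
    return any(s3 * (r * s2 - s * r2) % N == 0 for s3 in S)

pairs = [(r, s) for r in range(N) for s in S]
assert all(sim(p, p) for p in pairs)                              # reflexive
assert all(sim(q, p) for p in pairs for q in pairs if sim(p, q))  # symmetric
assert all(sim(p, u) for p in pairs for q in pairs for u in pairs
           if sim(p, q) and sim(q, u))                            # transitive
print("~ is an equivalence relation on Z/6Z x S")
```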

(b) Let $r,t\in R$ and $s,u\in S$. We define multiplication on ${R}_{S}$ by $\left[r,s\right]\left[t,u\right]=\left[rt,su\right]$ and see that ${R}_{S}$ is closed under it because $su\in S$. Suppose $\left[r,s\right]=\left[{r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right]$ and $\left[t,u\right]=\left[{t}^{\mathrm{\prime }},{u}^{\mathrm{\prime }}\right]$. Then $\left[{r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right]\left[{t}^{\mathrm{\prime }},{u}^{\mathrm{\prime }}\right]=\left[{r}^{\mathrm{\prime }}{t}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}\right]$

but we can show that $\left(rt,su\right)\sim \left({r}^{\mathrm{\prime }}{t}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}\right)$. If ${s}_{1},{s}_{2}\in S$ are such that ${s}_{1}\left(r{s}^{\mathrm{\prime }}-s{r}^{\mathrm{\prime }}\right)=0$ and ${s}_{2}\left(t{u}^{\mathrm{\prime }}-u{t}^{\mathrm{\prime }}\right)=0$, then multiplying the first equation by ${s}_{2}t{u}^{\mathrm{\prime }}$ and the second equation by ${s}_{1}s{r}^{\mathrm{\prime }}$ and adding gives the desired result. Therefore multiplication is well-defined.

We define addition on ${R}_{S}$ by $\left[r,s\right]+\left[t,u\right]=\left[ru+st,su\right]$ and again see that ${R}_{S}$ is closed under it because $su\in S$. Suppose again that $\left[r,s\right]=\left[{r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right]$ and $\left[t,u\right]=\left[{t}^{\mathrm{\prime }},{u}^{\mathrm{\prime }}\right]$. Then $\left[{r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right]+\left[{t}^{\mathrm{\prime }},{u}^{\mathrm{\prime }}\right]=\left[{r}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}+{s}^{\mathrm{\prime }}{t}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}\right]$

and again we can show that $\left({r}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}+{s}^{\mathrm{\prime }}{t}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}{u}^{\mathrm{\prime }}\right)\sim \left(ru+st,su\right)$ with the same technique. Therefore addition is well-defined.

The only other ring axiom to verify is the distributive property, and the proof is identical to that in problem 3.6.2. Thus ${R}_{S}$ is a ring as defined.

(c) This part deserves some commentary. First, we note that this problem (3.6.5) is concerned with generalizing the field of fractions. Before, we took $S=R\setminus 0$ and had an equivalence relation $\left(r,s\right)\sim \left({r}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right)$ if $r{s}^{\mathrm{\prime }}-s{r}^{\mathrm{\prime }}=0$. Now we let $S$ be more general and employ a new equivalence relation which makes sense in the context. The other big change is that $R$ may have zero divisors. In fact, if we choose $S$ to have no zero divisors, then the new, more complicated-looking equivalence relation immediately reduces to the old equivalence relation. This problem is foundational for ring localizations.

The question is now whether $R$ imbeds in ${R}_{S}$. It can, but need not. If $S$ contains no zero divisors, then the map $f(r)=[rs,s]$ for fixed $s\in S$ is an imbedding (i.e. is one-to-one). This is true because $f(r)\sim f(r')$ means there exists $s'$ such that $0=s'((rs)s-s(r's))=s's^{2}(r-r')$. As $s's^{2}\in S$ and $S$ has no zero divisors, this implies $r=r'$ so the map $f$ is injective.

On the other hand, if $S$ contains a zero divisor then it may not imbed. We can build some intuition by studying a particular example, such as $R=\mathbb{Z}\mathrm{/}6\mathbb{Z}$ with $S=\left\{2,4\right\}$. In that case, ${R}_{S}=\left\{\left[0,2\right],\left[1,4\right],\left[1,2\right]\right\}$; all other elements fall into these equivalence classes. We see that ${R}_{S}$ has $3$ elements – it is smaller than $R$, so $R$ clearly does not imbed in ${R}_{S}$! The zero divisors in $S$ cause some of the fractions to “reduce” in ways that we would not expect.
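The claimed partition can be computed directly; the following short script confirms the count of three classes:

```python
# Reproduce the example R = Z/6Z, S = {2,4}: partition R x S into
# equivalence classes and list a representative of each.
N, S = 6, (2, 4)

def sim(p, q):
    (r, s), (r2, s2) = p, q
    return any(s3 * (r * s2 - s * r2) % N == 0 for s3 in S)

classes = []
for p in [(r, s) for r in range(N) for s in S]:
    for cls in classes:
        if sim(p, cls[0]):
            cls.append(p)
            break
    else:
        classes.append([p])

# Three classes, matching [0,2], [1,2], [1,4] in the text.
print(len(classes), [c[0] for c in classes])
```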

Judging by the content of the subsequent parts of the problem, it doesn’t seem that a proof of the general criterion for imbedding is called for. It appears to suffice to show that there are situations where $R$ imbeds in ${R}_{S}$ but also situations where it does not.

(d) Let $r,r'\in R$ and fix $s\in S$. We have $\varphi(r)\varphi(r')=[rs,s][r's,s]=[rr'ss,ss]$. Now, $\varphi(rr')=[rr's,s]=[rr'ss,ss]=\varphi(r)\varphi(r')$ because $(rr's)(ss)-s(rr'ss)=0$. Also, $\varphi(r)+\varphi(r')=[rs,s]+[r's,s]=[(r+r')ss,ss]=[(r+r')s,s]=\varphi(r+r').$

Hence $\varphi$ is a homomorphism from $R$ to ${R}_{S}$. The kernel is the set of elements $r$ with $\varphi(r)\sim 0$, that is, those for which there exists $s'\in S$ with $s's^{2}r=0$; equivalently, $tr=0$ for some $t=s's^{2}\in S$. Recalling the notation of previous sections, we see that $\ker\varphi=\bigcup_{s\in S}\mathrm{Ann}(s).$
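In the example $R=\mathbb{Z}/6\mathbb{Z}$, $S=\{2,4\}$ from part (c), this kernel can be computed explicitly; it comes out to $\{0,3\}=\mathrm{Ann}(2)\cup\mathrm{Ann}(4)$:

```python
# Kernel of phi(r) = [r*s, s] in the example R = Z/6Z, S = {2,4}:
# it should equal the union of the annihilators Ann(2) and Ann(4).
N, S = 6, (2, 4)

def sim(p, q):
    (r, s), (r2, s2) = p, q
    return any(s3 * (r * s2 - s * r2) % N == 0 for s3 in S)

s0 = 2
zero = (0, s0)                      # the class of 0 in R_S
ker = {r for r in range(N) if sim(((r * s0) % N, s0), zero)}
ann = {r for s in S for r in range(N) if (r * s) % N == 0}
assert ker == ann
print("ker phi =", sorted(ker))     # {0, 3} here
```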

(e) If $r\in\ker\varphi$, then there exists $t\in S$ with $tr=0$ by the comments of part (d). On the other hand, we know that $S$ is closed under multiplication and $0\notin S$, so if $r$ were in $S$ then $tr$ would lie in $S$ and hence $tr\ne 0$, a contradiction. Hence $\ker\varphi\cap S$ is empty.

(f) The multiplicative identity element in ${R}_{S}$ is $\left[s,s\right]$ for any $s\in S$. We see this is true because $\left[r,s\right]\cdot \left[{s}^{\mathrm{\prime }},{s}^{\mathrm{\prime }}\right]=\left[r{s}^{\mathrm{\prime }},s{s}^{\mathrm{\prime }}\right]\sim \left[r,s\right]$ since $\left(r{s}^{\mathrm{\prime }}\right)s-\left(s{s}^{\mathrm{\prime }}\right)r=0$. As $\left[r,s\right]$ is some sort of generalization of the rational number $r\mathrm{/}s$, the natural thing to consider is $\left[{s}_{1},{s}_{2}\right]\cdot \left[{s}_{2},{s}_{1}\right]=\left[{s}_{1}{s}_{2},{s}_{1}{s}_{2}\right]\mathrm{.}$

But ${s}_{1}{s}_{2}\in S$ and, as we have just seen, this is the identity element! Hence any element $\left[{s}_{1},{s}_{2}\right]$ with ${s}_{1},{s}_{2}\in S$ is invertible in ${R}_{S}$.
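Returning once more to the example $R=\mathbb{Z}/6\mathbb{Z}$, $S=\{2,4\}$, this can be confirmed directly:

```python
# Part (f) in the example R = Z/6Z, S = {2,4}: the product
# [s1,s2][s2,s1] = [s1*s2, s1*s2] should land in the identity class [s,s].
N, S = 6, (2, 4)

def sim(p, q):
    (r, s), (r2, s2) = p, q
    return any(s3 * (r * s2 - s * r2) % N == 0 for s3 in S)

one = (2, 2)                        # a representative of the class [s,s]
for s1 in S:
    for s2 in S:
        prod = ((s1 * s2) % N, (s1 * s2) % N)
        assert sim(prod, one)       # [s1,s2] is invertible
print("every [s1,s2] with s1,s2 in S is invertible")
```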

In this way, we have created something like the field of fractions. However, with the looser constraints on $R$ and $S$, only some of the elements of the construction are guaranteed to be invertible.

### Herstein 3.6.6: Let $R$ be an integral domain and let $a,b\in R$ be such that ${a}^{n}={b}^{n}$ and ${a}^{m}={b}^{m}$ for $m,n\in \mathbb{Z}$ positive and coprime. Show that $a=b$.

The subject of the section is the field of fractions of an integral domain, which is a big hint. We know that $m,n$ coprime implies that there exist $\alpha,\beta\in\mathbb{Z}$ such that $\alpha m+\beta n=1$. The natural thing to do is say that $a=a^{\alpha m+\beta n}=b^{\alpha m+\beta n}=b$. If $\alpha$ and $\beta$ were both positive then this would be fine, but they cannot both be, as we will see momentarily. Negative powers are tricky in $R$ because we do not assume elements to have multiplicative inverses.

If $m$ or $n$ is $1$ then the result is trivially true, so we will consider $m,n>1$ which implies that (1) neither $\alpha$ nor $\beta$ is zero, and (2) one of $\alpha$ or $\beta$ is positive and the other is negative. We may assume without loss of generality that $\alpha >0$ and $\beta <0$. Then we prefer to write $\alpha m-\mathrm{\mid }\beta \mathrm{\mid }n=1.$

If $a=0$ then $b^{n}=a^{n}=0$ forces $b=0$ and we are done, so suppose $a,b\ne 0$ (this is needed for the denominators below to be non-zero). In the field of fractions $F$ of $R$, we may then state the fact that $[a^{\alpha m},a^{|\beta|n}]=[b^{\alpha m},b^{|\beta|n}]$

which follows directly from the hypotheses of the problem. This implies that ${a}^{\alpha m}{b}^{\mathrm{\mid }\beta \mathrm{\mid }n}={a}^{\mathrm{\mid }\beta \mathrm{\mid }n}{b}^{\alpha m}\mathrm{.}$

Now use that $\mathrm{\mid }\beta \mathrm{\mid }n=\alpha m-1$ and subtract to find ${a}^{\alpha m-1}{b}^{\alpha m-1}\left(a-b\right)=0.$

Because $R$ is an integral domain, we have either $a=b$ or $a^{\alpha m-1}b^{\alpha m-1}=0$. In the latter case $a=0$ or $b=0$, and then $a^{n}=b^{n}$ forces $a=b=0$. Therefore, $a=b$.
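As a sanity check, the statement can be verified exhaustively in a small integral domain such as $\mathbb{Z}/7\mathbb{Z}$ (a field, hence an integral domain):

```python
# Sanity check of 3.6.6 in Z/7Z: whenever a^m == b^m and a^n == b^n
# with gcd(m, n) == 1, we must find a == b.  Exhaustive over small
# exponents and all residues.
from math import gcd

p = 7
for m in range(1, 9):
    for n in range(1, 9):
        if gcd(m, n) != 1:
            continue
        for a in range(p):
            for b in range(p):
                if pow(a, m, p) == pow(b, m, p) and pow(a, n, p) == pow(b, n, p):
                    assert a == b
print("a = b in every coprime-exponent case")
```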

### Herstein 3.6.7: Let $R$ be a ring (not necessarily commutative) in which $xy=0$ implies $x=0$ or $y=0$. Let $a,b\in R$ be such that ${a}^{n}={b}^{n}$ and ${a}^{m}={b}^{m}$ for $m,n\in \mathbb{Z}$ positive and coprime. Show that $a=b$.

We have no notion of field of fractions or even ring localization (problem 3.6.5) for an arbitrary ring. However, the argument presented above was only suggested by the field of fractions and didn’t actually rely on the construction. Therefore we again take $\alpha ,\beta \in \mathbb{Z}$ as before, with $\alpha >0$ and $\beta <0$ and $\alpha m-\mathrm{\mid }\beta \mathrm{\mid }n=1$. We have ${a}^{\alpha m}{b}^{\mathrm{\mid }\beta \mathrm{\mid }n}={a}^{\alpha m}{a}^{\mathrm{\mid }\beta \mathrm{\mid }n}={a}^{\mathrm{\mid }\beta \mathrm{\mid }n}{a}^{\alpha m}={a}^{\mathrm{\mid }\beta \mathrm{\mid }n}{b}^{\alpha m}\mathrm{.}$

As before, we use that $|\beta|n=\alpha m-1$ to rewrite this as $a^{\alpha m}b^{\alpha m-1}=a^{\alpha m-1}b^{\alpha m}$, which factors (powers of a single element commute with one another) as $a^{\alpha m-1}\left(a-b\right)b^{\alpha m-1}=0$

which again forces us to conclude that $a=b$: since $R$ has no zero divisors, either $a-b=0$, or one of $a^{\alpha m-1}$, $b^{\alpha m-1}$ vanishes, in which case $a=0$ or $b=0$ and then $a^{n}=b^{n}$ gives $a=b=0$.