
I'm doing an exercise from Dummit and Foote's book *Abstract Algebra* and am stuck on this problem. I'd be grateful if anyone could help me solve it. Thanks so much.

Prove that $SL_{2}(F_{3})$ is the subgroup of $GL_{2}(F_{3})$ generated by $\begin{pmatrix}1 & 1 \\ 0 & 1\end{pmatrix}$ and $\begin{pmatrix}1 & 0 \\ 1 & 1\end{pmatrix}$. Can this problem be generalized to $F_{p}$ or $GL_{n}(F_{p})$? Here $F_{p}$ is the field $\mathbb Z/p\mathbb Z$ with $p$ a prime number.

  • $SL_2(F_3)$ is a finite group, so there should be nothing preventing you from writing down the list of its elements and checking that those two elements generate the full group. Commented May 6, 2013 at 15:03
  • Yes, you're right. But I want to find a solution that can be generalized and is meaningful for other problems... Commented May 6, 2013 at 16:13
  • @leducquang: let me know if you want something different in the answer. I tried to be explicit for the case SL2, but only using ideas that (more or less) work for all semisimple algebraic groups over fields. SL3 can also be done fairly explicitly, though there are 6 zero-patterns, so it might be a little dull to write them all out. Commented May 6, 2013 at 17:10

2 Answers


Here's one alternative solution. Let $G$ be the subgroup generated by your two matrices and look at the 8 nonzero vectors of $F_3^2$ that $G$ acts on. Check that this action is transitive, so by the orbit-stabilizer theorem the order of $G$ is a multiple of 8. Each of your matrices has order 3, so the order of $G$ is also a multiple of 3. Thus the order of $G$ is a multiple of 24. But $SL_{2}(F_{3})$ has exactly 24 elements, so $G = SL_{2}(F_{3})$.
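Since the group is small, the whole argument can be checked by brute force. Below is a sketch (not part of the original answer) that enumerates the subgroup generated by the two matrices mod 3 and verifies its order and the transitivity of its action on the nonzero vectors:

```python
# Enumerate the subgroup of GL_2(F_3) generated by the two given matrices
# and verify: (a) it has order 24, (b) it acts transitively on the
# 8 nonzero vectors of F_3^2.
p = 3

def mat_mul(A, B):
    """Multiply two 2x2 matrices, reducing entries mod p."""
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2))
        for i in range(2)
    )

x = ((1, 1), (0, 1))   # upper-triangular generator
y = ((1, 0), (1, 1))   # lower-triangular generator

# In a finite group, closing a set under right multiplication by the
# generators yields the generated subgroup (inverses are powers).
group = {x, y}
frontier = [x, y]
while frontier:
    g = frontier.pop()
    for h in (x, y):
        prod = mat_mul(g, h)
        if prod not in group:
            group.add(prod)
            frontier.append(prod)

def act(A, v):
    """Apply the matrix A to the column vector v over F_p."""
    return tuple(sum(A[i][j] * v[j] for j in range(2)) % p for i in range(2))

orbit = {act(g, (1, 0)) for g in group}
print(len(group), len(orbit))
```

The two printed numbers are the order of the generated subgroup and the size of the orbit of $(1,0)$; the answer's argument predicts 24 and 8.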

  • Sorry, I don't get what you mean. What are the 8 nonzero vectors, and in what way does $G$ act on them? Please clarify, because I'm new to abstract algebra. Thanks Commented May 7, 2013 at 15:21
  • A matrix 'times' a column vector gives a column vector. The possible column vectors are just pairs of elements from the field with 3 elements. This gives a total of 9 possible column vectors, one of which is (0,0); the others are the 8 nonzero vectors. Then look at your matrices, and see which vectors get mapped to which vectors by each matrix. Then work out that through some combination of applications of your generating matrices, any nonzero vector can get mapped to any other nonzero vector. Commented May 8, 2013 at 3:05

These are called the Steinberg generators, and are defined for commutative rings (and larger groups than just $\operatorname{SL}_2$). I'll cover $\operatorname{SL}_2(K)$ for arbitrary fields $K$. Basically, we try to mimic the PLU decomposition of Gaussian elimination (but I write it as UDPU out of habit).

Denote $$x(t) = \begin{bmatrix} 1 & t \\ 0 & 1 \end{bmatrix} \quad \text{and} \quad y(t) = \begin{bmatrix} 1 & 0 \\ t & 1 \end{bmatrix}$$ where $t$ comes from some ring. Then verify $x(s) \cdot x(t) = x(s+t)$, $y(s) \cdot y(t) = y(s+t)$ and $$w = x(1)\cdot y(-1)\cdot x(1) = \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}$$ and finally if $t$ is an invertible element of the ring, then $$h(t) =x(-1)\cdot y(1) \cdot x(-1) \cdot x(1/t)\cdot y(-t) \cdot x(1/t) = \begin{bmatrix} t & 0 \\0 & t^{-1} \end{bmatrix}$$
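These identities are easy to check numerically. The following sketch verifies them over $F_p$ for $p = 7$ (any prime works), using the same symbols $x$, $y$, $w$, $h$ as above:

```python
# Verify x(s)x(t) = x(s+t), the form of w, and h(t) = diag(t, 1/t) mod p.
p = 7

def mul(A, B):
    """Multiply two 2x2 matrices, reducing entries mod p."""
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2))
        for i in range(2)
    )

def prod(*ms):
    out = ((1, 0), (0, 1))
    for m in ms:
        out = mul(out, m)
    return out

def x(t):
    return ((1, t % p), (0, 1))

def y(t):
    return ((1, 0), (t % p, 1))

# x(s) x(t) = x(s + t), and likewise for y
assert prod(x(2), x(3)) == x(5)
assert prod(y(4), y(6)) == y(10)

# w = x(1) y(-1) x(1) = [[0, 1], [-1, 0]]
w = prod(x(1), y(-1), x(1))
assert w == ((0, 1), ((-1) % p, 0))

# h(t) = x(-1) y(1) x(-1) x(1/t) y(-t) x(1/t) = diag(t, 1/t), t invertible
t = 3
t_inv = pow(t, -1, p)   # modular inverse (Python 3.8+)
h = prod(x(-1), y(1), x(-1), x(t_inv), y(-t), x(t_inv))
assert h == ((t, 0), (0, t_inv))
print("all identities hold mod", p)
```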

We now use Gaussian elimination to write an arbitrary matrix of determinant 1 as $$BwU = \left\{ x(rs)\cdot h(-s)\cdot w \cdot x(ts) = \begin{bmatrix} r & (rt-1)\cdot s \\ s^{-1} & t \end{bmatrix} : r,s,t \in K, s\in K^\times \right\}$$ or $$B = \left\{ x(rs)\cdot h(r) = \begin{bmatrix} r & s \\ 0 & r^{-1} \end{bmatrix} : r,s \in K, r \in K^\times \right\}$$
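The two cell formulas can likewise be checked exhaustively over a small field. This sketch reruns them for every choice of $r, s, t$ mod $p = 7$ (redefining the same $x$, $y$, $w$, $h$ as in the answer, so it is self-contained):

```python
# Check the big cell x(rs) h(-s) w x(ts) = [[r, (rt-1)s], [1/s, t]]
# and the Borel cell x(rs) h(r) = [[r, s], [0, 1/r]] over F_7.
p = 7

def mul(A, B):
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2))
        for i in range(2)
    )

def prod(*ms):
    out = ((1, 0), (0, 1))
    for m in ms:
        out = mul(out, m)
    return out

def x(t):
    return ((1, t % p), (0, 1))

def y(t):
    return ((1, 0), (t % p, 1))

w = prod(x(1), y(-1), x(1))

def h(t):
    t_inv = pow(t, -1, p)
    return prod(x(-1), y(1), x(-1), x(t_inv), y(-t), x(t_inv))

# Big cell BwU: s must be invertible
for r in range(p):
    for t in range(p):
        for s in range(1, p):
            lhs = prod(x(r * s), h(-s), w, x(t * s))
            rhs = ((r, (r * t - 1) * s % p), (pow(s, -1, p), t))
            assert lhs == rhs

# Borel cell B: r must be invertible
for s in range(p):
    for r in range(1, p):
        assert prod(x(r * s), h(r)) == ((r, s), (0, pow(r, -1, p)))

print("both Bruhat cells check out mod", p)
```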

The case split based on the zero-pattern (bottom-left entry zero or not) is a common feature of the Steinberg generators; the sets $BwU$ and $B$ are the double cosets in the Bruhat decomposition, also called Schubert cells. The elements $w$ (there is more than one of them for larger matrices) switch patterns. The various $x$ (again, more than one of them for larger matrices) take care of the strictly upper-triangular part, and the $h$ take care of the diagonal.

If the ring has non-invertible, non-zero elements then more care is needed, but in your case this is enough:

Since the ring is generated additively by $1$, we get $x(t) = x(1)^t$ and $y(t)=y(1)^t$. Since every ring element is either 0 or invertible, the two patterns given handle all matrices of determinant 1.
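The last remark can also be spot-checked over $F_3$: the power identity $x(t) = x(1)^t$, and the fact that every determinant-1 matrix falls into one of the two zero-patterns (bottom-left entry zero, or nonzero and hence invertible in a field). A brute-force sketch:

```python
# Over F_3: check x(t) = x(1)^t, and that every determinant-1 matrix
# has bottom-left entry either 0 (Borel cell) or invertible (big cell).
from itertools import product

p = 3

def mul(A, B):
    return tuple(
        tuple(sum(A[i][k] * B[k][j] for k in range(2)) % p for j in range(2))
        for i in range(2)
    )

def x(t):
    return ((1, t % p), (0, 1))

# x(t) = x(1)^t, since 1 generates F_p additively
for t in range(p):
    power = ((1, 0), (0, 1))
    for _ in range(t):
        power = mul(power, x(1))
    assert power == x(t)

# List all determinant-1 matrices and split them by the zero-pattern.
sl2 = [
    ((a, b), (c, d))
    for a, b, c, d in product(range(p), repeat=4)
    if (a * d - b * c) % p == 1
]
borel = [M for M in sl2 if M[1][0] == 0]
big = [M for M in sl2 if M[1][0] != 0]
print(len(sl2), len(borel), len(big))
```

Over $F_3$ this prints the expected counts: 24 elements in total, exhausted by the two cells.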

  • Thanks so much for your answer. But there are some things in it I don't get. Maybe I need to read more to understand them, for example the PLU decomposition from Gaussian elimination. Thanks anyway :-) Commented May 6, 2013 at 17:28
