I have been given a 4x4 matrix in which each entry is a single byte (8 bits), together with a second 4x4 matrix that I need to multiply each column of the first matrix by:
$$\begin{pmatrix}2 & 3 & 1 & 1\\ 1 & 2 & 3 & 1\\ 1 & 1 & 2 & 3\\ 3 & 1 & 1 & 2\end{pmatrix}$$
The method that I have been told to use seems confusing. I have searched around the internet and found many examples that work with polynomials, but I'm not supposed to use that method.
All I have been told is:
- Multiplying by 01 does nothing.
- Multiplying by 02 corresponds to multiplying the byte by 10 (because 2 is 10 in binary). This is simply a left shift by one place. If the left-hand bit of the original byte was 1, then XOR the answer with 100011011.
- Multiplying by 03 corresponds to multiplying the byte by 11 (because 3 is 11 in binary). This is a left shift, followed by an XOR with the original byte. Again, if the left-hand bit of the original byte was 1, then XOR the answer with 100011011.
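Here is my attempt at turning those rules into code, so you can see how I'm reading them. Two things here are assumptions on my part, not from the notes I was given: the helper names `mul2`/`mul3` are my own, and I'm guessing that the "additions" when combining the four products for a column are bitwise XORs.

```python
def mul2(b):
    # Multiply by 02: left shift by one place; if the left-hand bit
    # of the original byte was 1, XOR with 100011011 (0x11B).
    result = b << 1
    if b & 0x80:
        result ^= 0b100011011
    return result

def mul3(b):
    # Multiply by 03: left shift, XOR with the original byte, plus
    # the same conditional XOR with 100011011.
    result = (b << 1) ^ b
    if b & 0x80:
        result ^= 0b100011011
    return result

# The matrix I was given.
MATRIX = [
    [2, 3, 1, 1],
    [1, 2, 3, 1],
    [1, 1, 2, 3],
    [3, 1, 1, 2],
]

MUL = {1: lambda b: b, 2: mul2, 3: mul3}

def mix_column(col):
    # One column of four bytes times the matrix, combining the four
    # products with XOR (my assumption about what "add" means here).
    return [MUL[m0](col[0]) ^ MUL[m1](col[1]) ^ MUL[m2](col[2]) ^ MUL[m3](col[3])
            for m0, m1, m2, m3 in MATRIX]
```

For example, `mix_column([0xDB, 0x13, 0x53, 0x45])` gives `[0x8E, 0x4D, 0xA1, 0xBC]` with this code, which matches a worked example I found, but I don't know if I've interpreted the rules correctly.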
I find the XOR step strange as well, because 100011011 is a string of 9 bits while the bytes are only 8.
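One thing I did notice when playing with the numbers: when the left-hand bit is 1, the shift itself produces a 9-bit value, and the ninth bit of 100011011 appears to cancel it, so the result fits back into a byte. Is that the point of the 9-bit string?

```python
b = 0b10000000                  # a byte whose left-hand bit is 1
shifted = b << 1                # 0b100000000: nine bits wide now
reduced = shifted ^ 0b100011011 # the two ninth bits cancel
assert reduced == 0b00011011    # back down to 8 bits
```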
Can anyone make sense of this or point me in the direction of a good example?