Can you help me verify whether this computation of the entropy is correct, and help me understand its meaning?
I am not sure about the result, especially because it is equal to 0: it means that we cannot transmit any bits/s over that channel. Why is that?
Since all transitions are equally likely, the channel output carries no information about the source. Hence the mutual information, and consequently the channel capacity, are zero in this case. You might as well look at the output of a random number generator and guess the source symbols from that.
Expressed in formulas, you have
$$H(Y)=H(Y|X)\tag{1}$$
because $X$ and $Y$ are independent, and, consequently,
$$I(X,Y)=H(Y)-H(Y|X)=0\tag{2}$$
i.e., there is no information in $Y$ about $X$ (which makes sense because the two random variables are independent).
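To see explicitly why $(1)$ holds, expand the conditional entropy and use the independence property $p(y|x)=p(y)$:
$$H(Y|X)=-\sum_x p(x)\sum_y p(y|x)\log_2 p(y|x)=-\sum_y p(y)\log_2 p(y)\underbrace{\sum_x p(x)}_{=1}=H(Y)$$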
Since mutual information is symmetric, i.e., $I(X,Y)=I(Y,X)$, we also have $H(X)-H(X|Y)=0$, i.e.,
$$H(X)=H(X|Y)\tag{3}$$
which means that the uncertainty about the outcome of $X$ is the same whether or not the channel output $Y$ is given.
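As a quick numerical sanity check, here is a minimal Python sketch that computes $I(X,Y)$ for a channel whose transition probabilities are all equal. The binary alphabet, the prior on $X$, and the transition matrix are arbitrary values chosen for illustration:

```python
import numpy as np

# Hypothetical example: binary input X with an arbitrary prior,
# and a channel whose transition rows are all uniform,
# i.e. p(y|x) = 1/|Y| for every x and y.
p_x = np.array([0.3, 0.7])          # P(X): any prior works here
P_y_given_x = np.full((2, 2), 0.5)  # all transitions equally likely

p_xy = p_x[:, None] * P_y_given_x   # joint P(X, Y)
p_y = p_xy.sum(axis=0)              # marginal P(Y)

# I(X,Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
I = np.sum(p_xy * np.log2(p_xy / (p_x[:, None] * p_y)))
print(I)  # 0.0 -- the output carries no information about the input
```

Any prior on $X$ gives the same result, which is why the capacity, i.e., the maximum of $I(X,Y)$ over all input distributions, is also zero for this channel.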