Although a good answer was already given and accepted, the following is also relevant. The simple two-level model you used is:
$y_{ij} = b_0+u_j+e_{ij}$
Here, $u_j$ denotes the random influence of "site". If this model indeed completely represents the data generation of your $y$ variable, then we can ask how strongly two randomly chosen $y$ values, say $y_{kj}$ and $y_{mj}$, from the same randomly chosen site $j$ are correlated. The two $y$ values have something in common, namely $b_0$, which is constant (no variance), and $u_j$, which has variance! As a result of the common $u_j$ term (see proof below), the covariance $cov(y_{kj},y_{mj})$ of $y_{kj}$ and $y_{mj}$ is equal to the variance $\sigma_u^2$ of $u_j$.
Now look at the correlation. The linear correlation of two variables, here $y_{kj}$ and $y_{mj}$, is defined as their covariance divided by the product of their two standard deviations. Here, the variance of $y_{kj}$ equals $\sigma_u^2 + \sigma_e^2$, and the same is true for the variance of $y_{mj}$. So, the standard deviations of $y_{kj}$ and $y_{mj}$ are equal, namely $\sqrt{\sigma_u^2 + \sigma_e^2}$. This means that the product of these two standard deviations equals $\sigma_u^2 + \sigma_e^2$. Finally, dividing the covariance by the product of the two standard deviations results in the well-known formula for the ICC:
$ICC=\dfrac{\sigma_u^2}{\sigma_u^2 + \sigma_e^2}$
So, the ICC formula indeed expresses two things: (1) the proportion of variance "explained" by the grouping factor "site", and (2) the correlation between two random $y$ draws from a randomly chosen site.
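This dual interpretation can be checked with a small simulation. The sketch below assumes hypothetical variance components $\sigma_u = 2$ and $\sigma_e = 1$ (so the theoretical ICC is $4/5 = 0.8$) and draws two $y$ values per site; the empirical correlation of the same-site pairs should come out close to the theoretical ICC:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical variance components, chosen only for illustration
sigma_u, sigma_e, b0 = 2.0, 1.0, 10.0
n_sites = 200_000

# One shared random site effect u_j per site, then two draws per site:
# y_kj = b0 + u_j + e_kj  and  y_mj = b0 + u_j + e_mj
u = rng.normal(0.0, sigma_u, n_sites)
y_k = b0 + u + rng.normal(0.0, sigma_e, n_sites)
y_m = b0 + u + rng.normal(0.0, sigma_e, n_sites)

# (1) ICC as a proportion of variance
icc_theory = sigma_u**2 / (sigma_u**2 + sigma_e**2)   # 4 / 5 = 0.8

# (2) ICC as the correlation of two draws from the same site
icc_empirical = np.corrcoef(y_k, y_m)[0, 1]

print(icc_theory, round(icc_empirical, 3))
```

With this many sites the two numbers agree to about two decimal places, which is the point of the derivation: the variance-proportion formula and the same-site correlation are the same quantity.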
Proof that $cov(y_{kj},y_{mj}) = \sigma_u^2$
First note that the expectation of $y_{kj}$ equals $E(y_{kj})=E(b_0+u_j+e_{kj})=b_0$ because $E(u_j)=E(e_{kj})=0$ by assumption.
The covariance of $y_{kj}$ and $y_{mj}$ is by definition:
$cov(y_{kj}, y_{mj}) = E\big[(y_{kj}-E(y_{kj}))(y_{mj}-E(y_{mj}))\big] = E\big[(y_{kj}-b_0)(y_{mj}-b_0)\big]$.
This can be written as:
$E\big[(b_0+u_j+e_{kj}-b_0)(b_0+u_j+e_{mj}-b_0)\big]=E\big[(u_j+e_{kj})(u_j+e_{mj})\big]=$
$E(u_j^2) + E(u_j\times{e_{mj}}) + E(e_{kj}\times{u_j}) + E(e_{kj}\times{e_{mj}})$
The last three terms in the above expression are zero. This is because $u_j$ and the residuals $e_{kj}$ and $e_{mj}$ are assumed to be independent, so it must hold that $E(u_j\times{e_{mj}}) = E(u_j)\times{E(e_{mj})} = 0 \times 0$. Similarly, $E({e_{kj}\times u_j}) = 0 \times 0$. Also, $e_{kj}$ and $e_{mj}$ are assumed to be independent, and hence $E(e_{kj}\times{e_{mj}}) =E(e_{kj})\times{E(e_{mj})}=0 \times 0$.
Finally, the variance $\sigma_u^2$ of $u_j$ is by definition:
$\sigma_u^2 = E\big[(u_j-E(u_j))^2\big] = E\big[(u_j-0)^2\big] = E(u_j^2)$
and this means that
$cov(y_{kj},y_{mj})=\sigma_u^2$
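The identity can also be verified numerically. Using the same hypothetical components as before ($\sigma_u = 2$, $\sigma_e = 1$, so $\sigma_u^2 = 4$), the off-diagonal entry of the sample covariance matrix of same-site pairs should approximate $\sigma_u^2$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical variance components, for illustration only
sigma_u, sigma_e, b0 = 2.0, 1.0, 10.0
n_sites = 200_000

# Two y values per site sharing the same u_j
u = rng.normal(0.0, sigma_u, n_sites)
y_k = b0 + u + rng.normal(0.0, sigma_e, n_sites)
y_m = b0 + u + rng.normal(0.0, sigma_e, n_sites)

# Sample cov(y_kj, y_mj) should be close to sigma_u**2 = 4
cov_km = np.cov(y_k, y_m)[0, 1]
print(round(cov_km, 2))
```

The simulated covariance lands near 4, as the proof predicts: only the shared $u_j$ term survives in the expectation.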