If I have a random key holding $N$ bits of entropy, I’d expect brute-forcing it to take on the order of $2^N$ trials.
That's the right order of magnitude. For a uniform distribution, the expected number of failed trials is $(2^N-1)/2$. For a non-uniform distribution, it can be lower by a small factor, or higher by a much larger factor. I assume $N$ is the Shannon entropy in bits.
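The uniform case is easy to sanity-check numerically. A minimal Monte Carlo sketch in Python (the function name `avg_failed_trials` is mine): guessing candidates in any fixed order, the number of failed trials equals the key's rank in that order, so its average should approach $(2^N-1)/2$.

```python
import random

def avg_failed_trials(N, samples=100_000):
    total = 0
    for _ in range(samples):
        key = random.randrange(2**N)  # uniform N-bit key
        total += key                  # guessing 0, 1, 2, ... in order,
                                      # exactly `key` guesses fail
    return total / samples

N = 8
print(avg_failed_trials(N), (2**N - 1) / 2)  # both close to 127.5
```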
The method leading to the least expected number of failed trials is to guess keys from most probable to least probable. For a key of $k=4$ binary symbols, where symbol 1 has probability $p$ with $0\le p\le 1/2$ and symbol 0 has probability $1-p$, a suitable order for the $2^k$ candidates is
0000 0001 0010 0100 1000 0011 0101 0110 1001 1010 1100 0111 1011 1101 1110 1111

The expected number of failed trials before hitting the right key is
$$\begin{aligned}
&(1+2+3+4)\,p(1-p)^3\\
{}+{}&(5+6+7+8+9+10)\,p^2(1-p)^2\\
{}+{}&(11+12+13+14)\,p^3(1-p)\\
{}+{}&15\,p^4\\
={}&10p+15p^2-10p^3
\end{aligned}$$
Note: to extend to $0\le p\le 1$, change $p$ to $\min(p,1-p)$.
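For small $k$ this can be verified by direct enumeration. A minimal Python sketch (names are mine) that sorts the $2^k$ candidates most-probable-first (fewest 1 bits first, valid for $p\le 1/2$) and sums rank times probability:

```python
from itertools import product

def expected_failed_trials(k, p):
    # sort candidates by Hamming weight: most probable first when p <= 1/2;
    # ties within a weight class are equiprobable, so their order is irrelevant
    keys = sorted(product((0, 1), repeat=k), key=sum)
    # a key at rank r is preceded by exactly r failed trials
    return sum(rank * p**sum(key) * (1 - p)**(k - sum(key))
               for rank, key in enumerate(keys))

p = 0.1
print(expected_failed_trials(4, p))  # 1.14
print(10*p + 15*p**2 - 10*p**3)      # same, per the closed form above
```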
Plotting that expected number of failed attempts against the entropy-based approximation $(2^N-1)/2$ with Shannon entropy (in bits) $N=-k(p\,\log_2(p)+(1-p)\log_2(1-p))$:

[plot: actual expected failed trials vs. entropy-based approximation, for $k=4$]
Thus for $k=4$ the entropy-based approximation is fine, and always errs by excess. But it turns out that this changes for more bits, e.g. for $k=32$:

[plot: same comparison for $k=32$]
Plotting the ratio actual/approximation for various $k$ (with log scales):

[plot: ratio of actual expectation to entropy-based approximation for various $k$, log scales]
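A hedged sketch (function names are mine, and $0<p\le 1/2$ is assumed) of how these values can be computed exactly for large $k$ without enumerating $2^k$ keys: all $\binom{k}{w}$ keys of Hamming weight $w$ are equiprobable, so group them. With $N_w=\sum_{j\le w}\binom{k}{j}$, the weight-$w$ block occupies ranks $N_{w-1}$ to $N_w-1$, and those ranks (equal to the failed-trial counts) sum to $(2N_{w-1}+\binom{k}{w}-1)\,\binom{k}{w}/2$.

```python
from math import comb, log2

def actual_expected(k, p):
    total, cum = 0.0, 0                         # cum holds N_{w-1}
    for w in range(k + 1):
        c = comb(k, w)
        block_sum = (2 * cum + c - 1) * c // 2  # sum of ranks in the weight-w block
        total += block_sum * p**w * (1 - p)**(k - w)
        cum += c
    return total

def entropy_estimate(k, p):
    N = -k * (p * log2(p) + (1 - p) * log2(1 - p))  # Shannon entropy in bits
    return (2**N - 1) / 2

# ratio actual/approximation; it grows very large for big k and small p
for k, p in [(4, 0.1), (32, 0.1), (128, 1/21)]:
    print(k, p, actual_expected(k, p) / entropy_estimate(k, p))
```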
As apparent in these examples, the actual expected number of failed trials can get arbitrarily higher than the Shannon entropy estimate suggests (e.g. for $k=128$, $p=1/21$, it's $>120{,}000{,}000$ times higher). But whatever the distribution of the keys, it never gets much lower. As pointed out by kodlu answering my reformulation in more mathematical terms on Math-SE, a lower bound of essentially $1/2$ is given in J. L. Massey's Guessing and entropy, in proceedings of ISIT 1994.