Just some hints.
We can view this as an additive Gaussian noise channel with noise $Z$, bounded input $X$ (independent of $Z$) and output $W = X + Z$. Since $h(W \mid X) = h(Z)$, we have $h(W) = h(Z) + h(X) - h(X|W) = h(Z) + I(X;W)$.
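As a quick sanity check of the identity $h(W) = h(Z) + I(X;W)$, here is a sketch using a Gaussian input $X \sim N(0, P)$ (illustrative only: $X$ is bounded in the actual problem, but the identity holds for any $X$ independent of $Z$, and the Gaussian case has closed forms for every term; $P$ and $N$ below are arbitrary choices):

```python
import numpy as np

# Illustrative input power P and noise variance N (not from the question).
P, N = 2.0, 0.5

h_w = 0.5 * np.log(2 * np.pi * np.e * (P + N))  # h(W), since W ~ N(0, P+N)
h_z = 0.5 * np.log(2 * np.pi * np.e * N)        # h(Z)
mi  = 0.5 * np.log(1 + P / N)                   # I(X;W), Gaussian channel

print(abs(h_w - (h_z + mi)))  # → 0 up to floating-point error
```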
For the first equality, see here. Since $h(Z)$ is fixed, maximizing $h(W)$ is equivalent to finding the pdf for $X$ that maximizes the mutual information $I(X;W)$, i.e., to computing the capacity of this channel. This is not trivial; see, e.g., the introduction of this paper.
To get some bound, I'd try the input distribution that achieves capacity in some regimes: two Dirac deltas at $X=\pm A$ with equal weights. In that case, the pdf of $W$ is a mixture of two Gaussians, and its entropy is studied here (no simple closed-form result, but you might extract a useful bound).
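The mixture entropy is easy to evaluate numerically, which may help when checking any bound you derive. A minimal sketch, assuming unit amplitude $A$ and Gaussian noise with standard deviation $\sigma$ (both illustrative values, not from the question):

```python
import numpy as np

# Sketch: numerically estimate h(W) when X takes the values ±A with equal
# probability and Z ~ N(0, sigma^2), so W is a two-component Gaussian mixture.
# A and sigma are illustrative choices, not values from the question.
A, sigma = 1.0, 0.5

w = np.linspace(-12.0, 12.0, 400_001)
dw = w[1] - w[0]

def gauss(m):
    return np.exp(-(w - m)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

p = 0.5 * gauss(A) + 0.5 * gauss(-A)  # pdf of W

# Differential entropy h(W) = -∫ p log p dw (in nats), via a Riemann sum.
h_w = -np.sum(p * np.log(p)) * dw

# Closed-form h(Z); the bounds h(Z) <= h(W) <= h(Z) + ln 2 must hold,
# since h(W) = h(Z) + I(X;W) and 0 <= I(X;W) <= H(X) = ln 2.
h_z = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_w, h_z)
```

For well-separated components ($A \gg \sigma$) the estimate approaches the upper bound $h(Z) + \ln 2$, which is exactly the $I(X;W) \to H(X)$ limit.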