I will try not to make this too detailed.
A good simple approximation is $$ \text{entropy}(r) \approx \log_2{r!} - \log_2(e) + 1 $$
entropyApprox[r_] := Log2[r!] - Log2[E] + 1
While a more complex approximation that does very well is $$ \begin{align} \text{entropy}(r) & \approx \frac{\zeta ^{(1,0)}\left(\frac{1}{2},r!+1\right)-\log (4) \zeta \left(\frac{1}{2},r!+1\right)+2 \sqrt{r!} \log (r!)+C}{4 \log (2) \sqrt{r!}}\\ &\text{with}\\ C & = -\frac{1}{4} \zeta \left(\frac{1}{2}\right) \left(\pi +2 \gamma +\log \left(\frac{\pi ^2}{4}\right)\right)-\frac{1}{2} \left(3 \sqrt{2}+4\right) \log (2)+4 \left(\sqrt{2}-1\right) \sinh ^{-1}(1) \end{align} $$
where $\zeta(a)$ is the Riemann zeta function, $\zeta(a,b)$ is the Hurwitz zeta function, and $\zeta^{(1,0)}(a,b)$ is its derivative with respect to the first argument.
constK = 1/4 (-16 ArcSinh[1] + 16 Sqrt[2] ArcSinh[1] - 8 Log[2] - 6 Sqrt[2] Log[2] -
     2 EulerGamma Zeta[1/2] - \[Pi] Zeta[1/2] + 2 Log[2] Zeta[1/2] - 2 Log[\[Pi]] Zeta[1/2]);
entropyAsymptotic2[r_] :=
 1/(4 Log[2] Sqrt[r!]) (constK - HurwitzZeta[1/2, 1 + r!] Log[4] +
    2 Sqrt[r!] Log[r!] + Derivative[1, 0][Zeta][1/2, 1 + r!])
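For a quick numerical sanity check of the two formulas, something like this should work (my choice of range for r here is arbitrary):

(* tabulate the simple and the more complex approximation side by side *)
Table[{r, N@entropyApprox[r], N@entropyAsymptotic2[r]}, {r, 2, 6}] // TableForm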
Observe that we can rewrite Dist as the successive differences of square roots, divided by the square root of $r!$:
Dist[r_] := Dist[r] = 1/Sqrt[r!] (Sqrt[#] - Sqrt[# - 1]) & /@ Range[r!]
Also, Dist always sums to 1, so normalization is not necessary:
1/Sqrt[r!] Sum[(Sqrt[k] - Sqrt[k - 1]), {k, r!}] (*1*)
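To see why, note that the sum telescopes:
$$\sum_{k=1}^{r!}\left(\sqrt{k}-\sqrt{k-1}\right)=\sqrt{r!}-\sqrt{0}=\sqrt{r!}$$
so dividing by $\sqrt{r!}$ gives exactly 1.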
We can therefore write the entropy as
entropy[r_] := entropy[r] = -(1/Sqrt[r!]) Sum[(Sqrt[k] - Sqrt[k - 1]) Log2[(Sqrt[k] - Sqrt[k - 1])/Sqrt[r!]], {k, r!}]
Observe that we can expand the log part, since $\sqrt{k}-\sqrt{k-1}$ and $\sqrt{r!}$ are both positive, producing
ClearAll[entropy]
entropy[r_] := entropy[r] =
  1/2 Log2[r!] -
   1/Sqrt[r!] Sum[((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2], {k, r!}]
$$ \text{entropy}(r)=\frac{1}{2} \log _2(r!)-\frac{1}{\sqrt{r!}}\sum _{k=1}^{r!} \frac{\left(\sqrt{k}-\sqrt{k-1}\right) \log \left(\sqrt{k}-\sqrt{k-1}\right)}{\log (2)} $$
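To spell out the expansion used:
$$\log_2\frac{\sqrt{k}-\sqrt{k-1}}{\sqrt{r!}}=\log_2\left(\sqrt{k}-\sqrt{k-1}\right)-\frac{1}{2}\log_2(r!)$$
and the $\frac{1}{2}\log_2(r!)$ piece comes out of the sum with coefficient 1, again because Dist sums to 1.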
We can get the series expansion of the summand as the summation variable $k$ grows:
summand[k_] := ((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2];
asymSummand[k_] :=
  Evaluate@Assuming[k > 0, Series[summand[k], {k, Infinity, 1}]] // Normal // Simplify;
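For intuition (this step is implicit above), the leading behavior comes from $\sqrt{k}-\sqrt{k-1}=\frac{1}{\sqrt{k}+\sqrt{k-1}}\approx\frac{1}{2\sqrt{k}}$, giving
$$\text{summand}(k)\approx-\frac{\frac{1}{2}\log (k)+\log (2)}{2\sqrt{k}\log (2)}\quad\text{as}\quad k\to\infty$$
which is why Hurwitz zeta terms (from $\sum k^{-1/2}$) and its $s$-derivative (from $\sum k^{-1/2}\log k$) show up in the closed forms below.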
This captures the behavior of the summand well for $k \geq 3$, but not the first non-zero term at $k=2$, so I sum the asymptotic summand starting at $k=3$ and add the $k=2$ term separately in terms of the original summand.
(* the term at k = 1 is 0; k = 2 is the first nonzero term and is not captured
   well by the limit, so we sum starting at k = 3 and add the k = 2 term *)
asymSum[kMax_] := Sum[asymSummand[k], {k, 3, kMax}] + summand[2] // Evaluate
(* ((-1+Sqrt[2]) Log[-1+Sqrt[2]])/Log[2] + (4 HurwitzZeta[1/2,1+kMax] Log[4] +
   Sqrt[2] Log[64] + Log[256] + 2 EulerGamma Zeta[1/2] + \[Pi] Zeta[1/2] -
   4 Log[4] Zeta[1/2] + Log[64] Zeta[1/2] + 2 Log[\[Pi]] Zeta[1/2] -
   4 (Zeta^(1,0))[1/2,1+kMax])/(4 Log[16]) *)
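As a quick sanity check (the cutoff kMax = 1000 here is an arbitrary choice of mine), the closed form should differ from the direct partial sum only by a small residual, since the asymptotic summand is accurate for $k \geq 3$:

(* compare the closed-form asymptotic sum against the direct partial sum *)
With[{kMax = 1000}, N[asymSum[kMax] - Sum[summand[k], {k, 2, kMax}]]]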
This means that asymptotically the entropy looks like:
entropyAsymptotic[r_] :=
 FullSimplify[1/2 Log2[r!] - 1/Sqrt[r!] asymSum[r!],
   r \[Element] PositiveIntegers] // Evaluate
(* (16 (-1+Sqrt[2]) ArcSinh[1] - Sqrt[2] Log[64] - Log[256] -
   HurwitzZeta[1/2,1+r!] Log[256] + 8 Sqrt[r!] Log[r!] -
   (2 EulerGamma + \[Pi] + Log[\[Pi]^2/4]) Zeta[1/2] +
   4 (Zeta^(1,0))[1/2,1+r!])/(16 Sqrt[r!] Log[2]) *)
Which is way too complicated. But notice that if we Expand the above and select the parts that have a constant numerator and $\sqrt{r!}$ in the denominator (these go to 0 asymptotically), setting aside everything else, we can isolate the terms that might grow with $r$:
split = PowerExpand /@ (List @@ Expand@entropyAsymptotic[r]);
goesToZero = Select[split, MemberQ[Denominator@#, Sqrt[r!]] && FreeQ[Numerator@#, r] &];
everythingElse = Complement[split, goesToZero]
(* {-(HurwitzZeta[1/2,1+r!]/(2 Sqrt[r!])), Log[r!]/(2 Log[2]),
   (Zeta^(1,0))[1/2,1+r!]/(4 Sqrt[r!] Log[2])} *)
Plotting these candidate terms, we see that the 2nd and 3rd terms dominate, and also that the 2nd term approximates the 3rd:
Plot[Evaluate@everythingElse, {r, 0, 1000}, PlotLegends -> "Expressions", PlotRange -> All]

And the first term, which doesn't grow as fast, approaches 1:
Limit[First@everythingElse, r -> Infinity] (*1*)
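This follows from the standard leading asymptotic of the Hurwitz zeta function,
$$\zeta\left(\tfrac{1}{2},a\right)=-2\sqrt{a}+O\left(a^{-1/2}\right)\quad\text{as}\quad a\to\infty$$
so $-\zeta\left(\tfrac{1}{2},1+r!\right)/\left(2\sqrt{r!}\right)\to\sqrt{1+r!}/\sqrt{r!}\to 1$.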
Note that for large $r$ the difference between the 2nd and 3rd terms approaches $\log_2(e) \approx 1.44$:
Plot[(everythingElse[[2]] - everythingElse[[3]]) - Log2[E], {r, 200, 1000},
 PlotRange -> All, Frame -> True, FrameLabel -> {"r", "difference"},
 PlotLegends -> (everythingElse[[2]] - everythingElse[[3]]) - HoldForm[Log2[E]]]

Which means that
$$ \text{entropy}(r) \approx \log_2{r!} - \log_2(e) + 1 $$
entropyApprox[r_] := Log2[r!] - Log2[E] + 1
Notice (for what it's worth) that we can also define the entropy recursively, so that the sum only runs from $(r-1)!+1$ to $r!$ instead of starting at 1 each time:
ClearAll[entropyRecursive]
entropyRecursive[1] = 0;
entropyRecursive[r_] := entropyRecursive[r] =
  1/Sqrt[r] (entropyRecursive[r - 1] - 1/2 Log2[(r - 1)!]) -
   1/Sqrt[r!] Sum[((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2],
     {k, (r - 1)! + 1, r!}] + 1/2 Log2[r!]
(Small add-on note: entropyRecursive stores exact values, which can get huge in terms of LeafCount. If you want approximate values, change the initial term to entropyRecursive[1] = 0. (a machine-precision zero). You can also replace Sum with NSum to get really fast computation, but you will need to Check each output or something similar, as eventually NSum will run into trouble.)
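For instance, here is a hedged sketch of that NSum variant (the name entropyRecursiveN and the Check fallback are illustrative choices of mine, not part of the code above):

ClearAll[entropyRecursiveN]
entropyRecursiveN[1] = 0.; (* machine-precision base case *)
entropyRecursiveN[r_] := entropyRecursiveN[r] =
  1/Sqrt[r] (entropyRecursiveN[r - 1] - 1/2 Log2[(r - 1)!]) + 1/2 Log2[r!] -
   1/Sqrt[r!] Check[
     NSum[(Sqrt[k] - Sqrt[k - 1]) Log2[Sqrt[k] - Sqrt[k - 1]], {k, (r - 1)! + 1, r!}],
     $Failed] (* Check flags the result if NSum runs into trouble *)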
And now we can compare entropyRecursive, entropyAsymptotic, and entropyApprox a little more quickly (though it is still slow):
compare = DiscretePlot[{entropyRecursive[r], entropyAsymptotic[r], entropyApprox[r]},
  {r, 9}, Joined -> True, Filling -> None, Frame -> True,
  FrameLabel -> {"r", "entropy(r)"}, PlotLegends -> "Expressions"]
compareError = DiscretePlot[{entropyAsymptotic[r] - entropyRecursive[r],
   entropyApprox[r] - entropyRecursive[r]}, {r, 9}, Joined -> True,
  Filling -> None, Frame -> True, FrameLabel -> {"r", "error"},
  PlotLegends -> "Expressions", PlotLabel -> "Error"]

