
Suppose we have the following code:

Clear["Global`*"]
F[r_] := F[r] = Sort[N[Sqrt[Range[0, r!]/(r!)]]]
Dist[r_] := Dist[r] = Differences[F[r]]
normalize[r_] := normalize[r] = Dist[r]/Total[Dist[r]]
entropy[r_] := entropy[r] = Simplify[Total[-normalize[r] Log2[normalize[r]]]]
k = 10
T = Table[{r, entropy[r]}, {r, 1, k}]

The output is:

{{1, 0.}, {2, 0.872429}, {3, 2.32478}, {4, 4.23358}, {5, 6.50504}, {6, 9.06583}, {7, 11.8628}, {8, 14.8587}, {9, 18.0272}, {10, 21.3486}}

I guessed that the asymptotic expansion of this table of values is $\log_{2}(r!)-.5$.

Computing the error gives the following:

error = Table[{r, N[(Log2[r!] - .5) - entropy[r]]}, {r, 1, k}]

{{1, -0.5}, {2, -0.372429}, {3, -0.239813}, {4, -0.14862}, {5, -0.0981457}, {6, -0.0739782}, {7, -0.0636069}, {8, -0.059533}, {9, -0.0580476}, {10, -0.0575398}}

However, the error seems to level off before it reaches zero.

Question: What is the asymptotic expansion of T as $k\to\infty$?

(I don't need a detailed explanation. I just need to confirm whether I'm correct, or to find an accurate asymptotic expansion whose error approaches zero.)

Edit: Using NonlinearModelFit, I assume:

$$\log_{2}(r!)-.5-3/r\le \text{entropy}(r)\le\log_{2}(r!)-.5$$

Hence, since $3/r\to 0$,

$$\text{entropy}(r)\sim\log_{2}(r!)-.5$$
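For reference, a fit of this kind can be reproduced with something like the following (a minimal sketch; it assumes entropy and k as defined above, and the model form a/r is a guess at what was fitted):

(* hypothetical reconstruction of the fit of the error term *)
fit = NonlinearModelFit[Table[{r, N[(Log2[r!] - .5) - entropy[r]]}, {r, 2, k}], a/r, {a}, r];
fit["BestFitParameters"]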


3 Answers


I will try not to make this too detailed.

A good simple approximation is $$ \text{entropy}(r) \approx \log_2{r!} - \log_2(e) + 1 $$

entropyApprox[r_] := Log2[r!] - Log2[E] + 1
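A quick check (my addition) against the entropy values quoted in the question suggests the error is already small by $r = 10$:

(* compare entropyApprox with the table from the question *)
tableFromQuestion = {0., 0.872429, 2.32478, 4.23358, 6.50504, 9.06583, 11.8628, 14.8587, 18.0272, 21.3486};
Table[entropyApprox[r] - tableFromQuestion[[r]], {r, 1, 10}] // N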

While a more complex approximation that does very well is $$ \begin{align} \text{entropy}(r) & \approx \frac{\zeta ^{(1,0)}\left(\frac{1}{2},r!+1\right)-\log (4) \zeta \left(\frac{1}{2},r!+1\right)+2 \sqrt{r!} \log (r!)+C}{4 \log (2) \sqrt{r!}}\\ &\text{with}\\ C & = -\frac{1}{4} \zeta \left(\frac{1}{2}\right) \left(\pi +2 \gamma +\log \left(\frac{\pi ^2}{4}\right)\right)-\frac{1}{2} \left(3 \sqrt{2}+4\right) \log (2)+4 \left(\sqrt{2}-1\right) \sinh ^{-1}(1) \end{align} $$

Here $\zeta(s)$ is the Riemann zeta function, $\zeta(s,a)$ is the Hurwitz zeta function, and $\zeta^{(1,0)}(s,a)$ is its derivative with respect to the first argument.

constK = 1/4 (-16 ArcSinh[1] + 16 Sqrt[2] ArcSinh[1] - 8 Log[2] - 6 Sqrt[2] Log[2] - 2 EulerGamma Zeta[1/2] - \[Pi] Zeta[1/2] + 2 Log[2] Zeta[1/2] - 2 Log[\[Pi]] Zeta[1/2]);
entropyAsymptotic2[r_] := 1/(4 Log[2] Sqrt[r!]) (constK - HurwitzZeta[1/2, 1 + r!] Log[4] + 2 Sqrt[r!] Log[r!] + Derivative[1, 0][Zeta][1/2, 1 + r!])
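A numeric spot check for small $r$ (my addition) should closely reproduce the values in the question, roughly {2.32478, 4.23358, 6.50504, 9.06583}:

(* spot check of the complex approximation at r = 3, ..., 6 *)
N[entropyAsymptotic2[#]] & /@ Range[3, 6]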

Observe that we can rewrite Dist as the successive differences of square roots, divided by the square root of $r!$:

Dist[r_] := Dist[r] = 1/Sqrt[r!] (Sqrt[#] - Sqrt[# - 1]) & /@ Range[r!] 
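As a quick sanity check (my addition), this agrees numerically with the original Differences-based definition:

(* maximum deviation from the question's Dist for r = 4; should be machine-precision zero *)
Max[Abs[N[Dist[4]] - Differences[N[Sqrt[Range[0, 4!]/4!]]]]]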

Also, Dist always sums to 1 (the sum telescopes), so normalization is not necessary:

1/Sqrt[r!] Sum[Sqrt[k] - Sqrt[k - 1], {k, r!}]
(* 1 *)

We can therefore write entropy as

entropy[r_] := entropy[r] = -(1/Sqrt[r!]) Sum[(Sqrt[k] - Sqrt[k - 1]) Log2[(Sqrt[k] - Sqrt[k - 1])/Sqrt[r!]], {k, r!}]

Observe that we can expand the logarithm of the quotient, since its arguments are positive, producing

ClearAll[entropy]
entropy[r_] := entropy[r] = 1/2 Log2[r!] - 1/Sqrt[r!] Sum[((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2], {k, r!}]

$$ \text{entropy}(r)=\frac{1}{2} \log _2(r!)-\frac{1}{\sqrt{r!}}\sum _{k=1}^{r!} \frac{\left(\sqrt{k}-\sqrt{k-1}\right) \log \left(\sqrt{k}-\sqrt{k-1}\right)}{\log (2)} $$

We can get the series expansion for the summand as the summation variable $k$ grows

summand[k_] := ((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2];
asymSummand[k_] := Evaluate@Assuming[k > 0, Series[summand[k], {k, Infinity, 1}]] // Normal // Simplify;
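To see how well the expansion tracks the exact summand, we can compare the two numerically (a quick check I added; the expansion is computed once symbolically in k and then evaluated at a few points):

(* exact summand vs. its asymptotic expansion; note the mismatch at k = 2 *)
approx = asymSummand[k];
Table[{j, summand[j], approx /. k -> j} // N, {j, {2, 3, 10, 100}}]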

This captures the behavior of the summand well for $k \geq 3$, but not the first nonzero term at $k = 2$, so I sum the asymptotic summand starting at $k = 3$ and add the $k = 2$ term separately, using the original summand:

(* the term at k = 1 is 0; k = 2 is the first nonzero term and is not captured well by the expansion, so we sum starting at k = 3 and add the k = 2 term *)
asymSum[kMax_] := Sum[asymSummand[k], {k, 3, kMax}] + summand[2] // Evaluate
(* ((-1+Sqrt[2]) Log[-1+Sqrt[2]])/Log[2] + (4 HurwitzZeta[1/2,1+kMax] Log[4] + Sqrt[2] Log[64] + Log[256] + 2 EulerGamma Zeta[1/2] + \[Pi] Zeta[1/2] - 4 Log[4] Zeta[1/2] + Log[64] Zeta[1/2] + 2 Log[\[Pi]] Zeta[1/2] - 4 (Zeta^(1,0))[1/2,1+kMax])/(4 Log[16]) *)
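A quick accuracy check (my addition): the closed form should be close to the direct sum, since the per-term error is small for $k \geq 3$:

(* closed-form asymptotic sum vs. direct summation at kMax = 5! = 120 *)
{N[asymSum[120]], N[Sum[summand[j], {j, 1, 120}]]}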

This means that asymptotically, entropy looks like:

entropyAsymptotic[r_] := FullSimplify[1/2 Log2[r!] - 1/Sqrt[r!] asymSum[r!], r \[Element] PositiveIntegers] // Evaluate
(* (16 (-1+Sqrt[2]) ArcSinh[1] - Sqrt[2] Log[64] - Log[256] - HurwitzZeta[1/2,1+r!] Log[256] + 8 Sqrt[r!] Log[r!] - (2 EulerGamma + \[Pi] + Log[\[Pi]^2/4]) Zeta[1/2] + 4 (Zeta^(1,0))[1/2,1+r!])/(16 Sqrt[r!] Log[2]) *)

This is way too complicated. But if we Expand the above, select the parts that have a constant numerator and $\sqrt{r!}$ in the denominator (these go to 0 asymptotically), and set everything else aside, we are left with the terms that might grow with $r$:

split = PowerExpand /@ (List @@ Expand@entropyAsymptotic[r]);
goesToZero = Select[split, MemberQ[Denominator@#, Sqrt[r!]] && FreeQ[Numerator@#, r] &];
everythingElse = Complement[split, goesToZero]
(* {-(HurwitzZeta[1/2,1+r!]/(2 Sqrt[r!])), Log[r!]/(2 Log[2]), (Zeta^(1,0))[1/2,1+r!]/(4 Sqrt[r!] Log[2])} *)

Plotting these candidate terms, we see that the 2nd and 3rd terms dominate, and also that the 2nd term approximates the 3rd:

Plot[Evaluate@everythingElse, {r, 0, 1000}, PlotLegends -> "Expressions", PlotRange -> All] 

[Plot: the three candidate terms for r up to 1000]

And the first term, which doesn't grow as fast, approaches 1:

Limit[First@everythingElse, r -> Infinity]
(* 1 *)

Note that for large $r$ the difference between the 2nd and 3rd terms approaches $\log_2(e) \approx 1.44$:

Plot[(everythingElse[[2]] - everythingElse[[3]]) - Log2[E], {r, 200, 1000}, PlotRange -> All, Frame -> True, FrameLabel -> {"r", "difference"}, PlotLegends -> (everythingElse[[2]] - everythingElse[[3]]) - HoldForm[Log2[E]]]

[Plot: the difference between the 2nd and 3rd terms minus Log2[E], approaching 0 as r grows]

This means that

$$ \text{entropy}(r) \approx \log_2{r!} - \log_2(e) + 1 $$

entropyApprox[r_] := Log2[r!] - Log2[E] + 1

Notice (for what it's worth) that we can also define entropy recursively, so that the sum only runs from $(r-1)!+1$ to $r!$ instead of starting at 1 each time:

ClearAll[entropyRecursive]
entropyRecursive[1] = 0;
entropyRecursive[r_] := entropyRecursive[r] = 1/Sqrt[r] (entropyRecursive[r - 1] - 1/2 Log2[(r - 1)!]) - 1/Sqrt[r!] Sum[((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2], {k, (r - 1)! + 1, r!}] + 1/2 Log2[r!]

(Small add-on note: entropyRecursive stores exact values, which can get huge in terms of LeafCount. If you want approximate values, change the initial term to entropyRecursive[1] = 0. (with the decimal point). You can also replace Sum with NSum for much faster computation, but you will need to Check each output or something similar, as eventually NSum will run into trouble.)
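A sketch of that NSum variant with a Check guard (my addition; entropyRecursiveN is a hypothetical name):

(* numeric recursion; Check returns $Failed if NSum emits messages *)
entropyRecursiveN[1] = 0.;
entropyRecursiveN[r_] := entropyRecursiveN[r] = 1/Sqrt[r] (entropyRecursiveN[r - 1] - 1/2 Log2[(r - 1)!]) + Check[-1/Sqrt[r!] NSum[((-Sqrt[-1 + k] + Sqrt[k]) Log[-Sqrt[-1 + k] + Sqrt[k]])/Log[2], {k, (r - 1)! + 1, r!}], $Failed] + 1/2 Log2[r!]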

Now we can compare entropy, entropyAsymptotic, and entropyApprox a little more quickly (though it is still slow):

compare = DiscretePlot[{entropyRecursive[r], entropyAsymptotic[r], entropyApprox[r]}, {r, 9}, Joined -> True, Filling -> None, Frame -> True, FrameLabel -> {"r", "entropy(r)"}, PlotLegends -> "Expressions"]
compareError = DiscretePlot[{entropyAsymptotic[r] - entropyRecursive[r], entropyApprox[r] - entropyRecursive[r]}, {r, 9}, Joined -> True, Filling -> None, Frame -> True, FrameLabel -> {"r", "error"}, PlotLegends -> "Expressions", PlotLabel -> "Error"]

[Plot: entropyRecursive, entropyAsymptotic, and entropyApprox for r up to 9]

[Plot: errors of entropyAsymptotic and entropyApprox relative to entropyRecursive]

Comments:

  • (+1) Great answer like this again! (Commented Sep 23, 2024)
  • @A.Kato thank you (Commented Sep 23, 2024)
  • Maybe a simpler path to the same result: replace the sum by an integral, then justify that the difference is o(1) as r -> infinity. (Roughly: the denominator of each term goes to 0 while the numerators change more slowly in r.) (Commented Sep 23, 2024)
  • @DanielLichtblau Ah yes, this does work! Especially with IntegrateChangeVariables and t == Sqrt[-1 + k]. I can add it to the answer if you want, but if you want to post it as your own answer that would be cool (since it was your idea). (Commented Sep 23, 2024)
  • I'll post it in a bit. Different methods can be useful to have in answers. (Commented Sep 23, 2024)

After noting that the $i$th element of Dist[r] is $\frac{\sqrt{i}-\sqrt{i-1}}{\sqrt{r!}}$, entropy simplifies to
\begin{equation*}
\text{entropy}(r) = \frac{1}{\sqrt{r!}\ln 2}\sum_{i=1}^{r!} (\sqrt{i-1}-\sqrt{i})\left(\ln(\sqrt{i}-\sqrt{i-1})-\ln\sqrt{r!}\right).
\end{equation*}
Separating into two sums we get
\begin{equation*}
\text{entropy}(r) = \frac{1}{\sqrt{r!}\ln 2}\sum_{i=1}^{r!} (\sqrt{i-1}-\sqrt{i})\ln(\sqrt{i}-\sqrt{i-1}) - \frac{\ln\sqrt{r!}}{\sqrt{r!}\ln 2}\sum_{i=1}^{r!} (\sqrt{i-1}-\sqrt{i}).
\end{equation*}
The second sum telescopes to $-\sqrt{r!}$, giving
\begin{align*}
\text{entropy}(r) &= \frac{1}{\sqrt{r!}\ln 2}\sum_{i=1}^{r!} (\sqrt{i-1}-\sqrt{i})\ln(\sqrt{i}-\sqrt{i-1}) - \frac{\ln\sqrt{r!}}{\sqrt{r!}\ln 2}(-\sqrt{r!}) \\
&=\frac{1}{2}\log_2(r!)-\frac{1}{\sqrt{r!}\ln 2}\sum_{i=1}^{r!} (\sqrt{i}-\sqrt{i-1})\ln(\sqrt{i}-\sqrt{i-1}).
\end{align*}
That's about as far as I can take this, but maybe you can figure out the asymptotic behavior of the remaining sum.
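A quick numeric check of this simplified form (my addition; simplifiedEntropy is a hypothetical helper name) against the table in the question:

(* should reproduce entropy[5] ≈ 6.50504 from the question *)
simplifiedEntropy[r_] := 1/2 Log2[r!] - 1/(Sqrt[r!] Log[2]) Sum[(Sqrt[i] - Sqrt[i - 1]) Log[Sqrt[i] - Sqrt[i - 1]], {i, 1, r!}]
N[simplifiedEntropy[5]]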


Here is a short approach to obtaining the main terms shown in the (more complete) response by @ydd.

The thing to notice is that if we let rsqrt stand for Sqrt[r!], then the computation reduces to the following summation.

Sum[-(Sqrt[i] - Sqrt[i - 1])/rsqrt*(Log2[Sqrt[i] - Sqrt[i - 1]] - Log2[rsqrt]), {i, 1, rsqrt^2}, Assumptions -> rsqrt > 1]

Quick check (compare values to original post):

Table[With[{rsqrt = Sqrt[r!]}, NSum[-(Sqrt[i] - Sqrt[i - 1])/rsqrt*(Log2[Sqrt[i] - Sqrt[i - 1]] - Log2[rsqrt]), {i, 1., rsqrt^2}]], {r, 1, 10}]
(* {0., 0.872429, 2.32478, 4.23358, 6.50504, 9.06583, 11.8628, 14.8587, 18.0272, 21.3486} *)

We can get a reasonable approximation by changing from a sum to an integral and taking a series expansion at infinity to extract the dominant terms. This requires justifying that the error in this approximation also becomes small relative to the dominant terms.

Now for that integral.

ii = Integrate[-(Sqrt[i] - Sqrt[i - 1])/rsqrt*(Log2[Sqrt[i] - Sqrt[i - 1]] - Log2[rsqrt]), {i, 1, rsqrt^2}, Assumptions -> rsqrt > 1000]
(* -1/(9 rsqrt Log[2]) 2 (-2 + 3 rsqrt + 2 Sqrt[-1 + rsqrt^2] + rsqrt^2 Sqrt[-1 + rsqrt^2] + 3 Log[rsqrt] + 3 (-1 + rsqrt^2)^(3/2) Log[rsqrt (rsqrt + Sqrt[-1 + rsqrt^2])] + rsqrt^3 (-1 + 3 Log[1 - Sqrt[-1 + rsqrt^2]/rsqrt])) *)

Let's get the asymptotics.

nn = Assuming[rsqrt > 1000, Expand[FullSimplify[Normal[Series[ii, {rsqrt, Infinity, 2}]]]]]
(* -(1/(4 rsqrt^2)) - 1/Log[2] + 4/(9 rsqrt Log[2]) + Log[512]/(9 Log[2]) + (2 Log[rsqrt])/Log[2] - Log[rsqrt]/(2 rsqrt^2 Log[2]) - (2 Log[rsqrt])/(3 rsqrt Log[2]) *)

Now to trim back and get just the dominant terms. This will reproduce the desired asymptotic result.

nn1 = Expand[FullSimplify[Normal[Series[nn, {rsqrt, Infinity, 0}]]]]
(* 1 - 1/Log[2] + (2 Log[rsqrt])/Log[2] *)
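Substituting back rsqrt -> Sqrt[r!] (my addition) shows this is exactly the simple approximation $\log_2(r!) - \log_2(e) + 1$ from @ydd's answer:

(* 2 Log[Sqrt[r!]]/Log[2] expands to Log2[r!], and 1 - 1/Log[2] is 1 - Log2[E] *)
PowerExpand[nn1 /. rsqrt -> Sqrt[r!]]
(* 1 - 1/Log[2] + Log[r!]/Log[2] *)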

To partly justify the integral approximation we proceed as follows.

First, it is not hard to see that for a given value of r the summands get smaller as i grows (at least for r larger than some modest value; I have not investigated this too closely). So the error in this approximation is bounded by the difference between the left and right Riemann sums. That difference telescopes to the first summand minus the (n+1)st, with n = r!; since both are positive, it is bounded by the first summand.

So now we need to see what this first summand does as r -> infinity. This is simple, because it reduces to Log2[rsqrt]/rsqrt, so the error in the approximation drops at this asymptotic rate. That suffices to justify the leading terms of the asymptotic value for the function at hand. If one looks closely, this also shows that the integral approximation does not give any more correct terms, since the next term in the series expansion is also $O(\log(\sqrt{r!})/\sqrt{r!})$.
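For reference, the reduction of the first summand can be checked directly (my addition):

(* the i = 1 summand: Sqrt[1] - Sqrt[0] is 1, so its Log2 vanishes *)
Simplify[-(Sqrt[1] - Sqrt[0])/rsqrt*(Log2[Sqrt[1] - Sqrt[0]] - Log2[rsqrt])]
(* Log[rsqrt]/(rsqrt Log[2]) *)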

