Conclusion
An approximation whose error appears to either approach a constant or at least grow sub-logarithmically for sufficiently large $k$ is:
$$ \begin{align} \mathrm{entropy}(k) & \approx -f(k) \log _2\left(\frac{f(k)}{2}\right) \sum _i^{\frac{k}{2}} \phi (2 i-1) + C\\ \mathrm{with}\\\\ f(k) & = \frac{\pi ^2 k}{2(k-3) (k-1)^2} \\ C & \approx \log _2\left(\frac{\pi }{6}\right) \end{align} $$ where $\phi(i)$ is Euler's totient function. The value of $C$ is just an observational guess; I really wouldn't put a lot of confidence in its exact value.
This arises from finding the average values of the Differences of S1 and S2 as functions of $k$, and observing that Length@F[k] == 1 + Sum[EulerPhi[i], {i, k}].
We can also use the asymptotic growth of $\sum_i^k \phi(i)$ to further simplify: $$ \begin{align} \sum_i^{k/2} \phi(i) & \approx 3 \frac{k^2}{4\pi^2} - 1\\ \frac{\sum _i^{\frac{k}{2}} \phi (2 i-1)}{\sum _i^{\frac{k}{2}} \phi (i)} & \approx \frac{8}{3} \\ \sum _i^{\frac{k}{2}} \phi (2 i-1) & \approx \frac{2 k^2}{\pi ^2}-\frac{8}{3} \end{align} $$ to get a simpler approximation:
$$ \mathrm{entropy}(k) \approx -\left(\frac{2 k^2}{\pi ^2}-\frac{8}{3}\right) f(k) \log _2\left(\frac{f(k)}{2}\right)+C $$
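As a quick numerical sanity check of the totient-sum asymptotics used here (the first ratio should tend to $\frac{8}{3}$ and the second to 1):

(* ratio of the odd-argument totient sum to the full sum; should tend to 8/3 *)
Table[Sum[EulerPhi[2 i - 1], {i, k/2}]/Sum[EulerPhi[i], {i, k/2}], {k, {100, 1000, 10000}}] // N

(* ratio of the odd-argument totient sum to its asymptotic form; should tend to 1 *)
Table[Sum[EulerPhi[2 i - 1], {i, k/2}]/(2 k^2/Pi^2 - 8/3), {k, {100, 1000, 10000}}] // N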
And using $f(k) \approx \frac{\pi ^2}{2 k^2}$ (see @Arbuja's comment below):
$$ \mathrm{entropy}(k) \approx \left(-1 + \frac{4 \pi ^2}{3 k^2}\right) \log _2\left(\frac{\pi ^2}{4 k^2}\right) + C $$
Which can be further simplified to see the asymptotic limit:
$$ \begin{align} \mathrm{entropy}(k) &\approx \left(-1 + \frac{4 \pi ^2}{3 k^2}\right) \log _2\left(\frac{\pi ^2}{4 k^2}\right) + C \\\\ \mathrm{entropy}(k) &\approx \left(\frac{4 \pi ^2}{3 k^2}-1\right) \log _2\left(\left(\frac{\pi }{2 k}\right)^2\right) +C\\\\ \mathrm{entropy}(k) &\approx 2 \left(\frac{4 \pi ^2}{3 k^2}-1\right) \left(-\log _2(k)-1+\log _2(\pi )\right)+C \\\\ \mathrm{entropy}(k) &\approx 2 \log _2(k) + \frac{8 \pi ^2}{3 k^2} \left(-\log _2(k)-1+\log _2(\pi )\right) + C+2-2 \log _2(\pi )\\\\ \mathrm{entropy}(k) &\approx 2 \log _2(k) + K \\\\ K &= C+2-2 \log _2(\pi ) = 1 - \log_2(3\pi) \end{align} $$
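A minimal numerical check of this limit (redefining $f$ from above so the snippet is self-contained; the difference should shrink towards 0 as $k$ grows, though only slowly):

(* f and C as above; check that the simplified approximation approaches 2 Log2[k] + 1 - Log2[3 Pi] *)
f[k_] := (Pi^2 k)/(2 (k - 3) (k - 1)^2);
approxSimple[k_] := -(2 k^2/Pi^2 - 8/3) f[k] Log2[f[k]/2] + Log2[Pi/6];

Table[approxSimple[10^j] - (2 Log2[10^j] + 1 - Log2[3 Pi]), {j, 2, 5}] // N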
Here is a comparison plot of the approximation against the true entropy (using @azerbajdzan's newentropy for faster comparison, with $RecursionLimit = 5000):
f[k_] := (k \[Pi]^2)/(2 (k - 3) (k - 1)^2)

entropyApprox[k_] := -f[k] (Log[2, f[k]/2]) Sum[EulerPhi[2 i - 1], {i, k/2}] + Log2[Pi/6]

entropyVeryApprox[k_] := -f[k] (Log[2, f[k]/2]) (-(8/3) + (2 k^2)/\[Pi]^2) + Log2[Pi/6];

testValuesK = Join[Range[2000], Range[3000, 5000, 500]];

plot = DiscretePlot[{newentropy[k], entropyApprox[k]}, {k, testValuesK},
  Frame -> True, FrameLabel -> {"k", "entropy(k)"},
  LabelStyle -> Directive[Bold, Medium], Filling -> None, Joined -> True,
  PlotLegends -> {"entropy(k)", "entropyApprox(k)"}]

And we see the error is either approaching 0 or growing extremely slowly (note the $k$ step size changes from 1 to 500 past $k = 2000$, due to how long newentropy takes to calculate for large $k$):
errsAtK = ({#, newentropy[#] - entropyApprox[#]}) & /@ testValuesK;

ListLinePlot[errsAtK, PlotRange -> All, Frame -> True,
 FrameLabel -> {"k", "error"}, LabelStyle -> Directive[Bold, Medium]]

Zooming in on the error just for $k \geq 1000$, we see the error appears to be settling around 0 (notice it goes up and then back down towards the end):

So although $C$ might not be exactly correct, the error does appear to be staying somewhat constant and not growing.
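As a rough numerical summary of that tail (reusing errsAtK from above; just a sketch, the exact values aren't the point):

(* mean and spread of the error over the tail k >= 1000 *)
tailErrs = N[Select[errsAtK, First[#] >= 1000 &][[All, 2]]];
{Mean[tailErrs], StandardDeviation[tailErrs]}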
Also note that the approximation using the asymptotic limit of the sum of $\phi(i)$ tends towards the other approximation:
DiscretePlot[{entropyApprox[k] - entropyVeryApprox[k]}, {k, 5000}]

Walkthrough
Note: all analysis is for even $k \geq 4$; this made things easier. This is fine since, if you zoom out enough on entropy[k], it appears smooth.
Means of the differences of S1[k], S2[k]
Notice that the minimum value of Differences@S2[k] is
$$ \frac{1}{4 \left(\frac{k}{2}-1\right)^2-1} $$
min[k_] := 1/(4*(k/2 - 1)^2 - 1)

And @@ Table[Min@Differences@Sort@S2[k] == min[k], {k, 4, 100, 2}]
(*True*)
Next notice that the mean of the differences of S2[k] divided by min[k] approaches a constant, and it appears to be exactly $\frac{\pi^2}{2}$:
Table[Mean@Differences@Sort@S2[2^k]/(min[2^k]*Pi^2/2), {k, 2, 12}] // N
(*{0.202642, 0.545576, 0.806434, 0.855284, 0.943945, 0.966921, 0.988094, 0.990977, 0.996279, 0.997858, 0.999265}*)
So the mean of the differences of S2[k] for large k is
$$ \frac{\pi^2}{2} * \frac{1}{4 \left(\frac{k}{2}-1\right)^2-1} $$
Also observe that for large $k$, the mean of the differences of S1[k] is twice the mean of the differences of S2[k]:
Table[Mean@Differences@Sort@S1[2^k]/Mean@Differences@Sort@S2[2^k], {k, 2, 12}] // N
(*{0.75, 1.21875, 1.42917, 1.81534, 1.84152, 1.9333, 1.95468, 1.98692, 1.99061, 1.99604, 1.99716}*)
So the mean of the differences of S1[k] for large k is
$$ \frac{\pi^2}{4 \left(\frac{k}{2}-1\right)^2-1} $$
min[k_] := 1/(4*(k/2 - 1)^2 - 1)
meanDiffS1[k_] := Pi^2*min[k]
meanDiffS2[k_] := meanDiffS1[k]/2
Lengths of S1[k], S2[k]
It can be shown that the length of S1[k] is
$$ \mathrm{length ~of~ S1[k]} = \sum _i^{\frac{k}{2}} \phi (2 i) $$
While the length of S2[k] is the same sum but over the odds plus 1: $$ \mathrm{length ~of~ S2[k]} = 1 + \sum _i^{\frac{k}{2}} \phi (2 i -1) $$
Table[Length@S1[k] - Sum[EulerPhi[2 i], {i, k/2}], {k, 4, 100, 2}] // DeleteDuplicates
(*{0}*)

Table[Length@S2[k] - (1 + Sum[EulerPhi[2 i - 1], {i, k/2}]), {k, 4, 100, 2}] // DeleteDuplicates
(*{0}*)
And the Differences of S1 and S2 will just have this length minus 1:
lenDiffS1[k_] := Sum[EulerPhi[2 i], {i, k/2}] - 1
lenDiffS2[k_] := Sum[EulerPhi[2 i - 1], {i, k/2}]
Total of P1[k]
Now observe that the Total of Differences@S1[k] is:
$$ \frac{k-2}{k} $$
totS1[k_] := (k - 2)/k

Table[Total@Differences@S1[k] - totS1[k], {k, 4, 100, 2}] // DeleteDuplicates
(*{0}*)
While the total of Differences@S2[k] is always 1:
Table[Total@Differences@S2[k] - 1, {k, 4, 100, 2}] // DeleteDuplicates (*{0}*)
meaning that the total of P1[k] is $1 + \frac{k-2}{k}$:
totS1[k_] := (k - 2)/k
totP[k_] := 1 + totS1[k]

Table[Total@P1[k] - totP[k], {k, 4, 100, 2}] // DeleteDuplicates
(*{0}*)
Adding together entropy contributions of S1[k] and S2[k]
Since we know the means of the differences of S1 and S2, their respective lengths, and the total of P1, we can calculate rough mean contributions of S1 and S2 to the entropy (very roughly; I would much prefer to find the mean of each element's entropy -U1[k]*Log2[U1[k]], but I couldn't find any regular behavior there):
contribS1[k_] := lenDiffS1[k]*(-(meanDiffS1[k]/totP[k])*Log2[meanDiffS1[k]/totP[k]])
contribS2[k_] := lenDiffS2[k]*(-(meanDiffS2[k]/totP[k])*Log2[meanDiffS2[k]/totP[k]])

(* the total is taken here as twice the S2 contribution; this is what the simplified formula below comes from *)
totEntropy[k_] := contribS2[k] + contribS2[k]
And simplifying:
Simplify[totEntropy[k], k \[Element] PositiveIntegers]
(*-1/2*(k*Pi^2*Log[(k*Pi^2)/(4*(-3 + k)*(-1 + k)^2)]*Sum[EulerPhi[-1 + 2*i], {i, k/2}])/((-3 + k)*(-1 + k)^2*Log[2])*)
gives us:
$$ \mathrm{entropy}(k) \approx -\frac{\pi ^2 k \log \left(\frac{\pi ^2 k}{4 (k-3) (k-1)^2}\right) \sum _i^{\frac{k}{2}} \phi (2 i-1)}{2 (k-3) (k-1)^2 \log (2)} $$
This appears to be off from the true entropy by a roughly constant value that might be around $\log_2{\frac{\pi}{6}}$, although this is really just a guess.
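One rough way to eyeball that offset (assuming newentropy from @azerbajdzan's answer is loaded, as in the plots above) is to tabulate the residual and compare it with $\log_2\frac{\pi}{6} \approx -0.93$:

(* residual between the true entropy and the un-shifted formula above *)
Table[newentropy[k] - totEntropy[k], {k, 200, 1000, 200}] // N

N[Log2[Pi/6]]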
Ways to Improve
Describing the Distribution of $S_i[k]$
Note that using the means of the differences of S1 and S2 is kind of a sin, because in the end everything gets transformed by -U1[k]*Log2[U1[k]]. The differences of S1[k] and S2[k] are also not tightly distributed about their means:
frameLabels = {"\!\(\*FractionBox[\(differences\), \(mean\)]\)", "prob."};

s1Hist = Histogram[Differences@S1[2^12]/meanDiffS1[2^12], "FreedmanDiaconis", "Probability",
  Frame -> True, FrameLabel -> frameLabels, LabelStyle -> Directive[Bold, Medium]]

s2Hist = Histogram[Differences@S2[2^12]/meanDiffS2[2^12], "FreedmanDiaconis", "Probability",
  Frame -> True, FrameLabel -> frameLabels, LabelStyle -> Directive[Bold, Medium]]


The shapes of the distributions of S1 and S2 appear identical, and for sufficiently large $k$ the shape appears to settle on some fixed limiting distribution. If someone can describe this distribution better than just using its mean, this could improve the estimates of entropy[k].
Describing the mean of e[k] := -U1[k]*Log2[U1[k]]
An even better improvement would be to describe the mean of the individual entropies in the list -U1[k]*Log2[U1[k]]. This would exactly describe entropy[k], since it is just the sum of the individual entropies.
I originally hoped to do this, but I could find no regular pattern in how the mean of -U1[k]*Log2[U1[k]] behaves:
e[k_] := e[k] = -U1[k]*Log2[U1[k]]

Table[Mean[e[2^k]]/(-min[2^k]*Log2[min[2^k]]), {k, 2, 13}] // N
(*{0.873814, 1.37538, 1.95718, 2.26098, 2.5353, 2.6853, 2.79824, 2.8629, 2.91559, 2.9536, 2.98467, 3.00889}*)
I could guess that Mean[e[k]] looks something like -x * min[k]*Log2[x*min[k]] or -x * min[k]*Log2[min[k]], but the ratio isn't really slowing down enough for me to confidently say that Mean[e[k]] is related to the entropy of min[k] like this.
If someone can find an approximate relationship for the mean of the entropy: $$ \mathrm{Mean(e(k))} \approx g(k) $$
Then we can probably get a much better approximation since we know the length of e[k] to be
$$ \mathrm{Length~of~e(k)} = \sum _i^{k} \phi (i)-1 $$ so $$ \mathrm{entropy}(k) \approx g(k) \left(\sum _i^{k} \phi (i)-1\right) $$
And using the asymptotic limit of $\sum _i^{k} \phi (i)$:
$$ \mathrm{entropy}(k) \approx g(k) \left(\frac{3 k^2}{\pi^2} - 2\right) $$
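Just to make the shape of such an approximation concrete, here is a purely illustrative sketch: entropyFromG plugs any candidate g into the length-based estimate above, and gGuess is only a hypothetical placeholder motivated by the roughly-3 ratio in the table above, not a claim about the true g:

(* template: plug any candidate mean-entropy function g into the length-based estimate *)
entropyFromG[g_, k_] := g[k] (3 k^2/Pi^2 - 2)

(* hypothetical placeholder only, based on the ~3 ratio to -min[k] Log2[min[k]] observed above *)
min[k_] := 1/(4*(k/2 - 1)^2 - 1);
gGuess[k_] := -3 min[k] Log2[min[k]];

entropyFromG[gGuess, 2^10] // N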
It may be helpful to note that e[k] also has a distribution whose shape appears constant for large $k$, but I don't know how to describe this distribution either:
eHist = Histogram[e[2^12], "FreedmanDiaconis", "Probability", Frame -> True,
  FrameLabel -> {"individual entropy values", "prob."},
  LabelStyle -> Directive[Bold, Medium]]

Additional comment for further exploration
Notice that S1 and S2 can be defined recursively by finding the coprimes of k in Range[k]:
coprimes[k_] := Select[Range@k, CoprimeQ[#, k] &]
newS1Elems[k_] := newS1Elems[k] = If[EvenQ[k], coprimes[k]/k, {}]
newS2Elems[k_] := newS2Elems[k] = If[OddQ[k], coprimes[k]/k, {}]
And then initialize S1 and S2 and Join the new elements:
S1New[1] = {};
S1New[k_] := S1New[k] = Join[S1New[k - 1], newS1Elems[k]] // Sort

S2New[1] = {0, 1};
S2New[k_] := S2New[k] = Join[S2New[k - 1], newS2Elems[k]] // Sort
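As a quick spot-check (assuming S1 and S2 from the question are defined; the comparison is up to sorting):

(* check that the recursive construction reproduces the original sets for even k >= 4 *)
And @@ Table[S1New[k] == Sort@S1[k] && S2New[k] == Sort@S2[k], {k, 4, 60, 2}]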
I don't think this will speed up calculating entropy, but I'm wondering if thinking about it this way will bring any new insights.