    randomDensityMatrix[n_] := ConjugateTranspose[#2].(#1 #2) &[
      Normalize[RandomReal[{1, 10}, n], Total],
      RandomVariate[CircularUnitaryMatrixDistribution[n]]
      ]
    n = 100;
    ρ1 = randomDensityMatrix[n];
    ρ2 = randomDensityMatrix[n];
Both of the following should work:
    0.5 Re[Tr[MatrixPower[ConjugateTranspose[#].# &[ρ1 - ρ2], 1/2]]]
    0.5 Total[Sqrt[Eigenvalues[ConjugateTranspose[#].# &[ρ1 - ρ2]]]]
But the second version seems to be about ten times faster for n = 100.
Edit:
J.M. pointed out in a comment that Total[Abs[Eigenvalues[ρ1 - ρ2]]]/2 might be a better idea, because the condition number of ConjugateTranspose[ρ1 - ρ2].(ρ1 - ρ2) might be quite high. I objected that it actually runs slower and guessed that this was because ρ1 - ρ2 is indefinite. I have to correct this guess: I now believe that the speed difference is caused by Mathematica not detecting that ρ1 - ρ2 is Hermitian. The runtimes below somewhat indicate that. Apparently, operations like ConjugateTranspose[#].# & and ConjugateTranspose[#] + # & set some internal flag that makes Eigenvalues branch to a specialized algorithm for Hermitian matrices:
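As a quick sanity check of this hypothesis (whether Eigenvalues uses the same test as HermitianMatrixQ internally is my assumption, not documented behavior):

    (* In exact arithmetic ρ1 - ρ2 is Hermitian, but rounding in the matrix
       products can break the entrywise conjugate symmetry, so the strict
       structural test may fail on the raw difference: *)
    A = ρ1 - ρ2;
    HermitianMatrixQ[A]
    (* Symmetrizing restores exact Hermiticity, since
       conj(a + b) = conj(a) + conj(b) holds exactly entrywise: *)
    HermitianMatrixQ[1/2 (ConjugateTranspose[A] + A)]
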
    n = 100;
    SeedRandom[20201229];
    ρ1 = randomDensityMatrix[n];
    ρ2 = randomDensityMatrix[n];

    0.5 Re[Tr[MatrixPower[ConjugateTranspose[#].# &[ρ1 - ρ2], 1/2]]] // RepeatedTiming
    0.5 Total[Sqrt[Eigenvalues[ConjugateTranspose[#].# &[ρ1 - ρ2]]]] // RepeatedTiming
    Total[Abs[Eigenvalues[ρ1 - ρ2]]]/2 // RepeatedTiming
    Total[Abs[SingularValueList[ρ1 - ρ2]]]/2 // RepeatedTiming
    B = ρ1 - ρ2;
    B = 1/2 (ConjugateTranspose[B] + B);
    Total[Abs[Eigenvalues[B]]]/2 // RepeatedTiming
    Total[Abs[SingularValueList[B]]]/2 // RepeatedTiming
    {0.013, 0.289995}
    {0.00075, 0.289995}
    {0.0054, 0.289995}
    {0.0014, 0.289995}
    {0.00064, 0.289995}
    {0.00129, 0.289995}
As you can see, Eigenvalues and SingularValueList operate much faster on the matrix B than on ρ1 - ρ2. So
    Total[Abs[Eigenvalues[ConjugateTranspose[#] + # &[ρ1 - ρ2]]]]/4
seems to be the better choice, no matter what the condition number of ρ1 - ρ2 is.
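For reuse, this can be wrapped in a small helper (the name traceDistance is mine, not from the original):

    (* Trace distance via the eigenvalues of the symmetrized difference;
       symmetrizing with ConjugateTranspose[#] + # doubles every eigenvalue,
       which the division by 4 (instead of 2) absorbs. *)
    traceDistance[a_?MatrixQ, b_?MatrixQ] :=
      Total[Abs[Eigenvalues[ConjugateTranspose[#] + # &[a - b]]]]/4

    traceDistance[ρ1, ρ2]
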