We know that the displayed average noise level (DANL) and the noise figure (NF) of a spectrum analyzer are related by the following equation:
\begin{equation} NF_{SA} = L_{DANL}\,[\mathrm{dBm}] + 174\,\mathrm{dBm} - 10 \log_{10}\!\left(RBW / 1\,\mathrm{Hz}\right) + 2.5\,\mathrm{dB} \end{equation}
where RBW is the resolution bandwidth and the 2.5 dB is a constant coming from the detector (its origin is not clear to me, but that is a detail for now). By the way, no signal is fed to the analyzer in this experiment.
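As a quick numerical sanity check, here is a minimal Python sketch of that formula (the function and variable names are my own, just for illustration):

```python
import math

def nf_from_danl(danl_dbm, rbw_hz):
    """Noise figure (dB) of a spectrum analyzer from its displayed
    average noise level (dBm) and resolution bandwidth (Hz),
    including the 2.5 dB detector correction."""
    return danl_dbm + 174.0 - 10.0 * math.log10(rbw_hz) + 2.5

print(nf_from_danl(-90.0, 2.4e6))  # ~22.7 dB, roughly the 22 dB quoted below
```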
So what I see is that for some spectrum analyzer the DANL is about $-90\,\mathrm{dBm}$ at $RBW = 2.4\,\mathrm{MHz}$, which gives $NF \approx 22\,\mathrm{dB}$. Then, from the relation between noise figure and equivalent noise temperature, \begin{equation} T = T_0 \left(10^{NF/10} - 1\right) \approx 46\,500\,\mathrm{K} \end{equation}
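The same conversion in Python, continuing the sketch above (again, names are my own; I take the standard reference temperature $T_0 = 290\,\mathrm{K}$):

```python
def noise_temp_from_nf(nf_db, t0=290.0):
    """Equivalent noise temperature (K) from a noise figure in dB."""
    f = 10.0 ** (nf_db / 10.0)  # noise factor (linear scale)
    return t0 * (f - 1.0)

print(noise_temp_from_nf(22.0))  # ~45,700 K, the same order as the 46,500 K above
```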
So the question is: where is the catch? Is it OK to have such astronomical values? Could they come from all the electronics of the spectrum analyzer? Another question: what is the origin of the $NF_{SA}$ formula? I can understand that the 174 dBm and the RBW term come from the thermal noise of a resistor at room temperature (about 290 K), but the $L_{DANL}$ term is not clear to me. Is this just a definition?