I was given MATLAB code to process a signal file. I have no details about the ADC specs or the code's author. The variable cxSamps (complex double) contains the signal's IQ data read from the entire file.
I see this line of code:
    cxSamps = cxSamps / mean( abs(cxSamps) );

The comment says that this boosting of cxSamps estimates a coarse AGC, that it is applied to the entire cxSamps, and that it reduces the dynamic range in order to help distinguish energy bursts from the noise.
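For context, here is a minimal standalone sketch of the operation as I understand it, on synthetic data. The burst parameters and every variable name other than cxSamps are my own placeholders, since I don't have the real file or the ADC details:

    % Synthetic stand-in for the recording: a weak noise floor plus one burst,
    % at an arbitrary overall scale (placeholder values, not from the real data).
    n       = 1e5;
    noise   = 0.001 * ( randn(n,1) + 1j*randn(n,1) ) / sqrt(2);
    burst   = zeros(n,1);
    burst(40000:41000) = 0.02 * exp( 1j*2*pi*0.1*(0:1000).' );
    cxSamps = noise + burst;

    % The line in question: divide by the mean magnitude so that
    % mean(abs(cxSamps)) becomes 1 (the "coarse AGC" scaling).
    cxSamps = cxSamps / mean( abs(cxSamps) );

    mean( abs(cxSamps) )   % ~1 by construction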
Can you please explain:
- how does boosting the signal reduce the dynamic range?
- how does this improve identifying bursts in the noise?
Before:

    mean( abs(cxSamps) ) = 0.0052
    min( abs(cxSamps) )  = 1.056e-06
    max( abs(cxSamps) )  = 0.0343
    20*log10( 0.0343 / 1.056e-06 ) = 90.231

After:

    mean( abs(cxSamps) ) = 1 (no surprise)
    min( abs(cxSamps) )  = 2.03e-04
    max( abs(cxSamps) )  = 6.5877
    20*log10( 6.5877 / 2.03e-04 ) = 90.226

I reviewed this:
Ref: Compute dynamic range in a linear quantization system
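The figures above come from calls like the following, run once before and once after the division (assuming cxSamps is already loaded):

    m  = mean( abs(cxSamps) );
    lo = min( abs(cxSamps) );
    hi = max( abs(cxSamps) );
    rangedB = 20*log10( hi / lo );   % unchanged by dividing by a positive scalar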
EDIT
I believe the Before and After calculations above have no bearing on my question, since I now realize that the dynamic range is just a function of the ADC's number of bits.
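For completeness, the relationship I mean is the ideal dynamic range of an N-bit ADC, roughly 20*log10(2^N) ≈ 6.02*N dB. The bit depth below is only a placeholder, since I don't have the ADC specs:

    % Ideal dynamic range of an N-bit ADC; N = 12 is a placeholder value.
    N = 12;
    dr_dB = 20*log10(2^N);   % ≈ 6.02*N dB, about 72.2 dB for 12 bits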