PCSU1000 basic question about spectrum analyser function

Hello Guys,

I'm trying to figure out how the PCSU1000 computes the THD levels when running the spectrum analyser function.

When I set channel 1 to 5 mV/div and loop back the probes, I can see a nice noise floor of about -115 dBV.

Well, if I consider that the ADC is an 8-bit one, and that the observation range is 40 mV (8 x 5 mV), this means we should not be able to go lower than 20 x log10(0.04/256) ≈ -76 dBV.
But I see that the spectrum analyser offers a range from -40 dBV down to -120 dBV (when I set it to 5 mV/div).

Can you please explain how it works, or what I'm missing here?
It seems that the displayed spectrum spans 80 dB, while the 8-bit ADC should only split that into steps of about 0.3 dB.
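The back-of-the-envelope figure above can be checked numerically. This is just a sketch of the arithmetic, assuming the 8-bit converter and the 40 mV (8 divisions x 5 mV/div) range stated above:

```python
import math

# Assumed numbers from the question: 5 mV/div, 8 divisions -> 40 mV full scale,
# digitised by an 8-bit ADC (256 codes).
full_scale_v = 8 * 0.005             # 40 mV observation range
lsb_v = full_scale_v / 256           # one ADC step, about 156 uV

# Level of a 1-LSB signal in dBV: the naive quantisation "floor"
floor_dbv = 20 * math.log10(lsb_v)
print(round(floor_dbv, 1))           # -76.1
```

So an ideal, noiseless 8-bit conversion over this range would indeed bottom out around -76 dBV.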

Thank you for your support.
Cheers, Gilles.

You are right - in an ideal system, levels below 1 bit (1 LSB) can't be measured.
But we are lucky - there is always some noise at the input of the ADC of the PCSU1000.
So there is some dither:
en.wikipedia.org/wiki/Analog-to- … ter#Dither
“In ADCs, performance can usually be improved using dither. This is a very small amount of random noise (white noise), which is added to the input before conversion.”

This is why frequency components down to about -120 dBV can be seen.
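The dither effect can be demonstrated with a small simulation: a sine of only 0.3 LSB amplitude vanishes completely in a noiseless quantizer, but reappears when random noise is added before quantization and many records are averaged. This is an illustrative sketch (the LSB size, noise level, and record count are arbitrary, not PCSU1000 values):

```python
import numpy as np

rng = np.random.default_rng(0)
lsb = 1.0
n = 1024
t = np.arange(n)
signal = 0.3 * lsb * np.sin(2 * np.pi * 8 * t / n)  # tone well below 1 LSB

def quantize(x):
    # Ideal mid-tread rounding quantizer with step size of 1 LSB
    return np.round(x / lsb) * lsb

# Without dither the quantizer outputs all zeros: the tone is lost.
no_dither = quantize(signal)
print(np.max(np.abs(no_dither)))    # 0.0

# With ~0.5 LSB rms Gaussian dither, averaging many quantized records
# recovers the sub-LSB tone: the noise averages out, the signal does not.
records = 2000
avg = np.zeros(n)
for _ in range(records):
    avg += quantize(signal + rng.normal(0, 0.5 * lsb, n))
avg /= records

amp = 2 * np.abs(np.fft.rfft(avg)[8]) / n  # amplitude estimate at bin 8
print(amp)                                  # close to 0.3
```

The averaged, dithered output resolves the 0.3 LSB tone that a noiseless converter could not represent at all.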

In this example the Hanning window is used with the Vector Average option:

The signal source was the PCGU1000 function generator, frequency 1.1 kHz.

Here the 1.1 kHz input signal is attenuated to a -114 dBV level.
It can still be seen on the spectrum analyser screen.
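The combination of a Hann window and vector (coherent) averaging can be sketched in the same way: complex spectra of synchronized records are averaged, so the tone adds in phase while uncorrelated noise averages toward zero. The sample rate, noise level, and record count below are assumptions for illustration, not PCSU1000 specifications:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4000
fs = 40_000.0                      # assumed sample rate
f0 = 1_100.0                       # 1.1 kHz test tone, as in the example
t = np.arange(n) / fs

amp = 10 ** (-114 / 20)            # -114 dBV tone (about 2 uV)
noise_rms = 10 ** (-80 / 20)       # assumed input noise, far above the tone

win = np.hanning(n)
gain = win.sum() / n               # coherent gain of the Hann window

# Vector averaging: sum complex spectra of phase-synchronized records.
avg = np.zeros(n // 2 + 1, dtype=complex)
records = 2000
for _ in range(records):
    x = amp * np.sin(2 * np.pi * f0 * t) + rng.normal(0, noise_rms, n)
    avg += np.fft.rfft(win * x)
avg /= records

# Window-corrected single-sided amplitude spectrum in dBV
spec_dbv = 20 * np.log10(2 * np.abs(avg) / (n * gain) + 1e-30)
bin0 = int(round(f0 * n / fs))
print(spec_dbv[bin0])              # close to -114
```

Even though the tone sits some 34 dB below the per-record noise, coherent averaging pulls it back out of the noise floor, which is what the screenshot shows.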