
*Subject*: [linrad] dBc/Hz
*From*: Leif Åsbrink <leif.asbrink@xxxxxxxxxxxxxxxxxxxx>
*Date*: Sat, 6 Dec 2003 16:40:55 +0100

Hi All,

Is there anyone who knows the correction factor?

When we measure noise levels on a spectrum analyser we use a logarithmic detector. By means of a video filter we can average the detected voltage to get smooth curves. It may be standard procedure to take the readings for a carrier and the sideband noise from the spectrum analyser screen, correct for the bandwidth and present the result as dBc/Hz, but that is not correct. The definition of dB refers to a power ratio, and the averaged output of a logarithmic detector is not a power. Empirically I find that the result obtained from a logarithmic average is about 6 dB lower (better) than the true result from an RMS detector.

Obviously it should be "simple" to deduce the difference between the RMS and the log detector results under the assumption of white noise (within the passband). Unfortunately I am not clever enough to do the mathematics myself, but I think the result should be well known. Does anyone know how many dB (with decimals) one has to add to correct the result from an old-fashioned spectrum analyser to get the true dBc/Hz value for the sideband noise?

73

Leif / SM5BSZ
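The idealized case described above (white Gaussian noise through a log detector whose output is then averaged) can be checked numerically. A minimal Monte Carlo sketch, assuming complex white Gaussian noise and an ideal square-law power measurement, and using NumPy (these modeling choices are mine, not from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Complex white Gaussian noise, scaled to unit average power.
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
power = np.abs(noise) ** 2  # exponentially distributed sample powers

# True (RMS / power-averaging) detector: average the power, then take dB.
rms_db = 10 * np.log10(power.mean())

# Averaged logarithmic detector: take dB of each sample, then average,
# which is what video filtering after a log amp effectively does.
log_avg_db = (10 * np.log10(power)).mean()

offset_db = rms_db - log_avg_db
print(f"log-average reads {offset_db:.2f} dB low")
```

In this idealized model the log average comes out lower than the power average by 10·log10(e)·γ ≈ 2.51 dB (γ being the Euler-Mascheroni constant), which is the analytic expectation of the log of an exponential variable; a real analyser's detector chain may of course behave differently, which could account for the larger offset quoted above.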