Description
There is a need to understand the behavior of the RASDR devices at the operating points at which they are likely to be used. With a broadband noise source of known output power, one can sweep across the RASDR tuning range and gather statistics on the performance of the receiver across all of its ranges. There should be no outside sources of energy except what leaks in from the case or is produced by internal components, cables, etc. Knowing the bandwidth of the receiver as well as the gain/attenuation settings in the signal path allows one to predict what the response should be and compare that to the response actually observed.
For example, I was surprised by the following observations, in which the only variation was the sampling clock frequency when looking into an external 50ohm SMA load screwed onto the RX input port:

- 50ohm load, Fcenter=408.1MHz, 1.5MHz bandwidth, 10Msps
- 50ohm load, Fcenter=408.1MHz, 1.5MHz bandwidth, 8Msps
- 50ohm load, Fcenter=408.1MHz, 1.5MHz bandwidth, 4Msps
- 50ohm load, Fcenter=408.1MHz, 1.5MHz bandwidth, 2Msps
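The terminated-input baseline comparison above could be sketched as a small analysis routine. Nothing here is RASDR-specific; the function name and capture format are my own assumptions, and it just averages windowed FFT segments so spurs and baseline tilt stand out across sample rates:

```python
import numpy as np

def baseline_spectrum_db(iq, fs, nfft=4096):
    """Average power spectrum (dB) of complex I/Q samples captured into a
    50ohm load, for comparing spectral baselines across sample rates."""
    iq = iq - iq.mean()                       # remove DC so spurs stand out
    nseg = len(iq) // nfft
    segs = iq[:nseg * nfft].reshape(nseg, nfft)
    win = np.hanning(nfft)
    # Welch-style average of windowed periodograms
    psd = np.mean(np.abs(np.fft.fft(segs * win, axis=1)) ** 2, axis=0)
    psd = np.fft.fftshift(psd) / (fs * np.sum(win ** 2))
    freqs = np.fft.fftshift(np.fft.fftfreq(nfft, d=1.0 / fs))
    return freqs, 10 * np.log10(psd + 1e-30)
```

Running this on the same load at 10, 8, 4, and 2 Msps and overlaying the curves would make the sample-rate dependence visible directly.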
I think that one would make these observations and populate a database (or spreadsheet) with the metrics and then create some kind of a 'heatmap' of the performance changes as different parameters are varied. In a sense, 'visualizing' the behavior of the receiver under some handful of performance metrics. Here are some metrics I am thinking of:
- DC balance: how far the average values of the I and Q signal components are from their 'correct' values. For a calibrated noise source they should have the same average value, and, given the bandwidth and gains, that average value is a known quantity.
- baseline flatness: what I see with the RASDR is that, even looking into a 50ohm load, the receiver outputs some spurs, and not all sample rates produce a flat spectral baseline. This is bad, and it represents a pretty strong "quality" metric of the receiver system itself. Knowing where the "bodies are buried" is important to using the receiver effectively.
- SFDR: this one is a bit tricky, as it needs a signal injected into the receiver. But I can do that now. This tests the likelihood of intermodulation effects from interfering signals. If we can quantify how the RASDR receiver is affected by signals inside the box as well as by what might come through the LNA, we can develop a better filtering plan. I know that the LMS6002D has a pretty bad IP3 spec (-1dBm). Some references: IP3 Demystified, Cascaded 2-tone 3rd-order intercept point.
- sensitivity: this would be the minimum signal level detectable by the device at a particular frequency, and it would naturally be a function of the overall noise level of the receiver electronics itself.
- NF/Noise Temperature: we should probably also characterize the device this way, as it is in common use. Some references: On Noise Figures and Noise Temperatures, Three Methods of Noise Figure Measurement, Fundamentals of RF and Microwave Noise Figure Measurements
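A few of the metrics above fall straight out of a block of I/Q samples. As a sketch (the function names and full-scale/dB conventions are my assumptions, not an agreed definition; the SFDR helper assumes a single bin-centered test tone):

```python
import numpy as np

def dc_balance(i, q):
    """Difference of the mean I and mean Q levels; ideally near zero
    when looking at a calibrated broadband noise source."""
    return np.mean(i) - np.mean(q)

def total_power_db(i, q):
    """Total received power, 10*log10(mean(I*I + Q*Q)), relative to
    full scale; a calibration offset would convert this to dBm."""
    return 10 * np.log10(np.mean(i ** 2 + q ** 2))

def sfdr_db(iq, nfft=8192):
    """Spurious-free dynamic range: injected-carrier peak minus the
    largest other spectral line, in dB."""
    spec = np.abs(np.fft.fft(iq[:nfft] * np.hanning(nfft)))
    peak = np.argmax(spec)
    mask = np.ones(nfft, dtype=bool)
    mask[max(0, peak - 3):peak + 4] = False   # exclude the carrier skirt
    return 20 * np.log10(spec[peak] / spec[mask].max())
```

Sensitivity and NF/noise temperature need calibrated hot/cold measurements and would not reduce to a one-liner like these.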
Each observation would be indexed by the following attributes:
- Device Serial Number
- Date of Observation
- Center Frequency
- LPF Bandwidth
- Sample Rate
- LNA Selected (LNA1, LNA2, LNA3 or 50ohm)
- LNA Gain mode (Max, Mid, or Bypass)
- VGA1 Gain
- VGA2 Gain
- Total received power (sqrt(I·I + Q·Q), in dBm)
- Calibrated noise source power (in dBm)
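One way to capture those attributes for the database or spreadsheet is a simple record type; the field names and units below are illustrative only, not a proposed schema:

```python
from dataclasses import dataclass, asdict

@dataclass
class Observation:
    """One indexed observation, mirroring the attribute list above."""
    serial_number: str
    date: str                  # date of observation, e.g. "2015-06-01"
    center_freq_hz: float
    lpf_bandwidth_hz: float
    sample_rate_sps: float
    lna: str                   # "LNA1", "LNA2", "LNA3", or "50ohm"
    lna_gain_mode: str         # "Max", "Mid", or "Bypass"
    vga1_gain_db: float
    vga2_gain_db: float
    rx_power_dbm: float        # total received power
    noise_source_dbm: float    # calibrated noise source power

# asdict(obs) yields a plain dict suitable for a CSV row or database insert
```

Flattening each record this way makes the later 'heatmap' step a straightforward pivot over any two of the index columns.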