Bug #4468

"RSSI offset" default of 0 is not useful

Added by laforge 13 days ago. Updated 12 days ago.

In Progress

The "rssi offset" value that can be configured via the VTY and command line arguments is initialized to a default of 0.

This does not work out at all; at the very least it has proven to be bogus on the popular USRP B2xx hardware in DCS 1800 (see #4467). I would actually be surprised if it were correct on any hardware at all.

In an ideal world, every SDR vendor would ship every unit with a proper calibration table over frequency, so that a power level as seen in the I/Q samples could be translated into an absolute value in dBm.

However, the world is far from ideal, and the best we can do is try to measure this offset for the few commonly used devices we have available (B200, B210, N200, LimeSDR-mini, LimeSDR USB) and then use that value instead of '0'.

We may need a value per band (particularly on the LMS, where different bands go through different RF paths), and it will of course also change depending on how the hardware receive gains/LNAs are configured. It will also drift over frequency within a band, and it will have a spread across different units of a given device type.

However, I guess anything is better than "0" at this point.

osmo-trx-calib.gnumeric (7.96 KB) laforge, 03/22/2020 06:35 PM

Related issues

Related to OsmoBTS - Bug #4467: bad voice quality in current osmo-bts-trx master (In Progress, 03/20/2020)

Related to OsmoTRX - Bug #3949: osmo-trx-lms: improve runtime gain setting (missing calibration) (New, 04/23/2019)


#1 Updated by laforge 13 days ago

  • Related to Bug #4467: bad voice quality in current osmo-bts-trx master added

#2 Updated by laforge 13 days ago

  • Related to Bug #3949: osmo-trx-lms: improve runtime gain setting (missing calibration) added

#3 Updated by laforge 12 days ago

I've done some initial measurements on a B210 at ARFCN 871, using my Racal 6113 BTS tester.

Using the default RxGain value of 38 (half of the maximum value of 76), an RSSI offset of 28 renders correct RxLev values.

It needs to be determined how much this is influenced by frequency, gain, and spread across B2xx units.

#4 Updated by laforge 12 days ago

I took some more measurements with two different B210 units and one LimeSDR-USB at different sides of the 900 MHz and 1800 MHz bands.

  • The spread between USRP B210 units is < 1 dB, i.e. negligible
  • rssi-offset for B210 in 1800 MHz should be "rxGain - 11"
  • rssi-offset for B210 in 900 MHz should be "rxGain - 7.5"
  • rssi-offset for LimeSDR-USB in 1800MHz should be "rxGain - 17" (assuming LNAW)
  • rssi-offset for LimeSDR-USB in 900MHz should be "rxGain - 6" (assuming LNAL)

Attaching detailed measurements as gnumeric spreadsheet.

So the best approach would probably be to dynamically adjust the rssi-offset every time the RxGain is being set.

The question is a bit how to do this properly in a way that
  1. we have sane defaults
  2. the user can still add an additional offset to express e.g. that they have added an external LNA / RF frontend
  3. we remain backwards-compatible with previous config file / command line argument semantics
So what about:
  • if the existing rssi-offset is given via command line or vty, behavior remains as is
  • we introduce a new vty command rssi-offset mode (absolute|relative)
    • absolute is the old behavior, where the user-provided value is used as-is, irrespective of gain
    • relative is the new behavior, where the user-provided value (default: 0) is applied relative to the device+band specific default values (see my measurements for 900+1800 MHz above)
