Bug #4984
closed
AMR Rate Adaptation uses BER instead of C/I thresholds
Added by laforge over 3 years ago.
Updated over 1 year ago.
Description
For some non-obvious reason, trx_loop_amr_input() uses the BER for switching AMR modes, rather than the C/I specified in 3GPP TS 45.009. This should be changed to use C/I to be spec-compliant. Note that this only applies to osmo-bts-trx. The other osmo-bts back-ends have the PHY/DSP handle this, and we assume those implementations are correct.
There also is no C/I normalization as per Annex A of the same specification.
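To illustrate what C/I-based mode switching looks like, here is a minimal sketch in the spirit of TS 45.009 section 3.3.1: the codec mode is stepped up or down by comparing a (normalized) C/I estimate against per-boundary thresholds with hysteresis. All names and threshold values below are illustrative assumptions, not the actual osmo-bts-trx code.

```c
#include <stdint.h>

/* Sketch of C/I-threshold based AMR codec mode adaptation with
 * hysteresis.  Thresholds are in cB (centi-Bel, 0.1 dB); the gap
 * between the lower and upper threshold of each boundary prevents
 * oscillation between adjacent modes.  Values are made up. */

#define AMR_NUM_MODES 4

static const int16_t ci_thresh_lower[AMR_NUM_MODES - 1] = {  60,  90, 120 };
static const int16_t ci_thresh_upper[AMR_NUM_MODES - 1] = {  80, 110, 140 };

/* Pick the next codec mode given the current mode and a normalized
 * C/I estimate; at most one mode step per adaptation interval. */
static uint8_t amr_next_mode(uint8_t cur_mode, int16_t ci_cb)
{
	if (cur_mode > 0 && ci_cb < ci_thresh_lower[cur_mode - 1])
		return cur_mode - 1;	/* downgrade to a more robust mode */
	if (cur_mode < AMR_NUM_MODES - 1 && ci_cb > ci_thresh_upper[cur_mode])
		return cur_mode + 1;	/* upgrade to a higher-rate mode */
	return cur_mode;		/* stay within the hysteresis band */
}
```

With such a loop, the missing piece noted above is that the raw C/I estimate would first have to be normalized per Annex A before comparison against the thresholds.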
- Status changed from New to Feedback
> For some non-obvious reason, trx_loop_amr_input() uses the BER for switching AMR modes, rather than the C/I
Most likely because C/I was not available before we introduced TRXDv1.
> which is specified in 3GPP TS 45.009. This should be changed to use C/I and be spec compliant.
But as stated in section 3.3.1: "Codec mode adaptation is based on a normalized, one-dimensional measure of the channel quality, called the Quality Indicator." ... "The Quality Indicator may be derived from an estimate of the current carrier to interferer ratio, C/I_est, or an estimate of the current raw bit error rate (BER_est)."
So AFAIU, in the current implementation we derive the Quality Indicator from BER. laforge: does this make sense to you?
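The point above can be sketched as follows: both estimates can feed the same one-dimensional Quality Indicator, so the adaptation loop itself does not care which source is used. This is a hypothetical illustration; the BER-to-quality mapping below is a made-up linear stand-in, since a real mapping is channel-model dependent.

```c
#include <stdint.h>

enum qi_source { QI_FROM_BER, QI_FROM_CI };

/* Map either estimate onto one common Quality Indicator scale in cB.
 * A C/I estimate (after normalization) can be used directly; a raw
 * BER estimate needs a channel-specific inverse mapping.  The linear
 * formula here is purely illustrative. */
static int16_t quality_indicator(enum qi_source src, int32_t est)
{
	switch (src) {
	case QI_FROM_CI:
		/* est: normalized C/I in cB, used as-is */
		return (int16_t)est;
	case QI_FROM_BER:
		/* est: BER in units of 1/10000; lower BER maps to
		 * higher quality (illustrative, not from the spec) */
		return (int16_t)(200 - est / 10);
	}
	return 0;
}
```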
- Status changed from Feedback to In Progress
- Assignee set to fixeria
- % Done changed from 0 to 80
- Related to Bug #5570: coding: decode in-band data in AMR's special DTX frames (SID_FIRST, SID_UPDATE, SID_ONSET) added
- Status changed from In Progress to Stalled
- Status changed from Stalled to Resolved
- % Done changed from 80 to 100
All patches have been merged. One of our customers has successfully tested C/I-based link adaptation.
- Has duplicate Bug #1618: AMR adaption loop doesn't use C/I thresholds, only BER added