Welcome to EDAboard.com

[SOLVED] Why do the reference sensitivity requirements differ between bands in WCDMA (UMTS)?

Status
Not open for further replies.

criterion456

[Attached images: 3GPP WCDMA reference sensitivity table (UMTS SEN.png) and receiver sensitivity formula (LoRa-Sensitivity-Formula.jpg)]

As shown above, the reference sensitivity requirements are:

B1: -106.7 dBm
B2: -104.7 dBm
B3: -103.7 dBm
B4: -106.7 dBm
B5: -104.7 dBm

They are different.



However, in the sensitivity formula, the parameters such as bandwidth and SNRmin are the same for all of these bands.

In theory, the reference sensitivity requirements should therefore be identical.
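For what it's worth, the kTB + NF + SNRmin relation can be sketched numerically. This is a minimal sketch; the noise figure and SNRmin values below are illustrative assumptions, not 3GPP-specified numbers:

```python
import math

def ref_sensitivity_dbm(bw_hz: float, nf_db: float, snr_min_db: float,
                        temp_k: float = 290.0) -> float:
    """Reference sensitivity = thermal noise floor (kTB) + noise figure + required SNR."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    ktb_dbm = 10 * math.log10(k * temp_k * bw_hz / 1e-3)  # kTB converted to dBm
    return ktb_dbm + nf_db + snr_min_db

# WCDMA channel bandwidth is 3.84 MHz; NF = 9 dB and SNRmin = -7 dB are
# illustrative assumptions only
sens = ref_sensitivity_dbm(bw_hz=3.84e6, nf_db=9.0, snr_min_db=-7.0)
print(f"{sens:.1f} dBm")
```

Since none of these inputs depend on the band, the formula alone gives one number for every band, which is exactly the puzzle.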


What makes them different?



Thanks a lot~!!
 

The basic noise power (kTB) of a WCDMA receiver is -108 dBm over the 3.84 MHz channel bandwidth, at 290 K (17 °C).

The 3GPP spec sets different reference sensitivities in different bands as a function of the potential interferers (in-band and out-of-band) that can appear in each band.
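As a quick numerical check of the quoted noise floor (a minimal sketch using only the Boltzmann constant and the WCDMA channel bandwidth):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0         # temperature, K (17 degC)
B = 3.84e6        # WCDMA channel bandwidth, Hz

# Integrated thermal noise power over the channel, in dBm
noise_floor_dbm = 10 * math.log10(k * T * B / 1e-3)
print(f"{noise_floor_dbm:.1f} dBm")  # close to the -108 dBm figure quoted above
```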
 


Thanks for your reply. :-o
 
