
[SOLVED] Why is phase noise generally specified at 100 Hz offset in RADAR applications?


shashy.br

I need to know why the phase noise of oscillators is generally specified at a 100 Hz offset from the carrier. Is there any reason for measuring at this frequency offset in RADAR applications?
 

It depends on what is important in which application. Some time and frequency applications require tight specifications at a 1 Hz offset.
 
The 100 Hz offset is crucial for a ground-based RADAR application, since in some cases it must also detect very slowly moving, nearly stationary targets.

The Doppler shift for such a target close to the RADAR falls in the 30 to 100 Hz range, so to detect these reflected waves the phase noise of the local oscillator at that offset needs to be very low.
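
To put rough numbers on that, here is a small Python sketch (my own illustration, not from the thread; the phase-noise level, Doppler filter bandwidth, and target-to-clutter ratio are all assumed example values) of how local-oscillator phase noise at the Doppler offset sets the floor that a slow target next to strong ground clutter has to clear. It ignores range-correlation effects and assumes the phase noise is flat across the Doppler filter:

import math

# Clutter energy spread into the target's Doppler bin by LO phase noise.
# Relative to the clutter return, the residue is roughly
# L(f_d) + 10*log10(B), with L(f_d) the SSB phase noise at the Doppler
# offset (dBc/Hz) and B the Doppler filter bandwidth (Hz).
def clutter_residue_dbc(phase_noise_dbc_hz, doppler_filter_bw_hz):
    """Phase-noise clutter residue in the target's Doppler bin, in dBc."""
    return phase_noise_dbc_hz + 10.0 * math.log10(doppler_filter_bw_hz)

# Assumed example: -110 dBc/Hz at the 100 Hz offset, 10 Hz Doppler filter.
residue = clutter_residue_dbc(-110.0, 10.0)  # -> -100 dBc
target_echo_dbc = -70.0                      # target echo 70 dB below the clutter
margin_db = 10.0                             # required detection margin
print(f"residue = {residue:.0f} dBc")
print("detectable" if target_echo_dbc > residue + margin_db else "masked by phase noise")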
 

You must be talking about weather radars, where very small Doppler speeds (i.e. wind speeds) are being measured. Any other sort of moving object will have a much bigger Doppler frequency shift, and then the phase noise at 100 Hz is not very important.
 
For a monostatic RADAR (two-way path), the Doppler shift is given by:

Doppler shift = 2 × target velocity × RADAR frequency / speed of light

For instance, at X-band (10 GHz), a 100 Hz Doppler shift corresponds to a radial velocity of only about 1.5 m/s (roughly a walking pace).
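
As a quick sanity check on that arithmetic, a minimal Python sketch, assuming a monostatic radar (hence the factor of two) and the 10 GHz X-band example above:

C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(velocity_mps, carrier_hz):
    """Two-way Doppler shift (Hz) of a monostatic radar echo from a target
    closing at the given radial velocity (m/s)."""
    return 2.0 * velocity_mps * carrier_hz / C

def velocity_from_doppler_mps(doppler_hz, carrier_hz):
    """Radial velocity (m/s) that produces a given two-way Doppler shift."""
    return doppler_hz * C / (2.0 * carrier_hz)

f_carrier = 10e9  # X-band example from the post
print(velocity_from_doppler_mps(100.0, f_carrier))  # -> 1.5 m/s, about a walking pace
print(doppler_shift_hz(1.5, f_carrier))             # -> 100.0 Hz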
 
