Weird phase change with distance


gary1943

Hi, all. Recently I found something weird with my antenna. The antenna covers a frequency range from 850 MHz to 4 GHz. When transmitting on carriers from 850 MHz to 2 GHz, I can see the phase of the received signal rotate as I decrease the distance between the transmitting and receiving antennas. However, when doing the same thing with carriers from 2 GHz to 4 GHz, I did not see any dramatic phase rotation; there were only minor changes in amplitude and phase as the distance decreased. That confused me. Any ideas? Thanks so much.
 

I'm not sure precisely what you are measuring, but in general the phase will increase as the frequency increases, if for no other reason than the transmission line has more phase shift at higher frequencies.

But if the antenna is matched over only a small frequency band, or if there are other mismatches in the system, standing waves can be set up on the transmission lines, which show up as significant phase ripple.

Also, of course, you could have wireless multipath that would make the transmitted signal's phase look non-linear with frequency.
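
To put rough numbers on the free-space expectation: the line-of-sight phase is about phi = 360 * d / lambda = 360 * f * d / c, so the phase rotation per unit of distance should actually get larger, not smaller, as the carrier frequency goes up. Below is a quick Python sketch (my own illustration, not anything from your measurement setup) of how much rotation a 10 cm change in separation would ideally produce at a few carriers; the function name and the 10 cm step are just assumptions for the example.

Code:
# Ideal free-space phase change vs. distance (illustrative sketch only)
C = 3.0e8  # speed of light, m/s

def free_space_phase_deg(freq_hz, distance_m):
    # Line-of-sight phase delay in degrees: 360 * d / wavelength
    wavelength = C / freq_hz
    return 360.0 * distance_m / wavelength

delta_d = 0.10  # assumed 10 cm change in antenna separation
for f in (850e6, 2e9, 4e9):
    print(f"{f / 1e9:.2f} GHz: {free_space_phase_deg(f, delta_d):.0f} deg per 10 cm")

That gives roughly 102, 240 and 480 degrees per 10 cm at 0.85, 2 and 4 GHz. If the rotation you measure at 2 to 4 GHz is much smaller than that, it points toward the mismatch/standing-wave or multipath effects described above rather than the free-space path itself.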
 
