Radiation pattern of patch antenna at high frequency!

Status
Not open for further replies.

KhangKhang (Member level 2, joined Jan 4, 2007)
We know that the radiation pattern of a patch antenna fed by a coaxial cable is a monopole-type pattern.

I started from the basic example in the CST help.
Then I simply scaled it so that the antenna resonates at a high frequency, say around 50 GHz.

Everything seems quite simple, but the radiation pattern at the resonance frequency has many ripples.
Does anybody know the reason, or can you point me to some theory about this?

Thanks in advance!
 

The reason can be something like:

- The contributions of surface waves diffracted at the edges and corners.
- The highly oscillatory behavior of these diffracted-wave contributions when the sources are some distance apart from each other. The oscillation rate increases as the electrical separation between the sources increases (which is the case at high frequencies).
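The second point can be illustrated with a toy interference model (a sketch, not a diffraction solver): treat two edge-diffracted contributions as coherent point sources a fixed physical distance apart, and watch how the far-field pattern oscillates faster as the frequency (and hence the electrical spacing k*d) grows. The 3 cm spacing below is an assumed ground-plane edge separation, not a value from the thread.

```python
import numpy as np

C = 3e8  # speed of light, m/s

def two_source_pattern(freq_hz, spacing_m, n=721):
    """Far-field magnitude of two equal coherent point sources spacing_m apart.

    A stand-in for edge-diffracted surface-wave contributions: the pattern is
    |1 + exp(j*k*d*sin(theta))|, which ripples faster as k*d grows.
    """
    theta = np.linspace(-np.pi / 2, np.pi / 2, n)
    k = 2 * np.pi * freq_hz / C
    field = 1 + np.exp(1j * k * spacing_m * np.sin(theta))
    return theta, np.abs(field)

def count_ripple_nulls(mag):
    # interior local minima well below the peak count as pattern nulls
    interior = mag[1:-1]
    is_min = (interior < mag[:-2]) & (interior < mag[2:])
    return int(np.sum(is_min & (interior < 0.5 * mag.max())))

d = 0.03  # assumed 3 cm separation between diffracting edges
_, mag_low = two_source_pattern(5e9, d)    # d = lambda/2 at 5 GHz
_, mag_high = two_source_pattern(50e9, d)  # d = 5*lambda at 50 GHz
```

At 5 GHz the spacing is half a wavelength and the pattern is smooth; at 50 GHz the same spacing is five wavelengths and the pattern picks up many interference nulls, which is exactly the ripple mechanism described above.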
 

The typical patch antenna radiates broadside, which is not like a monopole. A monopole has a null at broadside (i.e. perpendicular to the ground plane).
 

Have you also scaled the dielectric substrate thickness?
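This question matters because a true frequency scaling shrinks every dimension, including the substrate height h. A quick sanity check with the crude half-wavelength resonance formula L = c / (2 f sqrt(eps_r)) (fringing ignored; the eps_r value is an assumption, not from the thread) shows the expected 10:1 geometry ratio between 5 GHz and 50 GHz:

```python
import math

C = 3e8       # speed of light, m/s
EPS_R = 2.2   # assumed low-loss substrate permittivity

def patch_resonant_length(freq_hz, eps_r=EPS_R):
    """Crude resonant patch length: half a wavelength in the dielectric.

    Fringing-field length extension is ignored; this is only for checking
    how dimensions scale with frequency.
    """
    return C / (2 * freq_hz * math.sqrt(eps_r))

L_5ghz = patch_resonant_length(5e9)
L_50ghz = patch_resonant_length(50e9)
# Scaling the frequency by 10 scales L (and W, and h) by 1/10. If the
# substrate height h is left unscaled, it becomes electrically 10x thicker
# at 50 GHz and launches much stronger surface waves, whose edge diffraction
# produces exactly the pattern ripple being asked about.
```

In short: if only the patch was scaled and the original substrate thickness was kept, the electrically thick substrate at 50 GHz is a likely cause of the ripples.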
 
