Can anyone explain why increasing the substrate height of a microstrip antenna leads to worse matching after the height passes a certain point?
There's more to the problem than what you have described. For example, if you change the substrate height without re-designing the antenna, you have detuned it. And even if you re-tune the patch, it will almost certainly have a different 'tapping point' (optimal feed position), so the input impedance, and hence the match, changes as well.
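To illustrate the detuning part: the standard closed-form patch formulas (effective dielectric constant plus fringing-field length extension, as found in antenna textbooks) already show the resonant frequency drifting when only the height changes. This is a rough sketch, not a design tool; the patch dimensions below are illustrative, not from your antenna.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def resonant_freq(L, W, h, er):
    """Approximate TM10 resonant frequency of a rectangular patch
    of length L, width W, on a substrate of height h and permittivity er."""
    # Effective dielectric constant: part of the field fringes into air
    e_eff = (er + 1) / 2 + (er - 1) / 2 * (1 + 12 * h / W) ** -0.5
    # Fringing-field length extension at each radiating edge (grows with h)
    dL = 0.412 * h * ((e_eff + 0.3) * (W / h + 0.264)) / \
         ((e_eff - 0.258) * (W / h + 0.8))
    return C / (2 * (L + 2 * dL) * math.sqrt(e_eff))

# A fixed patch (illustrative dimensions, roughly a 2.4 GHz FR-4 design),
# with only the substrate height swept:
L, W, er = 0.0285, 0.0380, 4.4
for h in (0.8e-3, 1.6e-3, 3.2e-3):
    print(f"h = {h*1e3:.1f} mm -> f_r ~ {resonant_freq(L, W, h, er)/1e9:.3f} GHz")
```

The resonant frequency shifts as h changes, so the patch you measured is no longer resonant where you feed it; on top of that, thicker substrates add feed inductance and surface-wave loading, which the simple formula above doesn't capture.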
For a good answer, you should supply a more detailed question, I think...