
Is there a problem with using 75 ohm coax in RF applications?


danfoss

Hi all,
I made a dipole antenna for an FT-1900 FM transceiver, but my question is: is there a big problem with using 75 ohm coaxial cable for the antenna?
Thanks...

Quick answer:

There will be a little less power output from the transmitter, since it will deliver less power into a 75 Ω load than into the 50 Ω load it is designed for.

Also, there will (possibly) be a mismatch at the antenna, but a dipole might well be closer to 75 Ω than to 50 Ω, so maybe not much of one. Even if the antenna were a true 50 Ω, the VSWR would only be 1.5:1, i.e. about 14 dB return loss.
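For reference, here is the arithmetic behind those numbers as a small Python sketch. It uses the standard reflection-coefficient relations; the 50 Ω antenna on 75 Ω coax is just the worst case from the paragraph above:

```python
import math

def mismatch_figures(z_load, z0):
    """|Gamma|, VSWR, return loss and mismatch loss for a resistive
    load z_load fed through a line of impedance z0 (both in ohms)."""
    gamma = abs(z_load - z0) / (z_load + z0)           # |reflection coefficient|
    vswr = (1 + gamma) / (1 - gamma)
    return_loss_db = -20 * math.log10(gamma)           # fraction reflected, in dB
    mismatch_loss_db = -10 * math.log10(1 - gamma**2)  # power lost to the mismatch
    return gamma, vswr, return_loss_db, mismatch_loss_db

# A true 50-ohm dipole fed through 75-ohm coax, as in the example above:
g, vswr, rl, ml = mismatch_figures(50, 75)
print(f"|Gamma| = {g:.2f}, VSWR = {vswr:.2f}:1, "
      f"return loss = {rl:.1f} dB, mismatch loss = {ml:.2f} dB")
# |Gamma| = 0.20, VSWR = 1.50:1, return loss = 14.0 dB, mismatch loss = 0.18 dB
```

That roughly 0.18 dB of mismatch loss is why it's "not too bad": far below anything you would notice on FM.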

That's all not too bad; the radio should handle it without a problem (big, high-power, professional transmitters might not, though), just with slightly lower output.

You could mitigate some of the above by cutting the coax to a multiple of an electrical half wavelength, since a half-wave line repeats the antenna impedance at its far end. Don't forget to take the velocity factor of the coax into account if you try this, and obviously if you are transmitting across multiple bands it's probably not worthwhile, as the trick only works near one frequency.
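As a sketch of that length calculation, assuming the 2 m band at 146 MHz (the FT-1900 is a 2 m rig) and a velocity factor of 0.66, typical for solid-polyethylene cable such as RG-59; check your own cable's datasheet for the real figure:

```python
# Physical length of an electrical half-wavelength (or a multiple) of coax.
C = 299_792_458  # speed of light, m/s

def half_wave_multiple(freq_hz, velocity_factor, n=1):
    """Length in metres of n electrical half-wavelengths of coax."""
    wavelength = C / freq_hz * velocity_factor  # wavelength inside the cable
    return n * wavelength / 2

for n in (1, 2, 3):
    print(f"{n} x half-wave at 146 MHz, VF 0.66: "
          f"{half_wave_multiple(146e6, 0.66, n):.3f} m")
# 1 x half-wave: 0.678 m, 2 x: 1.355 m, 3 x: 2.033 m
```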

Standard cable impedance comes from a common agreement to put sources, loads and test equipment on a base everybody can use; that is why most RF test equipment uses 50 Ohms.

From transmission-line analysis of air-dielectric coax it was found that roughly 77 Ohms gives the lowest loss and roughly 30 Ohms the highest power-carrying capacity; 50 Ohms is the usual compromise between the two. If you simply have a good 75-Ohm cable to feed an antenna, use it, but you would have to adjust the impedance-matching circuits at the antenna and at the transmitter/receiver for optimum matching.
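Those optima drop out of the standard air-dielectric coax relations: Z0 = 60·ln(D/d), conductor loss proportional to (1 + D/d)/ln(D/d), and breakdown-limited power proportional to ln(D/d)/(D/d)² for a fixed outer diameter. A quick numeric sketch:

```python
import math

def z0_air_coax(ratio):
    """Characteristic impedance of air-dielectric coax with D/d = ratio."""
    return 60 * math.log(ratio)

# Conductor loss is proportional to (1 + D/d) / ln(D/d); scan for its minimum.
best = min((r / 100 for r in range(110, 1000)),
           key=lambda r: (1 + r) / math.log(r))
print(f"lowest loss:   D/d = {best:.2f}, Z0 = {z0_air_coax(best):.1f} ohm")

# Breakdown-limited peak power (fixed outer diameter) goes as
# ln(D/d) / (D/d)^2, which peaks where ln(D/d) = 1/2.
peak = math.exp(0.5)
print(f"highest power: D/d = {peak:.2f}, Z0 = {z0_air_coax(peak):.1f} ohm")
# lowest loss:   D/d = 3.59, Z0 = 76.7 ohm
# highest power: D/d = 1.65, Z0 = 30.0 ohm
```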

It is said that a dipole has a 70-Ohm real impedance. In real conditions this is rarely true: there is always a conductive ground nearby introducing a reactive component, and in real installations other dipoles or conductors sit around "your" dipole. You must then either measure the real impedance of your antenna and design a matching circuit for it, or, which is easier in practice, use an adjustable matching circuit and "tune" the antenna for best performance. Well-adjusted antennas work well, and optimized commercial antennas cost a lot precisely because of this optimization work.
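As a rough illustration of such a matching circuit, here is a lowpass L-network sketch. The 50-Ohm antenna, 75-Ohm line and 146 MHz are hypothetical values, and the load is assumed purely resistive; a real antenna's reactance would have to be tuned out first:

```python
import math

def l_match(r_low, r_high, freq_hz):
    """Lowpass L-network matching r_low to r_high (both purely resistive):
    a series inductor on the low-R side, a shunt capacitor on the high-R side."""
    q = math.sqrt(r_high / r_low - 1)
    x_series = q * r_low          # series reactance, ohms
    x_shunt = r_high / q          # shunt reactance, ohms
    w = 2 * math.pi * freq_hz
    return x_series / w, 1 / (w * x_shunt)   # (L in henries, C in farads)

# Hypothetical example: 50-ohm antenna matched to 75-ohm coax at 146 MHz.
L, C = l_match(50, 75, 146e6)
print(f"series L = {L*1e9:.1f} nH, shunt C = {C*1e12:.1f} pF")
# series L = 38.5 nH, shunt C = 10.3 pF
```

An adjustable version of essentially this network (variable capacitor, tapped or variable inductor) is what a typical antenna tuner gives you.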

I guess that all means that in a non-critical or low-power RF application you will probably not see a difference. If you were transmitting 1 kW of RF power, I would get cable and antenna of the right impedance, matched to your transceiver.
