alftel
Hello my brothers in arms!
I have a specific issue with an RF amplifier that I developed for one of my users. It is a VLF (very low frequency) power amp that is supposed to take an input signal not exceeding 0 dBm; otherwise the output is distorted/saturated. The user is going to drive it from a variety of SDRs and other gadgets and has no control over the output power of those devices.

My initial ideas were to a) tune the pre-amp stage for a presumably known signal level, which would require a "unique" attenuation value per RF source, or b) employ an attenuator to bring the input signal down. The problem is that the input level is not exactly known, and the user is not knowledgeable enough to measure the exact power from his RF gear. The only thing he knows is that the signal can be anywhere from 0 dBm all the way up to 30 dBm.

I searched carefully through all the commercially available offerings (Mini-Circuits etc.) but couldn't find anything suitable. Another approach I tried was a power limiter built from two pairs of Zener and silicon diodes, per the app note here: https://www.maximintegrated.com/en/design/technical-documents/app-notes/4/4035.html (figure 8). It works in terms of protecting my pre-amp stage from over-voltage, but the signal is of course distorted whenever the input exceeds a certain Vpp.

Is there any way to design a lumped-component circuit that limits an input of up to 30 dBm down to 0 dBm without distortion? Any ideas? I am out of options; I have tried everything. A fixed attenuator would be the "straight between the eyes" solution, but it is not practical in this particular case because the user will use different SDRs with different TX power levels. Any help, advice, or reference will be greatly appreciated.
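For a sense of the voltage swings involved, here is a quick sketch of my own numbers (assuming a 50 Ω system throughout, and a plain pi-pad just as a reference point, not as the solution):

```python
import math

Z0 = 50.0  # assumed system impedance, ohms

def dbm_to_vpp(dbm, z0=Z0):
    """Peak-to-peak voltage of a sine wave at the given power into z0."""
    p_watts = 10 ** (dbm / 10) / 1000   # dBm -> watts
    v_rms = math.sqrt(p_watts * z0)     # P = Vrms^2 / Z0
    return 2 * math.sqrt(2) * v_rms     # sine: Vpp = 2*sqrt(2)*Vrms

def pi_pad(atten_db, z0=Z0):
    """Resistor values for a matched pi attenuator of the given loss."""
    k = 10 ** (atten_db / 20)           # voltage ratio
    r_shunt = z0 * (k + 1) / (k - 1)    # each of the two shunt arms
    r_series = z0 / 2 * (k - 1 / k)     # the series arm
    return r_shunt, r_series

print(f"0 dBm  -> {dbm_to_vpp(0):.3f} Vpp")   # about 0.632 Vpp
print(f"30 dBm -> {dbm_to_vpp(30):.1f} Vpp")  # about 20 Vpp
print("30 dB pi-pad: shunt %.1f ohm, series %.1f ohm" % pi_pad(30))
```

So a 30 dBm source swings roughly 20 Vpp and the amp wants no more than about 0.63 Vpp, which is exactly why the diode clamp distorts so badly over most of that range, and why any single fixed pad only works for one source.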
Cheers,
Alex