
Transistor based amplifier interface to ADC

Status
Not open for further replies.

Milruwan

I designed a transistor-based amplifier to amplify an FM stereo multiplex signal. The transistor amp was correctly biased and could amplify the signal from 500mV to 4.5V; I observed this amplification on an oscilloscope. I was then hoping to digitally de-multiplex the signal using DSP. After several trials I found that something was wrong, so I tried converting the sampled signal back to analog using a DAC. But the observed signal was distorted and had a peak of only 1V.

1) Is this happening due to low output impedance of the amplifier?
2) How can I resolve this?
3) What are the high output impedance configurations?

amp.jpg
 

The output impedance is (nearly) identical to the collector resistance (4.7k).
For a lower output resistance, add another transistor in common-collector configuration (emitter follower), which has a voltage gain of unity.
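A quick back-of-envelope check of the point above. The collector resistor value (4.7k) is from the thread; the transistor current gain and bias current are illustrative assumptions, not values anyone posted:

```python
# Rough small-signal estimates for the common-emitter stage discussed above.
Rc = 4.7e3          # collector resistor from the thread, ohms
Zout_ce = Rc        # output impedance of a common-emitter stage ~ Rc

# An emitter-follower buffer lowers the output impedance dramatically:
# Zout ~ (Zsource / beta) + re, where re = 25 mV / Ie at room temperature.
beta = 100          # assumed transistor current gain
Ie = 1e-3           # assumed 1 mA emitter bias current
re = 0.025 / Ie     # intrinsic emitter resistance ~ 25 ohms
Zout_ef = Zout_ce / beta + re

print(f"CE output impedance  ~ {Zout_ce:.0f} ohms")
print(f"Follower output impedance ~ {Zout_ef:.0f} ohms")
```

With these assumed numbers the follower brings the source impedance seen by the next stage from 4.7k down to roughly 72 ohms, which is why it is the standard buffer between a gain stage and a converter input.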
 
The transistor amp was correctly biased...
In the amplifier circuit you showed, the resistor values are wrong, the transistor is not correctly biased - it is saturated, and there is no output coupling capacitor or anything to shift the DC level at the output to a suitable level for an ADC input.

...and could amplify the signal from 500mV to 4.5V
Are those rms values or pk-pk or what?

...tried to convert the sampled signal back to analog using a DAC. But the observed signal was distorted and had a peak of only 1V.
The ADC input was probably overloaded, either because the input signal amplitude was too high or the DC bias was wrong.

1) Is this happening due to low output impedance of the amplifier?
No. In fact the output impedance is quite high.
 
Firstly, thanks LvW and godfreyl for your replies.

In the amplifier circuit you showed, the resistor values are wrong, the transistor is not correctly biased - it is saturated, and there is no output coupling capacitor or anything to shift the DC level at the output to a suitable level for an ADC input.

Sir, I admit that the schematic in Proteus is wrong. But in the laboratory I used R1 and R3 (normally Rc) as variable resistors in order to bias the transistor. Today I did the practical again in the laboratory and found that at the correctly biased point the R3 (normally Rc) value is approximately 1.45k ohms. So I think the output resistance must be low.
Am I correct? In order to interface the signal to the ADC, what should be the minimum Rc value?
My next step is to use another transistor in common-collector configuration (emitter follower) with a gain of unity.
 

Before you start to design the amplifier, you need to know a few things. For example:

1) What is the allowed input voltage range of the ADC? e.g. it might be 0V to +3V, or it might be something else.
2) What range of input signal voltages do you want to cater for? You said "500mV", but you didn't say if that's rms or peak-to-peak or what. You also didn't mention if that's the maximum input voltage or the minimum.
3) What is the frequency range of the input signal you want to amplify?
4) What is the source impedance of the input signal? Alternatively, what is the input impedance of the amplifier supposed to be?
5) Does the ADC need to be driven from a low source impedance? If so, how low?
6) What's the input impedance of the ADC?

*) Hopefully you've already sorted this out, but just to be sure:
What is the sampling rate of your ADC? In one of your other threads a few people (including you) seemed to have some very weird ideas about the required sampling rate.
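On the sampling-rate question: a rough Nyquist check for an FM stereo multiplex signal, assuming the standard MPX layout (mono sum to 15 kHz, pilot at 19 kHz, L-R sidebands around 38 kHz extending to 53 kHz):

```python
# Minimum sampling rate for an FM stereo multiplex (MPX) baseband signal.
# The MPX spectrum extends to about 53 kHz (upper L-R sideband), so the
# Nyquist criterion requires sampling at more than twice that frequency.
f_max = 53e3                  # highest MPX component, Hz (standard MPX)
f_nyquist = 2 * f_max         # absolute theoretical minimum, 106 kS/s
f_practical = 2.5 * f_max     # margin for a realistic anti-alias filter

print(f"Nyquist minimum:  {f_nyquist / 1e3:.0f} kS/s")
print(f"Practical minimum: {f_practical / 1e3:.1f} kS/s")
```

The 2.5x factor is only a rule-of-thumb allowance for anti-alias filter roll-off; any rate comfortably above 106 kS/s works in principle.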


But in the laboratory I used R1 and R3 (normally Rc) as variable resistors in order to bias the transistor.
That's just messed up. Are you seriously trying to design the circuit by trial and error?

Today I did the practical again in the laboratory and found that at the correctly biased point the R3 (normally Rc) value is approximately 1.45k ohms. So I think the output resistance must be low.
Am I correct?
No, that just means that after you finished fiddling with the variable resistors, one of them measured 1.45k.

By the way, did you notice that the voltage gain was less than 3 when you finished fiddling this time (assuming R4 was still 500 Ohms)?

According to post 1, you got a voltage gain of 9 last time you fiddled with it.

Don't you think it would be a better idea to first work out what voltage gain you actually need, then choose resistor values that give the required gain?
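The design-first approach suggested above can be sketched in a few lines. The input level, ADC range, and headroom figure are illustrative assumptions; the 500-ohm emitter resistor is the R4 value mentioned in the thread:

```python
# Work out the required voltage gain first, then choose resistors -
# the point made above. Input and ADC figures are assumed for illustration.
v_in_pp = 0.5        # assumed input signal: 500 mV peak-to-peak
v_adc_range = 5.0    # assumed ADC input range: 0..5 V
headroom = 0.8       # keep 20% margin so the ADC input never clips

gain = headroom * v_adc_range / v_in_pp   # required voltage gain = 8

# For an emitter-degenerated common-emitter stage, gain ~ Rc / Re.
Re = 500             # emitter resistor (R4) from the thread, ohms
Rc = gain * Re       # collector resistor that gives the required gain

print(f"Required gain ~ {gain:.0f}, so Rc ~ {Rc / 1e3:.0f}k with Re = {Re} ohms")
```

Only after fixing the gain this way does it make sense to pick the bias resistors that hold the collector at mid-supply.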
 
I simulated your transistor with a 100k load. Your biasing is wrong, so it causes severe clipping because the transistor is saturated.
I changed one biasing resistor value to bias it correctly. It begins to distort when the input level is 450mV peak, which is 318mV RMS.
The output level is high enough to destroy an ADC.
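For reference, the peak-to-RMS conversion used above (valid for a sine wave):

```python
# Peak vs RMS for a sine wave, checking the figures quoted above.
import math

v_peak = 0.450                   # 450 mV peak
v_rms = v_peak / math.sqrt(2)    # ~ 318 mV RMS
v_pp = 2 * v_peak                # 900 mV peak-to-peak

print(f"{v_rms * 1000:.0f} mV RMS, {v_pp * 1000:.0f} mV p-p")
```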
 

Attachments

  • Sim of transistor.png
Thanks for all of your replies.
This time I haven't done any amplification; instead I used an emitter-follower configuration, as shown below.
emitter follower.jpg
The simulation is shown below.
sim.jpg
Original signal observed directly on the oscilloscope (2.45V p-p):
ori_sig.jpg
Original signal fed directly to the ADC and observed on the oscilloscope:
ori_sig_adc.jpg
But when I connect the signal generator to the emitter-follower circuit and feed its output to the ADC, it gives a distorted signal.
The distorted signal observed on the oscilloscope:
dis_adc_dac.jpg
This output is a DC value of 4.46V.
The ADC and DAC schematic is below:
adc_dac.jpg
I am stuck with this situation, so please help me.

Update:

I forgot to mention there were two coupling capacitors, at the input and the output. Also, the sine wave is a 1kHz wave and the ADC maximum voltage is 5V.
 

The emitter-follower bias resistors have an extremely low value of 1k each, which is probably shorting the high output impedance of your signal source.
They should be 47k each. What was the value of the missing coupling capacitors?
Please post the schematic of your signal source so we can see why its output impedance is so high.
 
The two coupling capacitors are 1uF caps. Please help me.
 

When I give a 2.4V (Vp-p) signal to the emitter-follower circuit and feed the output to the ADC, it doesn't reproduce the original signal. I also notice that the DAC outputs a DC signal of 4.5V with noise on it.
emitter follower1.jpg
 

It sounds like you forgot to apply any DC bias to the ADC input.

I assume it uses a +5V supply. If so, just make a voltage divider by connecting two 10K resistors in series between +5V and ground, and connect the mid-point of the two resistors to the ADC input. That will bias it to +2.5V. Now you can connect the input signal to the ADC input through a capacitor.
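The divider suggested above, in numbers. The coupling-capacitor value is an assumption for illustration; what matters is that the divider's Thevenin resistance (10k || 10k = 5k) sets the resulting high-pass cutoff:

```python
# Mid-supply bias for the ADC input: two equal resistors from +5 V to
# ground, with the signal AC-coupled in through a capacitor.
import math

Vcc = 5.0
R_top = 10e3
R_bot = 10e3

V_bias = Vcc * R_bot / (R_top + R_bot)      # divider output = 2.5 V
R_thev = R_top * R_bot / (R_top + R_bot)    # 10k || 10k = 5k

C = 10e-6                                   # assumed 10 uF coupling cap
f_c = 1 / (2 * math.pi * R_thev * C)        # high-pass cutoff, Hz

print(f"Bias = {V_bias} V, coupling cutoff ~ {f_c:.1f} Hz")
```

With these values the 1 kHz test tone sits well above the ~3 Hz cutoff, so it passes essentially unattenuated while the 2.5 V bias centers it in the ADC range.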
 
1uF feeding two 1k bias resistors in parallel produces a cutoff frequency of 320Hz, so 1kHz will be attenuated to about 1/3 of the original level if the signal source has a low output impedance, but we don't know what it is.

The input impedance of your emitter-follower with two 47k bias resistors is 21k ohms, so 10uF feeding it produces a cutoff frequency of 0.76Hz. It looks like you do not know the simple formula for calculating the value of a coupling capacitor.
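The "simple formula" referred to above is the first-order RC high-pass cutoff, f_c = 1 / (2*pi*R*C). Checking both figures quoted:

```python
# Coupling-capacitor cutoff frequency: f_c = 1 / (2 * pi * R * C).
import math

def cutoff_hz(r_ohms: float, c_farads: float) -> float:
    """High-pass cutoff of a coupling capacitor driving resistance R."""
    return 1 / (2 * math.pi * r_ohms * c_farads)

# 1 uF into two 1k bias resistors in parallel (500 ohms):
f1 = cutoff_hz(1e3 * 1e3 / (1e3 + 1e3), 1e-6)

# 10 uF into the ~21k input impedance quoted for the 47k-biased follower:
f2 = cutoff_hz(21e3, 10e-6)

print(f"{f1:.0f} Hz, {f2:.2f} Hz")
```

Both results match the figures in the post (about 318 Hz and 0.76 Hz), so the fix is either larger capacitors or, better, the higher-value bias resistors.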
 
AVDD is 5V, so the DC output voltage of the follower must be 2.5V, and it must be DC-coupled to AIN (AIN is now floating). Omit C3 and change the upper bias resistor to 33k.
 
Thanks godfreyl, I really did forget the DC bias to the ADC input. As you suggested, I connected two 10K resistors to provide the DC bias, but it wouldn't work. When I connected 1k resistors in series and applied the DC bias, I could see a wave similar to the original signal, but it also contained noise. What could be the problem?
 

Your biasing resistors couple any hum or hiss from the +5V power supply directly into the emitter-follower.
A 2N3904 transistor can be biased from two series 100k resistors, which can then be fed from a supply filter: a series 1k resistor followed by a 100uF capacitor to ground.
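The supply filter described above is just an RC low-pass between the +5V rail and the bias divider; a quick check of how well it removes hum:

```python
# Supply filter from the post: series 1k resistor, 100 uF cap to ground.
import math

R = 1e3       # series resistor, ohms
C = 100e-6    # filter capacitor, farads

f_c = 1 / (2 * math.pi * R * C)   # low-pass cutoff ~ 1.6 Hz

# Well above the cutoff, attenuation is roughly f_c / f.
atten_100hz = f_c / 100           # 100 Hz rectified-mains hum

print(f"Cutoff ~ {f_c:.1f} Hz; 100 Hz hum reduced to ~{atten_100hz * 100:.1f}%")
```

So any ripple on the +5V rail reaches the bias divider at under 2% of its original level, instead of being coupled straight into the follower.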
 
What is the difference between feeding the signal to the ADC through a voltage divider versus through a supply filter? Does the supply filter method bias the ADC properly? Which method works correctly?
 

Here is the circuit I tested.
emfout_cct.jpg
When I give it an analog signal I get the emitter-follower output as follows (yellow: output, blue: input).
emfout.jpg
After feeding the emitter-follower output to the ADC and observing the DAC output on the oscilloscope, I see a distorted signal as follows (yellow).
emf_dacout1.jpg
How do I get the original signal from the ADC?
What modifications do I need to make?
 

What sort of DAC is it?

A lot of DACs have a current output. Their output has to be connected to the input of an I-V converter, which has a low input impedance.
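For a current-output DAC, the I-V stage is typically an inverting op-amp transimpedance converter with Vout = -Iout * Rf. The full-scale current and target voltage below are illustrative assumptions, not values from this thread:

```python
# Sizing the feedback resistor of a transimpedance (I-V) stage for a
# current-output DAC: |Vout| = Iout * Rf. Values assumed for illustration.
i_full_scale = 2e-3              # assumed 2 mA full-scale DAC output current
v_desired = 4.0                  # desired full-scale output voltage

Rf = v_desired / i_full_scale    # feedback resistor = 2k

print(f"Rf = {Rf / 1e3:.0f}k gives {v_desired} V full-scale")
```

If your DAC has a voltage output instead, none of this applies and a simple buffer is enough, which is why the question above matters.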
 
The DAC and ADC schematic is as follows.
adc_dac.jpg
I don't think there could be an issue with the DAC. When I feed the signal generator output directly to the ADC, I can see the original signal from the DAC. Please help me.
 
