Well, in simple terms, let's take the example of a DC motor rated at 12 V and 500 mA.
The motor needs that much voltage and current for satisfactory operation; that's why the values are called the RATED VOLTAGE and RATED CURRENT.
Now, if your supply provides the rated voltage (12 V) but can only source 20 mA, your motor won't run, so you need a CURRENT DRIVER to boost the available current.
The same goes for a VOLTAGE DRIVER: it is used to bring the voltage up to the rated value.
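To make the example concrete, here is a tiny sketch that checks a supply against the motor's rated values; the 12 V / 20 mA supply figures mirror the hypothetical case above:

```python
# Minimal sketch: compare a supply's capabilities against the motor's
# rated values from the example above (12 V, 500 mA).

RATED_V = 12.0     # rated voltage, V
RATED_I = 0.500    # rated current, A

supply_v = 12.0    # supply output voltage, V
supply_i = 0.020   # maximum current the supply can source, A

if supply_v < RATED_V:
    print("Supply voltage below rating: a voltage driver is needed.")
if supply_i < RATED_I:
    print("Supply current below rating: a current driver is needed.")
# Here the voltage matches but the supply is 480 mA short of the rated
# current, so the motor will not run without a current driver stage.
```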
I assume the question is about current-mode versus voltage-mode drivers. Assuming a channel terminated on both the source and receiver sides, a voltage-mode driver requires less power than a current-mode driver for the same supply. This is because, in a current-mode driver, only half of the drive current reaches the receiver; the other half is dissipated in the source termination. Please let me know if this was the question.
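As a rough back-of-the-envelope illustration of that claim, here is a minimal DC sketch comparing the two driver styles on a doubly terminated line. The 50-ohm termination, 200 mV receiver swing, and 1 V supply are assumed values for the example, not figures from this thread:

```python
# Hedged sketch: DC power comparison of a current-mode (CM) driver and a
# series-terminated voltage-mode (VM) driver on a doubly terminated line.
# All element values below are assumptions chosen for illustration.

Z0   = 50.0   # termination / characteristic impedance, ohm
V_RX = 0.2    # target receiver swing, V
VDD  = 1.0    # assumed supply voltage, V

# Current-mode driver: the drive current I splits equally between the
# source termination and the receiver termination, so only I/2 develops
# the receiver swing: V_RX = (I / 2) * Z0.
I_cm = 2 * V_RX / Z0          # 8 mA
P_cm = VDD * I_cm             # the full drive current comes from VDD

# Voltage-mode driver with a series source termination of Z0: the line
# forms a 2:1 divider (V_RX = V_S / 2), and the loop current is only
# V_S / (2 * Z0) = V_RX / Z0 -- half the CM current for the same swing.
V_s  = 2 * V_RX
I_vm = V_s / (2 * Z0)         # 4 mA
P_vm = V_s * I_vm             # 1.6 mW, vs. 8 mW for the CM driver

print(f"CM driver: I = {I_cm * 1e3:.1f} mA, P = {P_cm * 1e3:.2f} mW")
print(f"VM driver: I = {I_vm * 1e3:.1f} mA, P = {P_vm * 1e3:.2f} mW")
```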
Actually, I'm curious about the time constant introduced by these different drivers. For example:
This is the circuit model of a transceiver (Tx-Ch-Rx) using a current driver (it comes from a paper I surveyed).
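For illustration, here is a minimal lumped-RC sketch of the time constant each driver type would produce at the receiver node; the element values and the ideal-source assumptions are mine, not the paper's:

```python
# Hedged sketch: receiver-node time constant for an ideal voltage-mode
# vs. an ideal current-mode driver, using a lumped model with Tx output
# resistance R_tx, Rx termination R_rx, and load capacitance C_L.
# All values are assumptions for illustration.

R_tx = 50.0    # Tx output resistance, ohm
R_rx = 50.0    # Rx termination, ohm
C_L  = 1e-12   # lumped channel + pad capacitance, F

# Ideal voltage driver: R_tx sits in series with the source, so the
# capacitor sees the Thevenin resistance R_tx || R_rx.
tau_vm = (R_tx * R_rx) / (R_tx + R_rx) * C_L   # 25 ps

# Ideal current driver: infinite output impedance, so R_tx drops out of
# the small-signal model and the capacitor sees only R_rx.
tau_cm = R_rx * C_L                            # 50 ps

print(f"voltage-mode tau = {tau_vm * 1e12:.1f} ps")
print(f"current-mode tau = {tau_cm * 1e12:.1f} ps")
```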
The extended questions are:
1. Why does the circuit model exclude the transmitter's output resistance (Rtx)?
2. I'm still confused about the power consumption you mentioned.
Thanks!