
How to define IO drive strength based on the current


zhangljz

I have an IO library, and I simulated a 24 mA IO buffer as a test. The load is a 10 pF capacitor, and the input is a clock.

In the simulation, the current through the capacitor is about 100 mA, but the IO buffer's drive strength is only 24 mA. Why is the difference so big? The documentation says the maximum output frequency is 133 MHz, but in simulation, even with a 1 GHz input clock, the output still almost swings the full 0 to 3.3 V. Is this normal?

Thank you!
 

Drive strength is rated at some DC test condition, like 10% of
rail. Your peak current occurs elsewhere, where the output is
nearer midrail or higher. How the two relate also depends on the
output slew rate and the internal gate drive: a slew-rate-limited
buffer with exactly the same DC drive strength will give you a
very different peak value when the load capacitance is small.
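
To put numbers on this: the peak current into the capacitor is set by I = C * dV/dt, not by the DC drive-strength rating. A minimal sketch of the arithmetic, assuming a ~0.33 ns edge (the thread doesn't state the simulated rise time, so that value is an illustration only):

```python
# Why a "24 mA" buffer can push ~100 mA into a capacitor:
# the peak capacitor current is I = C * dV/dt, independent of
# the DC drive-strength test condition.

C_LOAD = 10e-12    # load capacitance: 10 pF (from the post)
V_SWING = 3.3      # output swing: 0 to 3.3 V (from the post)
T_RISE = 0.33e-9   # assumed rise time, ~0.33 ns (hypothetical value)

# Average current needed to slew the cap through the full swing:
i_peak_est = C_LOAD * V_SWING / T_RISE
print(f"Estimated peak capacitor current: {i_peak_est * 1e3:.0f} mA")
# -> ~100 mA, matching the simulation, even though the DC rating is 24 mA.
```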

The max-output-frequency rating is the highest frequency that
still gives a good swing (and "good" is itself subject to some
judgment call) at worst-case process, voltage, and temperature.
If it's only "almost good enough" in a nominal sim, you should
expect it to be 2-5X worse somewhere (hot, slow, low voltage is
a good place to start). And remember that the internal
supply-to-ground voltage droops below the pin-to-pin voltage
difference just when you need it to hang tough for timing, so
pad your margin some more.
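
A similarly crude estimate shows how the 24 mA rating and the 133 MHz number can coexist. This constant-current slewing model ignores pre-driver delay, supply droop, and the variation of device current with output voltage, and the derating factor below is an assumption, so treat it as a sanity check only:

```python
# Rough consistency check: how fast can a 24 mA buffer toggle a
# 10 pF load rail-to-rail, and how does PVT derating bring that
# down toward the 133 MHz datasheet number?

C_LOAD = 10e-12    # 10 pF load (from the post)
V_SWING = 3.3      # full rail swing (from the post)
I_DRIVE = 24e-3    # rated DC drive strength (from the post)
DERATE = 2.5       # assumed worst-case PVT derating factor (hypothetical)

t_slew = C_LOAD * V_SWING / I_DRIVE   # time to slew one edge
f_max_nominal = 1.0 / (2.0 * t_slew)  # one rise + one fall per period
f_max_worst = f_max_nominal / DERATE

print(f"Edge slew time:        {t_slew * 1e9:.2f} ns")
print(f"Nominal max frequency: {f_max_nominal / 1e6:.0f} MHz")
print(f"Derated estimate:      {f_max_worst / 1e6:.0f} MHz")
# Nominal ~364 MHz; with a 2-5X PVT derating this lands near the
# 133 MHz datasheet number.
```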
 

