analogTechie
Greetings all!
I am trying to run sub-system level (a few blocks) mixed-signal ADMS simulation of an SoC. Here are some of the parameters:
'external' power supply, vdd: 5.5 V
'internal' power supply, vcc: 1.8 V
a2d, d2a connect modules defined for the 1.8 V domain
vdd is ramped at startup. The only way I can get the simulation to pass is to make the vdd ramp time shorter than 1 ns! For ramp times longer than 1 ns, the SoC pins power up to only 0.9 V (they should reach 1.8 V) and stay there.
Has anyone seen anything like this, or does anyone have suggestions on what I could try?
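For reference, the vdd ramp is applied roughly like the sketch below (a minimal Verilog-AMS supply model; the module and parameter names here are illustrative, not my actual testbench):

```verilog
// Illustrative Verilog-AMS supply model: vdd ramps linearly from 0 V
// to its final value over 'tramp', then holds. The failure shows up
// whenever tramp is made longer than about 1 ns.
`include "disciplines.vams"

module vdd_ramp (p, n);
  inout p, n;
  electrical p, n;
  parameter real vfinal = 5.5;  // final external supply level [V]
  parameter real tramp  = 1n;   // ramp time being swept
  analog begin
    // linear ramp from 0 to vfinal over tramp, then hold
    V(p, n) <+ vfinal * min(1.0, $abstime / tramp);
  end
endmodule
```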
thanks,
jm