jowong1
cadence sigma delta reltol
I am designing a rather high-resolution (>15-bit) delta-sigma ADC. I have a design that works in MATLAB, and I am trying to build the same thing in Cadence using Verilog-A modeling. Every component is currently ideal and written in Verilog-A code, so there are no transistors, resistors, or capacitors. I am using the laplace_nd function to model my loop filter; however, the SNR I am getting is off from what I got in MATLAB, and the maximum stable input amplitude is off as well. I have tried the following:
1. Tightening reltol and abstol
2. Reducing the maximum time step to 1 ps while running with moderate accuracy (this causes convergence issues)
Neither of these solved the problem. Does anyone have any idea what else I can try?
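For reference, my loop-filter model looks roughly like the sketch below (a second-order example with placeholder coefficients, not my actual design; laplace_nd takes numerator and denominator coefficients in ascending powers of s):

```verilog
// Verilog-A sketch of an ideal continuous-time loop filter
// H(s) = (a0 + a1*s) / s^2  -- coefficients here are placeholders
`include "constants.vams"
`include "disciplines.vams"

module loop_filter(in, out);
  input in;
  output out;
  electrical in, out;

  analog begin
    // numerator {a0, a1} = 1 + 2s, denominator {0, 0, 1} = s^2
    V(out) <+ laplace_nd(V(in), {1.0, 2.0}, {0.0, 0.0, 1.0});
  end
endmodule
```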
Thanks