From what I know, using an inductor instead of a resistor in a low-pass filter (LPF) configuration causes peaking in the frequency response.
I ran some simulations and they confirm this. If an inductor is supposed to be better, why don't I see any ripple reduction in my simulation results?
Here is my netlist:
**********************************
*power supply filter
.subckt sup_fil1 in out gnda
L1 in out 1.2u
C1 out gnda 33u
C2 out gnda 2.2u
C3 out gnda 0.1u
.ends
.subckt sup_fil2 in out gnda
R1 in out 100
C1 out gnda 33u
C2 out gnda 2.2u
C3 out gnda 0.1u
.ends
x1 vdd18 vdd18a gnd sup_fil1
*power supply
vdd18 vdd18 v1 dc=1.8 ac=1
*10kHz sinusoidal ripple noise
v1 v1 gnd sin 0 50m 10k 0.0 0.0 0.0
.tran 50n 200u
.ac dec 200 1 100G
.graph par('vdb(vdd18a)-vdb(vdd18)')
.alter
x1 vdd18 vdd18a gnd sup_fil2
.end
**********************************
Attached are the results simulated in HSPICE:
- light blue plot: my incoming rippled 1.8 V power supply
- pink plot: output voltage of the 1.2 uH inductive LPF configuration
- darker blue plot: output voltage of the 100 ohm resistive LPF configuration
From the frequency response (bottom graph), I can tell:
1. at lower frequencies (100 Hz to 10 MHz), the inductive power supply filter passes more ripple than the resistive configuration
2. only at higher frequencies (> 10 MHz) does the inductive power supply filter produce a more stable output voltage, since its attenuation rolls off faster in magnitude.
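As a sanity check on observation 1, the two transfer functions can be evaluated analytically. This is a minimal sketch (Python, assuming the ideal lossless, unloaded components from my netlist): the undamped LC filter resonates near 24 kHz, so the 10 kHz ripple sits just below resonance and is not attenuated at all, while the RC corner is down near 45 Hz.

```python
import math

# Component values taken from the netlist above
L = 1.2e-6                    # 1.2 uH series inductor (sup_fil1)
R = 100.0                     # 100 ohm series resistor (sup_fil2)
C = 33e-6 + 2.2e-6 + 0.1e-6   # three parallel decoupling caps

f = 10e3                      # ripple frequency of the sine source
w = 2 * math.pi * f

# Ideal lossless LC low-pass: |H| = 1 / |1 - w^2 L C|
h_lc = 1.0 / abs(1.0 - w**2 * L * C)

# RC low-pass: |H| = 1 / sqrt(1 + (w R C)^2)
h_rc = 1.0 / math.sqrt(1.0 + (w * R * C)**2)

f0 = 1.0 / (2 * math.pi * math.sqrt(L * C))   # LC resonance, ~24.5 kHz
fc = 1.0 / (2 * math.pi * R * C)              # RC corner, ~45 Hz

print(f"LC resonance: {f0 / 1e3:.1f} kHz, |H| at 10 kHz: {h_lc:.2f}")
print(f"RC corner:    {fc:.1f} Hz,   |H| at 10 kHz: {h_rc:.4f}")
```

So with these particular values the LC gain at 10 kHz is slightly above unity (mild peaking), while the RC filter attenuates the same ripple by roughly 47 dB, which is consistent with the plots.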
So, can anyone help me understand why people still use inductors in power supply filters? I still see no reason why an inductor is better than a resistor (I may be wrong, so please CORRECT my understanding).