Hi, some laboratory instruments (e.g. network analyzers) provide two kinds of input impedance: 50 Ω and 1 MΩ ∥ 20 pF.
I understand that 50 Ω is the standard for maximum power transfer, but why the 1 MΩ in parallel with 20 pF?
Where is this needed, and how exactly is it implemented inside the circuit?
The 50 Ω input impedance is not primarily for maximum power transfer; it matches the characteristic impedance of the connecting coaxial cable to avoid signal reflections.
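To see why the mismatch matters, the reflection coefficient at a termination is Γ = (Z_L − Z_0)/(Z_L + Z_0). A quick sketch (the 1 MΩ value is taken from the question; frequency effects of the 20 pF are ignored here for simplicity):

```python
# Reflection coefficient Gamma = (ZL - Z0) / (ZL + Z0) on a 50-ohm line.
Z0 = 50.0
for ZL in (50.0, 1e6):
    gamma = (ZL - Z0) / (ZL + Z0)
    print(f"ZL = {ZL:>9.0f} ohm -> |Gamma| = {abs(gamma):.4f}")
```

A matched 50 Ω load gives |Γ| = 0 (no reflection), while a 1 MΩ load reflects essentially the entire incident wave (|Γ| ≈ 1), which is why the 50 Ω setting is used whenever the source drives the instrument through a terminated transmission line.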
The high-impedance input is for cases where you don't want to significantly load the signal source and don't need to match a transmission-line impedance.
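Note that the 20 pF in parallel means the input impedance is only 1 MΩ at DC and low frequencies; the capacitance dominates as frequency rises. A rough sketch of |Z| = R / √(1 + (ωRC)²) for the values from the question:

```python
import math

R, C = 1e6, 20e-12  # 1 MOhm in parallel with 20 pF, as in the question
for f in (1e3, 100e3, 10e6):
    w = 2 * math.pi * f
    zmag = R / math.hypot(1.0, w * R * C)   # |R || (1/jwC)|
    print(f"{f:>12,.0f} Hz -> |Z| ~ {zmag:,.0f} ohm")
```

At 1 kHz the input still looks like roughly 1 MΩ, but by 10 MHz it has fallen below 1 kΩ, so the "high-impedance" setting only minimizes loading at low frequencies.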
The high-impedance input of an oscilloscope is typically used with a 10:1 probe, which raises the impedance seen by the source to 10 MΩ and further minimizes loading.
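A common implementation of such a probe (values here are the conventional ones, assumed for illustration) puts a 9 MΩ resistor at the tip in series with the scope's 1 MΩ input, with a small trimmer capacitor across the 9 MΩ chosen so the two RC time constants match, giving a flat 10:1 divider across frequency. Cable capacitance is ignored in this sketch:

```python
Rp = 9e6      # probe tip resistor (conventional value, assumed)
Rin = 1e6     # scope input resistance
Cin = 20e-12  # scope input capacitance from the question

ratio = Rin / (Rp + Rin)   # resistive division: 1/10 at DC
Cp = Rin * Cin / Rp        # compensation cap so Rp*Cp == Rin*Cin

print(f"division ratio = {ratio:.3f}")
print(f"compensation cap ~ {Cp * 1e12:.2f} pF")
```

When Rp·Cp equals Rin·Cin, the capacitive divider has the same 10:1 ratio as the resistive one, which is exactly what the probe's compensation adjustment sets (in practice it also absorbs the cable capacitance, which is omitted above).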