It depends on both the sampling rate and the bandwidth. Per the Nyquist criterion, the sampling rate must be at least twice the frequency of the signal.
So a 2 GS/s scope can correctly measure a 1 GHz sine wave.
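To make the Nyquist folding concrete, here is a rough Python sketch (an idealized sampler, not any real scope's behavior) showing what frequency a sampler would report; `apparent_freq` is a hypothetical helper, not a library function:

```python
# Sketch: ideal-sampler aliasing arithmetic (assumed model, not real hardware).
fs = 2.0e9  # sampling rate: 2 GS/s

def apparent_freq(f, fs):
    """Frequency an ideal sampler reports for a sine at f Hz.

    Folds f into the first Nyquist zone [0, fs/2]."""
    f = f % fs
    return f if f <= fs / 2 else fs - f

# A 1 GHz sine sits exactly at Nyquist -- just barely representable.
print(apparent_freq(1.0e9, fs) / 1e9, "GHz")
# A 1.5 GHz sine aliases down and shows up as 0.5 GHz on the same scope.
print(apparent_freq(1.5e9, fs) / 1e9, "GHz")
```

The point is that nothing in the sampled data distinguishes the 1.5 GHz input from a genuine 0.5 GHz one, which is why the analog bandwidth limit in front of the ADC matters too.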
Let me state that an oscilloscope is a "time-domain" device, so in my view the term "bandwidth" is out of context here. The maximum frequency is generally printed on the scope itself; a 100 MHz scope can measure anything up to 100 MHz precisely.
The exact sampling frequency implemented is given in the scope's manual, along with the precise measurement error at a given temperature.
It depends on both the sampling rate and the bandwidth. Per the Nyquist criterion, the sampling rate must be at least twice the frequency of the signal.
So a 2 GS/s scope can correctly measure a 1 GHz sine wave.
That is what I am not sure about. Many digital scopes just sample the signal and, when drawing it, simply connect the samples on screen with straight lines.
To correctly display a signal at half the input bandwidth, I would think the scope should digitize the signal in quadrature with two ADCs.
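To see why plain connect-the-dots drawing struggles near Nyquist, here is a small Python sketch (pure stdlib, an idealized model rather than any scope's firmware) comparing linear interpolation between samples with sin(x)/x (sinc) interpolation, which is the standard reconstruction for a band-limited signal:

```python
import math

def sinc(x):
    """Normalized sinc, sin(pi*x)/(pi*x)."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

fs = 2.5   # samples per cycle: only 25% above the Nyquist rate
f = 1.0
N = 200
samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

# Evaluate the waveform halfway between two samples, mid-record.
t = 100.5 / fs
true_val = math.sin(2 * math.pi * f * t)

linear = 0.5 * (samples[100] + samples[101])                          # connect-the-dots
sinc_val = sum(s * sinc(t * fs - n) for n, s in enumerate(samples))   # sin(x)/x sum

print("linear error:", abs(linear - true_val))
print("sinc   error:", abs(sinc_val - true_val))
```

At this sample density the straight-line estimate misses the true value badly, while the truncated sinc sum recovers it to within a few percent. No quadrature second ADC is involved; the reconstruction is done in post-processing on the single sample stream.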