Nyquist tells us that to sample a 100 MHz signal, you need a clock rate of at least 200 MSPS. And since a 250 MHz signal could probably still make it through your 100 MHz analog-bandwidth input (albeit with severe attenuation), you would need at least a 500 MSPS ADC to capture that signal without aliasing.
So, 1 GSPS is not that much overkill.
If you did not have a high enough sample rate, out-of-band content would alias into the displayed waveform as digital processing artifacts, and you would have a hard time telling which glitches were real and which were phantoms.
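To see what those phantoms look like, here is a quick sketch (the 300 MHz tone and 500 MSPS rate are my own illustrative numbers): a tone above the Nyquist frequency folds down and shows up at a frequency that was never present at the input.

```python
import numpy as np

# Illustrative sketch: a 300 MHz tone sampled at 500 MSPS aliases,
# because 300 MHz exceeds the Nyquist frequency (fs/2 = 250 MHz).
fs = 500e6          # sample rate, 500 MSPS
f_in = 300e6        # input tone, above Nyquist
n = 4096
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_in * t)

# The dominant FFT bin lands near the alias frequency fs - f_in = 200 MHz,
# a "phantom" signal that never existed at the input.
spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)
f_alias = freqs[np.argmax(spectrum)]
print(f"apparent frequency: {f_alias / 1e6:.1f} MHz")
```

On a scope with too low a sample rate, that 200 MHz phantom would be drawn on screen as if it were a real signal.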