You amplify signals because your A-to-D converter doesn't have enough resolution to "see" a typical small incoming signal, say around -90 dBm (about 20 microvolts peak to peak in a 50 Ω system). Most receivers have a conversion gain on the order of 50-100 dB, which brings your signal up to a range of -40 to +10 dBm (6.3 mVpp to 2.0 Vpp). These voltages are much more "receiver-friendly" in terms of amplitude.
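The dBm-to-voltage conversions above can be checked with a short script. This is just an illustrative sketch (the function name `dbm_to_vpp` and the 50 Ω default are my choices, not from the answer); it assumes a sinusoidal signal, where Vpp = 2·√2·Vrms:

```python
import math

def dbm_to_vpp(p_dbm, r_ohms=50.0):
    """Convert a power level in dBm to peak-to-peak voltage across r_ohms,
    assuming a sinusoidal signal."""
    p_watts = 10 ** (p_dbm / 10) / 1000      # dBm -> watts
    v_rms = math.sqrt(p_watts * r_ohms)      # P = Vrms^2 / R
    return 2 * math.sqrt(2) * v_rms          # sine: Vpp = 2*sqrt(2)*Vrms

print(dbm_to_vpp(-90) * 1e6)  # ~20 uVpp, the weak input signal
print(dbm_to_vpp(-40) * 1e3)  # ~6.3 mVpp, after 50 dB of gain
print(dbm_to_vpp(10))         # ~2.0 Vpp, after 100 dB of gain
```

This reproduces all three voltage figures quoted above.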
For a single amplification stage, yes... you will retain the same SNR using an ideal amplifier. However, amps are hardly ideal and add noise of their own, degrading the SNR from input to output. The ratio of input SNR to output SNR (expressed in dB) is called the noise figure, or NF, of a device. To improve the SNR of your received signal, you can amplify and mix your signal to a fixed IF, then run it through a sharp bandpass filter to drop the noise floor back down to near ambient conditions. That will improve your output SNR, compared to the original incoming signal.
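The bandwidth effect described above follows from the thermal noise floor, kTB ≈ -174 dBm/Hz at room temperature: integrated noise power grows with bandwidth, so a narrow IF filter pulls the floor down. A small sketch (the 10 MHz / 10 kHz bandwidths and the 5 dB NF are example values of my choosing, not from the answer):

```python
import math

KTB_DBM_PER_HZ = -174.0  # thermal noise density at ~290 K, in dBm/Hz

def noise_floor_dbm(bandwidth_hz, nf_db=0.0):
    """Integrated thermal noise power in a given bandwidth,
    degraded by the receiver's noise figure."""
    return KTB_DBM_PER_HZ + 10 * math.log10(bandwidth_hz) + nf_db

signal_dbm = -90.0

# SNR of the -90 dBm signal in a wide 10 MHz front-end bandwidth (NF = 5 dB):
snr_wide = signal_dbm - noise_floor_dbm(10e6, nf_db=5)    # 9 dB
# Same signal after a narrow 10 kHz IF bandpass filter:
snr_narrow = signal_dbm - noise_floor_dbm(10e3, nf_db=5)  # 39 dB
```

Narrowing the noise bandwidth by a factor of 1000 (30 dB) buys back 30 dB of SNR, which is exactly the mechanism behind filtering at the IF.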