There are two ways to do this.
If you have an actual receiver, the manufacturer (or you) can set up a bit error rate (BER) test. In that test you feed a known pattern of modulation bits to the receiver front end and lower the input power until a specified BER is reached, e.g. 10^-5 bit errors at -95 dBm. Sometimes, for low data rates (like a battery-powered wireless sensor), the packet error rate is more important, since it better emulates the header-recognition and related problems of bursty signals.
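As a rough sketch of what such a test does, here is the BER arithmetic plus a hypothetical power sweep. The function names and the `measure_ber_at` callback are placeholders for whatever instrument control you actually have; the real test equipment handles pattern generation and comparison for you.

```python
def ber(sent, received):
    """Bit error rate: fraction of differing bits between two bit sequences."""
    assert len(sent) == len(received)
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

def sensitivity_sweep(measure_ber_at, powers_dbm, target_ber):
    """Return the lowest input power (dBm) at which the measured BER
    still meets the target.

    `measure_ber_at` stands in for the actual instrument call that sets
    the signal generator power and returns the measured BER.
    """
    passing = [p for p in sorted(powers_dbm) if measure_ber_at(p) <= target_ber]
    return passing[0] if passing else None
```

For example, with a simulated receiver whose BER degrades below -95 dBm, `sensitivity_sweep(fake_measure, range(-100, -90), 1e-5)` would report -95 as the sensitivity.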
If you do not have a receiver to test, then you do it your way: you compute a signal-to-noise ratio in dB, not dBm. Then you consult a table or graph that tells you "for this type of modulation you need an S/N ratio of X dB to achieve a BER of Y", and you use that. This second method is, of course, less accurate than an actual measurement.
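The calculation behind that second method can be sketched as follows. It assumes the standard thermal noise floor of -174 dBm/Hz at 290 K; the bandwidth, noise figure, and required S/N values in the example are illustrative, not from any particular datasheet.

```python
import math

def noise_floor_dbm(bandwidth_hz, noise_figure_db):
    """Receiver noise floor in dBm: kTB at 290 K is -174 dBm/Hz,
    plus 10*log10(bandwidth) plus the receiver noise figure."""
    return -174.0 + 10.0 * math.log10(bandwidth_hz) + noise_figure_db

def sensitivity_dbm(bandwidth_hz, noise_figure_db, required_snr_db):
    """Estimated sensitivity: the input power giving the S/N that the
    modulation's BER table/graph says is needed."""
    return noise_floor_dbm(bandwidth_hz, noise_figure_db) + required_snr_db
```

For instance, a 200 kHz channel with a 5 dB noise figure and a modulation that needs 12 dB of S/N for the target BER works out to roughly -104 dBm of estimated sensitivity.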
In the real world, factors other than S/N ratio may cause additional bit errors, such as a dispersive fade in the path, adjacent-channel jamming, etc.