As an example, let's say your spectrum analyzer is set to a resolution bandwidth of 1 kHz. That means it is letting 1000 times as much power into the power detector as it would if it were set to a 1 Hz resolution bandwidth.
You make a measurement using the marker function, and at a particular frequency point in the spectrum the spectrum analyzer reads -50 dBm. What is it telling you? It is saying that at that frequency point, the AVERAGE power across a +/- 500 Hz window (the 1 kHz RBW centered on the marker) was -50 dBm. Since 1000 Hz is 1000 times a 1 Hz bandwidth, and 1000 is a 30 dB power ratio, the actual power density at that frequency is -50 dBm - 30 dB = -80 dBm/Hz.
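The conversion is just a subtraction in dB. Here is a minimal Python sketch of that arithmetic; the function name marker_to_dbm_per_hz is made up for illustration:

```python
import math

# Hypothetical helper: convert a marker reading (the average power
# across the RBW window, in dBm) to a power spectral density in dBm/Hz.
def marker_to_dbm_per_hz(marker_dbm: float, rbw_hz: float) -> float:
    # Dividing power by bandwidth becomes a subtraction in dB:
    # 10*log10(rbw_hz) is the bandwidth expressed as a dB power ratio.
    return marker_dbm - 10.0 * math.log10(rbw_hz)

# The worked example above: -50 dBm measured in a 1 kHz RBW.
print(marker_to_dbm_per_hz(-50.0, 1000.0))  # prints -80.0 (dBm/Hz)
```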
Note that if the slope of the spectrum varies too much across the RBW window, you will get a distorted view of the dBm/Hz, since the reading averages over the entire window of frequencies.
(Note: since the spectrum analyzer does not have a "brick wall" filter in its IF, when the RBW is set to some number, a little more power than that gets through the filter. If you read your manual carefully, you will find a calibration fudge factor (a dB or two) to compensate for this. When you use a spectrum analyzer's "phase noise measurement function", it applies that fudge factor automatically for you.)
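If you are applying that correction by hand rather than relying on the built-in phase noise function, it is just one more dB subtraction. A sketch, assuming a correction value looked up in your instrument's manual (the parameter name noise_bw_correction_db is hypothetical):

```python
import math

# Same conversion as before, minus the analyzer's noise-bandwidth
# fudge factor. The correction value (typically a dB or two) comes
# from the instrument manual; the parameter name here is made up.
def marker_to_dbm_per_hz_corrected(marker_dbm: float, rbw_hz: float,
                                   noise_bw_correction_db: float) -> float:
    return marker_dbm - 10.0 * math.log10(rbw_hz) - noise_bw_correction_db

# Example: -50 dBm in a 1 kHz RBW with a 1.5 dB correction.
print(marker_to_dbm_per_hz_corrected(-50.0, 1000.0, 1.5))  # -81.5 dBm/Hz
```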