Well, many analog phase detectors have that "resolution", because they are...analog. The problem is that they have built-in errors. A simple microwave mixer with a DC-coupled IF port will give you a good phase detector output over a +/-45 degree range. There will be some built-in DC offset voltage (which changes with input power) that you have to calibrate out, so the "accuracy" is not as good as you want.
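As a rough illustration of that, here is a minimal sketch of turning a DC-coupled mixer IF voltage back into a phase, assuming the usual model near quadrature (v ≈ offset + slope·sin(Δφ)). The function name and the calibration numbers are made up for the example; the offset and slope would come from your own cal data.

```python
import math

def mixer_phase(v_if, v_offset, slope):
    """Recover the phase difference (radians) from a DC-coupled mixer
    IF voltage, assuming v_if ~= v_offset + slope * sin(delta_phi).
    Only trustworthy over roughly +/-45 degrees around quadrature."""
    x = (v_if - v_offset) / slope
    x = max(-1.0, min(1.0, x))  # clamp so noise can't push asin out of range
    return math.asin(x)

# example: 12 mV calibrated offset, 0.5 V/unit slope
phi = mixer_phase(0.112, 0.012, 0.5)
print(math.degrees(phi))  # small phase, well inside the linear region
```

Note that the offset itself drifts with input power, which is exactly why a single fixed calibration point is not enough for good accuracy.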
There are I/Q mixers that put out two signals that, combined, give you the phase difference over the full 360 degree range. Those mixers tend to have DC offsets on both the I and Q outputs, have different "gains" on the two outputs, and might have some non-monotonic effects due to RF reflections inside the device. All of these affect the accuracy.
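The I/Q combination is just a two-argument arctangent, with the offset and gain errors corrected first. A minimal sketch, where the correction values are hypothetical numbers a calibration routine would have to supply:

```python
import math

def iq_phase(i, q, i_off=0.0, q_off=0.0, i_gain=1.0, q_gain=1.0):
    """Full 360-degree phase (radians) from I/Q mixer outputs.
    The DC offsets and channel gains are calibration constants;
    atan2 handles all four quadrants without ambiguity."""
    return math.atan2((q - q_off) / q_gain, (i - i_off) / i_gain)

# example: ideal quadrature point (I = 0, Q positive)
print(math.degrees(iq_phase(0.0, 0.5)))  # -> 90.0
```

The non-monotonic reflection effects mentioned above are harder: they are phase-dependent, so a simple offset/gain correction like this cannot remove them.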
There are digital phase detectors that detect over a +/-360 degree range. They are more linear, but might have a dead zone somewhere in the range where the two input signals are at almost the same phase.
Products - Hittite Microwave Corporation
If you want to eliminate the errors, you will have to have some sort of calibration routine.
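The simplest such routine is a two-point calibration over the linear part of the detector's range: measure the output voltage at two known phase offsets, fit a line, then invert it. A sketch under that assumption (the function names and numbers are invented for illustration):

```python
def calibrate_two_point(v1, phi1, v2, phi2):
    """Two-point calibration for a detector that is roughly linear in
    its usable range: model v = offset + slope * phi and solve for
    (offset, slope) from two known phase settings phi1, phi2."""
    slope = (v2 - v1) / (phi2 - phi1)
    offset = v1 - slope * phi1
    return offset, slope

def apply_cal(v, offset, slope):
    """Convert a raw detector voltage back to a phase estimate."""
    return (v - offset) / slope

# example: detector read 0.125 V at 0 rad and 0.625 V at 1 rad
off, slp = calibrate_two_point(0.125, 0.0, 0.625, 1.0)
print(apply_cal(0.375, off, slp))  # -> 0.5
```

Since the offsets drift with input power and temperature, a real routine would repeat this over the operating conditions rather than once.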
You could take any of the above schemes, use frequency multipliers on both signals first, and then do the phase detection. For example, put an X8 frequency multiplier on each signal before the phase detector. You would have a smaller unambiguous phase detection range, but your resolution would look like it was 8 times greater.
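The trade-off is easy to see numerically: multiplying both inputs by N multiplies the phase difference by N, so the detector's reading divided by N has N-times finer resolution, but any true phase difference beyond +/-(180/N) degrees wraps and reads wrong. A sketch (function names are mine):

```python
def wrap(phi_deg):
    """Wrap an angle into (-180, 180] degrees."""
    return (phi_deg + 180.0) % 360.0 - 180.0

def detected_phase_after_xn(delta_phi_deg, n=8):
    """Phase the system reports after X-n frequency multiplication:
    the detector sees n * delta_phi (wrapped to one cycle), and we
    divide back by n. Unambiguous only for |delta_phi| < 180/n deg."""
    return wrap(n * delta_phi_deg) / n

print(detected_phase_after_xn(10.0))  # -> 10.0 (inside the +/-22.5 deg range)
print(detected_phase_after_xn(30.0))  # -> -15.0 (wrapped: ambiguous reading)
```

So with X8 multipliers the usable range shrinks to about +/-22.5 degrees of actual input phase difference.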
If you want far fewer errors, you will want to digitally sample the signals and do the phase detection in a DSP circuit. If the inputs are limited in bandwidth, you can downconvert the two signals with a common LO and digitally sample them at a lower rate.
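One common DSP approach, shown here as a sketch with synthetic data, is a complex cross-correlation at lag zero: multiply one record by the conjugate of the other, sum, and take the phase of the result. This assumes both channels were downconverted with a common LO and sampled coherently.

```python
import cmath
import math

def phase_difference(x, y):
    """Phase of record y relative to record x (radians), via a
    zero-lag complex cross-correlation. Averaging over the whole
    record suppresses noise on the per-sample phase."""
    acc = sum(yi * xi.conjugate() for xi, yi in zip(x, y))
    return cmath.phase(acc)

# synthetic example: two complex tones 30 degrees apart
n, f = 1000, 0.01  # samples, cycles per sample
x = [cmath.exp(2j * math.pi * f * k) for k in range(n)]
y = [cmath.exp(2j * math.pi * f * k + 1j * math.radians(30.0)) for k in range(n)]
print(math.degrees(phase_difference(x, y)))  # -> ~30.0
```

Because the common LO cancels out of the phase *difference*, its own phase noise mostly drops out, which is a big part of why this scheme is accurate.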