I'd call both of these "low".
The noise type matters some. Phase noise of equal absolute
jitter would bother the "high" rate more, since the same
jitter consumes or corrupts a larger percentage of the
shorter unit interval's eye-space.
Voltage noise probably bothers them equally. But again the
noise character matters; "glitches" might interfere only
probabilistically, when they happen to coincide with a clock
edge, and then the low-rate signal would have 4X better
odds of missing a given interfering glitch, since it has 4X
fewer edges per unit time for the glitch to land on.
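To put rough numbers on both effects, here's a quick sketch. The rates, jitter, and vulnerable-window width below are all assumed for illustration (the question's actual figures aren't in front of me); only the 4X ratio between the rates is taken from the comparison above.

```python
# Assumed numbers for illustration only -- not from the question.
low_rate = 1e6     # 1 Mbps, assumed "low" rate
high_rate = 4e6    # 4 Mbps, assumed "high" rate (4X faster)
jitter_s = 10e-9   # 10 ns of absolute jitter, assumed

# Effect 1: the same absolute jitter eats a larger fraction
# of the shorter unit interval (eye width in time).
for label, rate in [("low", low_rate), ("high", high_rate)]:
    ui = 1.0 / rate          # unit interval in seconds
    eaten = jitter_s / ui    # fraction of the eye consumed
    print(f"{label}: UI = {ui * 1e9:.0f} ns, jitter eats {eaten:.1%} of the eye")

# Effect 2: a glitch only hurts if it lands in the vulnerable
# window around a sampling edge.  With a fixed window per edge,
# the fraction of time spent "vulnerable" scales with edge
# density, i.e. with the data rate.
window_s = 2e-9              # assumed setup/hold window per edge
p_low = window_s * low_rate
p_high = window_s * high_rate
print(f"glitch hit odds ratio (high/low): {p_high / p_low:.0f}x")
```

Whatever the real numbers, the two ratios track the rate ratio: 4X the rate means the same jitter eats 4X the eye, and a random glitch has 4X the odds of hitting an edge.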
But this smells like a homework question, which wants
your professor's or textbook's opinion and not mine.