Please clarify what you would like to know. You ask for a "general rule of thumb about input ripple voltage." That could mean many things!
For example, a general rule of thumb about input ripple voltage is: it is bad, and you don't want it to exist.
Another general rule of thumb is that input ripple can either be almost completely rejected by the DC to DC converter, or pass straight through, depending on the frequency of the ripple relative to the converter's switching frequency and control bandwidth. If the ripple is at the converter's switching frequency, or above the converter's bandwidth, a lot of it gets through to the output. This is generally bad.

For example, you could have 1V of ripple at both 500Hz and 500kHz at the input: the 1V@500Hz could almost completely disappear at the output, while the 1V@500kHz could shoot straight through, or even be amplified. It depends not just on the ripple frequency but on the converter's switching frequency and its bandwidth. The bandwidth is roughly the speed at which the control loop can respond; if the bandwidth is 100kHz, anything faster than 100kHz is too fast for the loop to correct, so it travels through largely unimpeded (roughly speaking).
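To make that concrete, here is a minimal sketch in Python of an assumed first-order model: the feedback loop is treated as an integrator with a 100kHz unity-gain crossover, and the input ripple rejection is approximated as 1/|1+T(f)|. Real converters have more complicated input-rejection curves (and the 100kHz crossover is just an assumption here), so treat this as an illustration, not a design tool:

```python
def ripple_attenuation(f_ripple_hz, f_crossover_hz):
    """Rough input-ripple rejection of a regulated converter.

    Assumed model: loop gain T(f) = f_c / (j*f), i.e. an integrator
    with unity-gain crossover at f_c. Ripple that reaches the output
    is scaled by roughly 1 / |1 + T(f)|.
    """
    T = complex(0, -f_crossover_hz / f_ripple_hz)
    return 1.0 / abs(1 + T)

f_c = 100e3  # assumed 100kHz control bandwidth
for f in (500.0, 500e3):
    att = ripple_attenuation(f, f_c)
    print(f"{f / 1e3:g} kHz ripple: 1 V in -> {att:.3f} V out")
```

Under this model, the 1V@500Hz ripple comes out attenuated by a factor of about 200 (the loop has lots of gain to spare down there), while the 1V@500kHz ripple passes through at nearly full amplitude, which is the behavior described above.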
But there is no single rule of thumb for something as broad as "input ripple voltage when using a DC to DC converter." Your question is a little too vague to answer well--can you please try to be more precise?