OK, first: what are devices "A" and "B" — their types, part numbers, etc.? What are you using to measure the signal at the input of device "B"? If it's an oscilloscope, what kind of probe are you using? Do you know the input capacitance of device "B", the capacitance of the oscilloscope probe (if one is used), and both the source and sink current capability of device "A"'s output? What is your power supply, and what is the signal level? Is this an IC design, or are you building a circuit from components? Is the clock single-ended or differential?
There is a huge number of clock driver ICs you could use, e.g.
https://www.onsemi.com/PowerSolutions/parametrics.do?id=112
Generally speaking, your clock signal has to charge the input capacitance of device "B" from the low logic level (VL) to the high logic level (VH) and back to VL in less than one clock cycle, i.e. 1 ns. In other words, the charging and discharging times of device "B"'s input capacitance across the VL-VH-VL swing must together be shorter than the total clock cycle time. VH and VL are determined by device "B" and its power supply.
So the minimum output current capability required of device "A" will be:
I = 2fC(VH-VL)
This assumes the sink and source currents are equal. Note that you need to go higher than this figure to leave some margin for other sources of error.
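To make the formula concrete, here is a minimal sketch of the drive-current estimate. The component values (1 GHz clock, 5 pF total load, 1.8 V swing) are illustrative assumptions, not values from your circuit.

```python
def min_drive_current(f_hz: float, c_farads: float, v_swing: float) -> float:
    """Minimum output current I = 2*f*C*(VH - VL), assuming equal
    source/sink capability and a full voltage swing each half-cycle."""
    return 2.0 * f_hz * c_farads * v_swing

# Example (assumed values): 1 GHz clock, 5 pF load (input + probe), 1.8 V swing.
i_min = min_drive_current(1e9, 5e-12, 1.8)
print(f"Minimum drive current: {i_min * 1e3:.1f} mA")  # -> 18.0 mA
```

Even a few pF of load at 1 GHz already demands tens of milliamps, which is why probe capacitance matters so much here.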
You can easily calculate the separate sink and source current requirements, and you can measure the total capacitance (probe plus device "B" input) from dU, dt, and the separately measured output current capability of device "A".
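The capacitance measurement above rearranges the same charge relation: with a known constant drive current I, C = I·dt/dU. A sketch with assumed example numbers:

```python
def capacitance_from_slew(i_amps: float, du_volts: float, dt_seconds: float) -> float:
    """Estimate total load capacitance from C = I * dt / dU, where dt is the
    measured transition time over voltage step dU at known drive current I."""
    return i_amps * dt_seconds / du_volts

# Example (assumed values): 10 mA drive slews 1.8 V in 900 ps.
c_total = capacitance_from_slew(10e-3, 1.8, 900e-12)
print(f"Total load capacitance: {c_total * 1e12:.1f} pF")  # -> 5.0 pF
```

Remember that this gives probe plus input capacitance together; subtract the probe's rated capacitance to isolate device "B".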
So, if this is still not clear, don't spare your fingers on the keyboard: post as much information as you have that might be relevant.