I think the previous answer is a good explanation of the two types of source.
A voltage or current source implies that the named quantity is held constant.
An example may help: take a 3 W power LED, which has become very common in recent years. As you can see from the attached graph, a small change in voltage produces a fairly large change in LED current; a change of only about 100-150 mV makes the difference between 0.5 A and 1 A.
If you measure a specific LED and set a voltage regulator to the exact voltage that gives 700 mA, a second sample of the same LED will draw a different current, higher or lower, because of production tolerance or temperature differences.
Devices like this are best driven by a constant current source: you specify that the load should draw 700 mA, and the source adjusts its output voltage to whatever the load requires so that the current is always 700 mA.
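A quick sketch of why the curve is so steep, using the textbook Shockley diode equation with made-up parameters (I_S and N_VT below are hypothetical values chosen to roughly mimic a power LED, not measurements of any real part):

```python
import math

# Shockley diode model: I = Is * (exp(V / (n*Vt)) - 1)
# Is and n*Vt are illustrative, made-up values, not data from a real LED.
I_S = 1e-12    # saturation current, A (hypothetical)
N_VT = 0.144   # ideality factor times thermal voltage, V (hypothetical)

def led_current(v):
    """Current through the modeled LED at forward voltage v."""
    return I_S * (math.exp(v / N_VT) - 1.0)

def forward_voltage(i):
    """Invert the model: forward voltage needed for current i."""
    return N_VT * math.log(i / I_S + 1.0)

# Find the voltage that gives 0.5 A, then bump it by just 100 mV.
v_half = forward_voltage(0.5)
print(f"V for 0.5 A:   {v_half:.3f} V")
print(f"I at +100 mV:  {led_current(v_half + 0.100):.2f} A")
```

With these made-up numbers, an extra 100 mV roughly doubles the current from 0.5 A to about 1 A, which matches the behavior described above and shows why regulating voltage alone is a poor way to set LED current.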
On the other hand, consider the reference voltage for an A/D converter: what you need there is a very stable voltage, so you use a constant voltage source.
Alex