Do you mean, when you draw more current from a power source, why does the voltage drop?
That is because the source (like everything) has internal resistance. When current flows, some of the source's voltage is needed to push that current through the internal resistance. So part of the voltage is 'dropped' across the internal resistance, and what is left appears at the output terminals.
As a simple (if not entirely realistic) example, imagine a 12 V battery with a 1 Ω internal resistance. If your circuit tries to draw 2 amperes from the battery, then by Ohm's law the voltage dropped across the internal resistance must be V = IR = 2 A × 1 Ω = 2 V. So 2 volts is 'lost' across the internal resistance, and you are left with just 10 volts at the battery terminals for your circuit.
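If it helps to see that arithmetic generalized, here is a minimal Python sketch (the function and variable names are just illustrative) that computes the terminal voltage for a given EMF, internal resistance, and load current:

```python
def terminal_voltage(emf, r_internal, i_load):
    """Terminal voltage of a source with internal resistance:
    the EMF minus the Ohm's-law drop across that resistance."""
    return emf - i_load * r_internal

# The example above: 12 V battery, 1 ohm internal resistance, 2 A drawn
print(terminal_voltage(12.0, 1.0, 2.0))  # -> 10.0 (volts)
```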
In reality, it's a bit more complicated, because the lower voltage will usually make your circuit draw less current, especially if it is a simple load like another resistor. In that case the internal resistance and your load resistance act as a potential divider, so the terminal voltage works out to V_out = EMF × R_load / (R_load + R_internal).
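To make that concrete, here is a small sketch of the resistive-load case (again with illustrative names), where the divider formula gives the terminal voltage directly and the actual current follows from it:

```python
def terminal_voltage_resistive_load(emf, r_internal, r_load):
    """Terminal voltage when the load is a plain resistor: the internal
    resistance and the load form a potential divider."""
    return emf * r_load / (r_load + r_internal)

# Same 12 V battery with 1 ohm internal resistance, driving a 5 ohm load
v = terminal_voltage_resistive_load(12.0, 1.0, 5.0)
print(v)        # -> 10.0 (volts at the terminals)
print(v / 5.0)  # -> 2.0  (amperes actually drawn, not the naive 12/5 = 2.4)
```

Note how this agrees with the earlier example: a 5 Ω load on this battery settles at 2 A and 10 V, rather than the 2.4 A you would naively compute from the full 12 V.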