There is another reason: efficiency.
Traditional (linear) regulators first convert AC to DC, then lower that DC voltage down to a value suitable for the equipment.
For example, if the output of the transformer, after the diode/capacitor stage, is 10V DC and you need to lower this to 5V DC, half of the power will be lost in that conversion. So if the device draws 1 amp @ 5V, it consumes 5 watts, which means another 5 watts is lost during the conversion from 10V to 5V (dissipated as heat by a heat sink).
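To put numbers on this, here is a minimal Python sketch of the same arithmetic, assuming an ideal linear regulator (a real one also wastes a little power in its own circuitry; the 10V/5V/1A figures are just the example values from above):

```python
def linear_regulator(v_in, v_out, i_load):
    """Power budget of an idealized linear regulator."""
    p_load = v_out * i_load             # power delivered to the device
    p_waste = (v_in - v_out) * i_load   # power burned off as heat
    efficiency = p_load / (p_load + p_waste)
    return p_load, p_waste, efficiency

p_load, p_waste, eff = linear_regulator(v_in=10.0, v_out=5.0, i_load=1.0)
print(f"delivered: {p_load:.1f} W, wasted as heat: {p_waste:.1f} W, "
      f"efficiency: {eff:.0%}")
# -> delivered: 5.0 W, wasted as heat: 5.0 W, efficiency: 50%
```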
If you were to take 120V or 230V AC and convert it to DC directly, without a transformer, the loss when lowering it to 5V would be enormous. In the end, you would be paying your electricity bill mostly for heating your room instead of powering your device.
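Here is the same calculation for that hypothetical worst case, assuming the diode/capacitor stage charges up to roughly the AC peak (120V × √2 ≈ 170V; the 1A load is carried over from the example above):

```python
import math

# Rectify 120V AC directly and drop it to 5V with a linear regulator.
v_in = 120 * math.sqrt(2)    # ~170V DC after the diode/capacitor stage
v_out, i_load = 5.0, 1.0

p_load = v_out * i_load              # 5W reaches the device
p_waste = (v_in - v_out) * i_load    # ~165W goes into the heat sink
print(f"wasted: {p_waste:.0f} W, efficiency: {p_load / (v_in * i_load):.1%}")
# -> wasted: 165 W, efficiency: 2.9%
```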
Newer power supplies use switching technology. It is far more complex, which is why you don't see it as often; it costs more, but it is much more efficient. The idea is that instead of 'dumping' the extra power as heat while lowering the voltage, a different process is used.

The process is hard to explain without a schematic to demonstrate it, but it relies on the electromagnetic properties of an inductor coil to regulate the voltage. When the transistor is 'active', the voltage rises at the output and, at the same time, an electromagnetic field builds up around the inductor coil. When the transistor is 'inactive', the energy stored in that field collapses and keeps current flowing through the inductor coil. Thus, even though the input DC voltage is applied to the regulator only periodically (usually at a frequency in the 40kHz to 1~2MHz range), current flows through the regulator constantly. Much less energy is lost because the excess power isn't dissipated as heat: it is continuously stored in the form of an electromagnetic field and then recycled.
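To make the on/off cycle concrete, here is a toy time-stepped model of an idealized buck converter (the most common form of this circuit). It is only a sketch: the component values, switching frequency, and load below are my own illustration choices, not from the original text, and real converters have losses this model ignores.

```python
V_IN = 10.0     # input DC voltage
DUTY = 0.5      # fraction of each cycle the transistor is 'active'
F_SW = 100e3    # switching frequency (within the range mentioned above)
L = 100e-6      # inductor, henries
C = 100e-6      # output capacitor, farads
R_LOAD = 5.0    # load resistance, ohms

dt = 1.0 / F_SW / 100    # 100 simulation steps per switching cycle
i_L, v_out, t = 0.0, 0.0, 0.0
while t < 5e-3:                       # simulate 5ms (500 cycles)
    on = (t * F_SW) % 1.0 < DUTY      # transistor state at this instant
    v_sw = V_IN if on else 0.0        # voltage fed into the inductor
    # When 'on', the field builds up (current rises); when 'off', the
    # collapsing field keeps current flowing through the inductor.
    i_L += (v_sw - v_out) / L * dt
    i_L = max(i_L, 0.0)               # freewheel diode blocks reverse flow
    v_out += (i_L - v_out / R_LOAD) / C * dt
    t += dt

print(f"steady-state output: {v_out:.2f} V (ideal: duty * V_in = "
      f"{DUTY * V_IN:.1f} V)")
```

In the ideal case the average output is simply the duty cycle times the input voltage, which is why the controller regulates the output by adjusting how long the transistor stays 'active' in each cycle rather than by burning off the difference as heat.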