Most cell phone chargers are rated 5 V or 5.1 V, but the current rating varies from 500 mA to 2 A. Is the charge current determined by the wall-mount charger or by the internal circuitry of the cell phone? I believe the battery's charge current is determined by the charge-management circuitry inside the phone; is that correct?
Nowadays the "charger" is usually a USB-style supply that delivers a constant voltage.
The voltage is usually 5 V, although according to the USB 3 standard it may be higher; initially, however, it is always 5 V.
The charger tells the phone its capabilities.
But the actual charging circuit is in the phone. It controls the charging current as well as the input current limit.
That is why, with the same phone, the charging time varies from one charger to another.
Regarding the charger telling the phone its current capability, how exactly is this done? I assume via the USB data lines?
What is the easiest way to broadcast the current capability? It seems you would need a small µC with a USB interface that would have to be programmed, unless there are dedicated ICs that do this?
With pre-USB-3.0 chargers it is done by the fixed state of the data lines.
On a charger the data lines are usually unused for data. A dedicated charging port (DCP) shorts the two data lines together so that the USB device can recognize it.
With USB 3.0 it is done with a protocol (if I remember right).