That depends on the external circuit. The current you want to measure must flow through a resistor so it generates a potential difference across its terminals. Something to take into account is that the voltage across the resistor is not referenced to ground (GND), so you must use a differential amplifier and then measure its output with the ADC.
The resistor value depends on the current you want to measure, but it shouldn't be above 1 ohm (remember that current is measured in series with the circuit, so a high resistance could make the circuit behave differently).
I think an instrumentation amplifier should be used because of its high input impedance and precision.
Then, whatever you get from the A/D conversion will be directly proportional to the current flowing through the resistor. Knowing the resistor value, you just apply the simple and well-known formula: I = V/R.
If your resistor is 1 ohm and you get 1023 (full scale) from the A/D conversion, then 5 V (if VREF+ = 5 V) must be present across the resistor terminals, and doing the math you get a 5 A current through the resistor.
I know you won't measure such a high current, but this is just to show how the sense resistor relates to the reading.
Think about the measuring range so you can adjust the reference voltages to get the most accurate measurement. A 10-bit conversion is better than an 8-bit one because you get more possible values (finer resolution), but it depends on your needs.
As for the display part, it's up to you: 7-segment displays, an LCD, a PC (via USB or RS-232), a GLCD, etc.