Hi,
I once did a project that needed this kind of conversion: a 24-bit binary value converted to decimal, which was later converted to ASCII for display on an LCD. Can someone elaborate the algorithm rather than give code links, since an algorithm can be adapted for any processor family (PIC, AVR or 8051)? I tried Booth's algorithm for this but couldn't make much sense of it!
It's easy to convert binary to decimal with paper and pencil, since the number is expressed as a sum of successive powers 2^n, where n runs from 0 to 23 (in the case of a 24-bit number).
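
For reference, here is a minimal C sketch of one common approach: repeated division by 10, where the remainder of each division is the next decimal digit, least significant first. The function name u24_to_decimal_ascii is just a placeholder of mine, not from any particular library:

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Convert a 24-bit unsigned value (0..16777215) to an ASCII decimal
 * string by repeated division by 10. Digits come out least significant
 * first, so they are written from the end of the buffer backwards.
 * buf must hold at least 9 bytes (8 digits max + terminating NUL). */
static void u24_to_decimal_ascii(uint32_t value, char buf[9])
{
    char *p = buf + 8;
    *p = '\0';                            /* terminate the string */
    do {
        *--p = (char)('0' + value % 10);  /* remainder -> ASCII digit */
        value /= 10;
    } while (value != 0);
    memmove(buf, p, (size_t)(buf + 9 - p)); /* shift digits to start */
}

int main(void)
{
    char text[9];
    u24_to_decimal_ascii(0xFFFFFF, text); /* largest 24-bit value */
    printf("%s\n", text);                 /* prints 16777215 */
    return 0;
}
```

On parts without a hardware divide instruction (the 8051 or small PICs), the same job is usually done with the "double dabble" (shift-and-add-3) algorithm instead, which needs only shifts, adds, and compares.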
Asimov