Convert a 16-bit binary input number to BCD and show it on an LCD (8051).
I have assembly code from the internet that works fine, but I do not understand the math behind it, and I found nothing online explaining the math of 16-bit binary to BCD conversion.
I did find a link explaining 8-bit binary to BCD conversion, which says that the 16-bit and 32-bit conversions follow the same pattern. While the 8-bit conversion is easy to understand, I find it difficult to see the pattern for 16-bit and 32-bit numbers.
You can find a more verbose explanation of the so-called "double dabble" algorithm, including a C implementation, here: https://en.wikipedia.org/wiki/Double_dabble
It works for any word width. The core idea: shift the binary number left, one bit at a time, into a register of BCD digits; before each shift, add 3 to any BCD digit that is 5 or more, so that the doubling caused by the shift carries correctly into the next decimal digit (a digit of 5 would double to 10, and adding 3 first makes it 8, which doubles to 16 = 0x10, pushing a carry into the next nibble).