scorrpeio
Full Member level 5
I have successfully written a function that converts a number to its ASCII string for display on the LCD screen.
But this function works only for integers; I don't know how to convert a floating-point number to its equivalent ASCII format.
One solution I thought of is multiplying the number by 10, converting it to integer format, and then passing it to the NumToASCII function.
But it still doesn't work properly, because...
The converted ASCII string is stored in a global array, starting at index 0.
So 5.3 is stored in the array as ['5', '3', NULL]; afterwards I can insert a decimal point and make it
['5', '.', '3', NULL]
But then 35 is represented as ['3', '5', '0', NULL]; the trailing '0' is there because the number was multiplied by 10.
Now my function cannot place the decimal point accurately. In this case:
If I pass 2345 to the above function, the destination array holds ['2','3','4','5', 0], which is right.
If I pass (2.3 * 10) to the above function, the destination array holds ['2','3', 0]. I then shift the array elements one position to the right and place the decimal point at A[1].
If I pass (35 * 10) to the above function, the destination array holds ['3', '5', '0', 0]. Now, if I put the decimal point at A[1], it changes the meaning of the number on the display, which is what I don't want.
Any solution??
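For what it's worth, the ambiguity described above disappears if every value is scaled by the same factor and the decimal point is always inserted before the last digit rather than at A[1]: 35 scales to 350 and prints as "35.0", while 2.3 scales to 23 and prints as "2.3". A minimal sketch of that idea (NumToASCII_Fixed1 is a hypothetical name, and plain C types are used here instead of the UINT8_T/UINT32_T typedefs from the post):

```c
#include <assert.h>
#include <string.h>

/* Sketch, not the posted code: the caller scales every value by 10, and the
 * '.' is ALWAYS inserted before the last digit. Because the scale factor is
 * the same for all values, 350 -> "35.0" and 23 -> "2.3", so the point can
 * never land in the wrong place. */
static void NumToASCII_Fixed1(unsigned long scaled, char *dest)
{
    char tmp[16];
    int len = 0, i;

    /* collect digits least-significant first */
    do {
        tmp[len++] = (char)('0' + (scaled % 10));
        scaled /= 10;
    } while (scaled);

    /* a lone digit needs a leading zero, e.g. 5 -> "0.5" */
    if (len == 1)
        tmp[len++] = '0';

    /* write back most-significant first, inserting '.' before the last digit */
    for (i = 0; i < len; i++) {
        *dest++ = tmp[len - 1 - i];
        if (i == len - 2)
            *dest++ = '.';
    }
    *dest = '\0';
}
```

With this, the 35 case from above no longer collides with 3.5: passing 350 yields "35.0" and passing 35 yields "3.5".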
---------- Post added at 07:58 ---------- Previous post was at 07:57 ----------
#define DEF_ASCII_VALUE_0 0x30
Code:
void Display_NumToASCII ( volatile UINT8_T u8Value, volatile UINT32_T u32Value,
                          UINT8_T *ptrDest)
{
    //Values wider than 10 digits are passed as "u8Value u32Value",
    //which effectively gives 13 digits
    //Float variables are multiplied by a factor of 10 or 100 beforehand to shift
    //the digits at the right of the decimal point to its left
    //That scaling is done in the respective variable's parent .c file
    //For example, 123.45 is multiplied by 100: 123.45 * 100 = 12345
    //This value is then converted to ASCII and stored in the array
    //While retrieving it for the LCD, a decimal point is inserted at the required position
    UINT32_T u32Divisor;
    UINT8_T u8Started = 0;  //set once a digit is emitted, so inner zeros are kept

    if(u8Value)
    {
        //the three high digits; note >= rather than >, so e.g. 100 prints correctly
        if (u8Value >= 100)
        {
            *ptrDest++ = ((u8Value/100) | DEF_ASCII_VALUE_0);
            u8Value %= 100;
            u8Started = 1;
        }
        if ((u8Value >= 10) || u8Started)
        {
            *ptrDest++ = ((u8Value/10) | DEF_ASCII_VALUE_0);
            u8Value %= 10;
        }
        *ptrDest++ = (u8Value | DEF_ASCII_VALUE_0);
        u8Started = 1;  //all ten low digits must now be printed, zeros included
    }
    //the ten low digits, highest divisor (10th digit) first
    for (u32Divisor = 1000000000UL; u32Divisor >= 10; u32Divisor /= 10)
    {
        if (u8Started || (u32Value >= u32Divisor))
        {
            *ptrDest++ = (((UINT8_T)(u32Value/u32Divisor)) | DEF_ASCII_VALUE_0);
            u32Value %= u32Divisor;
            u8Started = 1;
        }
    }
    *ptrDest++ = (((UINT8_T)u32Value) | DEF_ASCII_VALUE_0);  //ones digit, always printed
    *ptrDest = DEF_NULL;
    //u32ErrorCode |= 0x0200;
}
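The comment about inserting a decimal point while retrieving the string for the LCD could be done along these lines. InsertDecimalPoint is a hypothetical helper, not part of the posted code, written with plain char/size_t instead of the UINT8_T typedef; `ndec` is how many digits should end up to the right of the point, which is known because the caller chose the scale factor (10 gives ndec = 1, 100 gives ndec = 2):

```c
#include <assert.h>
#include <string.h>

/* Hypothetical helper: after the conversion has filled `buf` with the scaled
 * digits, shift the last `ndec` digits (and the NUL) one cell right and drop
 * a '.' into the gap, so "12345" with ndec = 2 becomes "123.45".
 * Assumes strlen(buf) > ndec and that buf has one spare cell. */
static void InsertDecimalPoint(char *buf, size_t ndec)
{
    size_t len = strlen(buf);
    size_t pos = len - ndec;          /* index where the '.' goes */
    size_t i;

    for (i = len + 1; i > pos; i--)   /* move tail, including the NUL, right */
        buf[i] = buf[i - 1];
    buf[pos] = '.';
}
```

Placing the point relative to the end of the string, rather than at a fixed A[1], is what keeps "350" (scaled 35) from being misread as "3.50".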