I have an HMI which sends two 8-bit chars that represent a 16-bit touch screen coordinate.
So, as an example, the HMI sends:
0x03 then 0x0E (separately), which together represent the coordinate 0x030E (782).
How can I combine the two into a single variable to represent the number 782? I was thinking of creating an 'int' and shifting each byte into it somehow.
i.e.
char first = 0x03;
char second = 0x0E;
int number;
number=first;
number=number<<8;
---Now I'm stuck---
It's better if you declare everything unsigned (unsigned char and unsigned int), both in the declarations and in the casts... so
Code:
unsigned char first = 0x03;
unsigned char second = 0x0E;
unsigned int number;
...
number = (unsigned int)first << 8 | second;
will cause fewer problems...
{A problem arises when you cast a signed char to int: a value like 0xE0 for 'second' will be sign-extended to 0xFFE0 instead of 0x00E0... try it!}
Not every compiler handles it correctly, but the specification requires the usual arithmetic conversions for bitwise OR. Thank you for the correction.
You can store the first byte (low byte) at the base address of the int variable, then store the second byte at (base address + 1), then use the 16-bit integer. Note that which address holds the low byte depends on the target's endianness.
use these macros