hello
1 KB = 1000 bytes (decimal) and 1 KB = 1024 bytes (binary). What is the difference, and when should each be used? And does the same apply to bits?
Regards
Thanks man, I think this is the answer: 2^10 bytes = 1 KB (binary) and 10^3 bytes = 1 KB (decimal).
In summary, 1 KB is either 1000 bytes or 1024 bytes depending on the base used: if decimal (base 10), 1 KB = 1000 bytes; if binary (base 2), 1 KB = 1024 bytes.
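If it helps to see the two conventions side by side, here is a minimal Python sketch (the constants and the `to_kilobytes` helper are just illustrative names, not from any standard library):

```python
# Decimal (SI) kilobyte: 10^3 bytes
KB_DECIMAL = 10 ** 3   # 1000
# Binary kilobyte: 2^10 bytes
KB_BINARY = 2 ** 10    # 1024

def to_kilobytes(num_bytes, binary=True):
    """Convert a byte count to kilobytes using either convention."""
    divisor = KB_BINARY if binary else KB_DECIMAL
    return num_bytes / divisor

print(to_kilobytes(65536))                # 64.0   (binary: 65536 / 1024)
print(to_kilobytes(65536, binary=False))  # 65.536 (decimal: 65536 / 1000)
```

Same byte count, two different "kilobyte" values, which is exactly where the confusion comes from.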
1 KB = 1024 bytes only... it is not a matter of either 1000 or 1024.
Since the difference between the two is comparatively small, the prefix was simply reused. If you insisted on counting the extra 24 bytes, you would have to come up with a new name for 1024, so to keep things easy, 1024 bytes is referred to as 1 KB.
In many areas of the PC, only binary measures are used. For example, "64 MB of system RAM" always means 64 times 1,048,576 bytes of RAM, never 64,000,000. In other areas, only decimal measures are found: a "28.8K modem" works at a maximum speed of 28,800 bits per second, not 29,491.
Storage devices, however, are where the real confusion comes in. Some companies and software packages use binary megabytes and gigabytes, and some use decimal megabytes and gigabytes.
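This is why a hard drive often looks "smaller" once it is plugged in: the manufacturer quotes decimal gigabytes while the operating system typically reports binary ones. A quick Python sketch of that mismatch (the 500 GB figure is just an example, not a claim about any particular drive):

```python
# A drive sold as "500 GB" (decimal gigabytes, 10^9 bytes each)
advertised_bytes = 500 * 10 ** 9

# Many operating systems divide by binary gigabytes (2^30 bytes each)
reported_gb = advertised_bytes / 2 ** 30

print("Advertised: 500 GB")
print(f"Reported by the OS: {reported_gb:.2f} GB")  # about 465.66
```

Nothing is "missing" from the drive; the two numbers are just counting the same bytes with different units.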
The IEEE has proposed a new naming convention for the binary numbers, to hopefully eliminate some of the confusion: kibibyte (KiB) for 2^10 bytes, mebibyte (MiB) for 2^20 bytes, and gibibyte (GiB) for 2^30 bytes.
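To see how those prefixes line up in practice, here is a rough sketch; the `format_size` helper and the unit tables are hypothetical, not part of any standard library:

```python
SI_UNITS  = [("KB", 10 ** 3), ("MB", 10 ** 6), ("GB", 10 ** 9)]
IEC_UNITS = [("KiB", 2 ** 10), ("MiB", 2 ** 20), ("GiB", 2 ** 30)]

def format_size(num_bytes, units):
    """Pick the largest unit that fits and format the value."""
    name, factor = units[0]
    for unit_name, unit_factor in units:
        if num_bytes >= unit_factor:
            name, factor = unit_name, unit_factor
    return f"{num_bytes / factor:.2f} {name}"

ram = 64 * 2 ** 20                    # "64 MB of RAM" is a binary measure
print(format_size(ram, SI_UNITS))     # 67.11 MB  (decimal view)
print(format_size(ram, IEC_UNITS))    # 64.00 MiB (binary view)
```

With the binary prefixes, "64 MiB" is unambiguous, while plain "64 MB" could mean either value depending on who wrote it.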