ASCII codes
Most computer systems use a coding system known as ASCII (American Standard Code for Information Interchange) or something very similar to it. All text input to and output from the computer uses these codes. For instance, when you press a capital “A” on the keyboard a binary pattern of 01000001 (65 in decimal) is produced. A “B” is the next binary pattern up, i.e. 01000010 (66 in decimal). Confusingly, the character for the digit 0 is represented by 00110000 (48 in decimal), not by the binary value zero, and so on.
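These values are easy to check. The short sketch below uses Python's built-in ord(), chr() and format() functions to convert between characters, their decimal codes and their binary patterns:

    print(ord('A'))                 # 65  -> binary 01000001
    print(ord('B'))                 # 66  -> binary 01000010
    print(ord('0'))                 # 48  -> the digit character, not the value zero
    print(chr(66))                  # 'B' -- converting a code back to its character
    print(format(ord('A'), '08b'))  # '01000001' -- the 8-bit binary pattern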
Many ASCII codes are historical. For instance, ASCII code 7 (the “bell” character) would originally have rung the bell on a teletype; on more modern computers it is translated into a little “beep”.
There is another standard, known as ANSI, which is used by some systems. Many of its codes are exactly the same as ASCII, e.g. the alphabetic characters.
Extended Binary Coded Decimal Interchange Code (EBCDIC)
This was another method of coding data, invented by IBM in the early 1960s from its punched-card data format. To call it a standard is not really correct, since there are at least six different versions and it is not used by anyone but IBM. In the computer industry it was suspected that IBM developed it as a way of “locking” existing customers into its own technology. This should not be mentioned on a modern computing syllabus – but it is.
Unicode
The original ASCII was only a 7-bit standard (usually handled as 8-bit bytes) and, not surprisingly, biased towards the English-speaking (or American) world. Character sets for other countries' languages were largely ignored. To get around this, a 16-bit character set standard, designed and maintained by a non-profit consortium (the Unicode Consortium), was developed.
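In Python, ord() returns a character's Unicode code point, so it is easy to see that characters from other writing systems fall well outside the old 8-bit range:

    print(ord('A'))   # 65    -- the same value as ASCII
    print(ord('é'))   # 233   -- still fits in 8 bits
    print(ord('Ω'))   # 937   -- Greek capital omega, beyond 8 bits
    print(ord('日'))  # 26085 -- a CJK character, needing 16 bits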
Originally Unicode was designed to
be universal, unique, and uniform, i.e., the code was to cover all major modern
written languages (universal), each character was to have exactly one encoding
(unique), and each character was to be represented by a fixed width in bits
(uniform).
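A short sketch shows the “unique” goal in practice: each character has exactly one code point. Note, though, that widely used encodings such as UTF-8 store those code points in a variable number of bytes rather than at a fixed width:

    for ch in ['A', 'é', '日']:
        print(ch, ord(ch), len(ch.encode('utf-8')))
    # A   65     1 byte in UTF-8
    # é   233    2 bytes in UTF-8
    # 日  26085  3 bytes in UTF-8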