1. Data Representation In Computer
- Data Representation
A computer processes data, and data becomes information when it is presented in a format that people can understand and use. The format is defined as the physical arrangement of elements on a document's page. Computers take instructions for processing; the instructions that tell a computer how to carry out processing tasks are referred to as computer programs.
While processing, the computer converts all the data entered by the user (i.e. in decimal format) into electrical switch states. A switch has only two states, "ON" and "OFF", which take only the two values 1 and 0, known as binary values. To represent any number greater than 1, combinations of binary digits are used.
Each switch state in the computer, whether 1 or 0, is known as a bit. A bit is the smallest possible unit of data.
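The conversion from a decimal value to the binary digits a computer actually stores can be sketched as repeated division by 2. The helper name `to_binary` below is illustrative, not from the original text:

```python
# Sketch: converting a decimal value into the binary digits
# (ON/OFF switch states) a computer uses internally.
def to_binary(n: int) -> str:
    """Return the binary representation of a non-negative integer."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # remainder is the next (lowest) bit
        n //= 2
    return "".join(reversed(bits))

print(to_binary(13))   # the decimal number 13 in binary
print(bin(13)[2:])     # Python's built-in conversion, for comparison
```

Running this shows that decimal 13 becomes the four binary digits 1101, matching Python's built-in `bin()`.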
After a bit, the next larger unit is the byte.
1 Byte = 8 Bits
With one byte, the computer can represent up to 256 different values including all the letters, numbers, and other symbols.
1 Kilobyte (KB) = 1024 Bytes
1 Megabyte (MB) = 1024 Kilobytes
1 Gigabyte (GB) = 1024 Megabytes
1 Terabyte (TB) = 1024 Gigabytes
1 Petabyte = 1024 Terabyte
1 Exabyte = 1024 Petabyte
1 Zettabyte = 1024 Exabyte
1 Yottabyte = 1024 Zettabyte
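Since each unit above is 1024 times the previous one, the whole ladder can be generated from powers of 1024. A small sketch (the `units` list is just for illustration):

```python
# Sketch: each storage unit is 1024x the one before it,
# so unit N equals 1024**N bytes.
units = ["Byte", "Kilobyte", "Megabyte", "Gigabyte", "Terabyte",
         "Petabyte", "Exabyte", "Zettabyte", "Yottabyte"]
for power, name in enumerate(units):
    print(f"1 {name} = {1024 ** power} Bytes")
```

For example, 1 Megabyte works out to 1024 × 1024 = 1,048,576 bytes.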
- Text Codes
The most popular systems developed to represent numbers, characters, punctuation marks, and special symbols are:
1. EBCDIC
2. ASCII
3. UNICODE
- EBCDIC
Stands for Extended Binary Coded Decimal Interchange Code. It was developed by IBM. EBCDIC is an 8-bit code that defines 256 symbols.
- ASCII
Stands for American Standard Code for Information Interchange. Standard ASCII is a 7-bit code that defines the 128 characters numbered 0 to 127; codes 128-255 belong to various extended (8-bit) character sets. The ranges are:
0-31 ⟶ Control characters
32-47, 58-64, 91-96, 123-127 ⟶ Punctuation and special symbols
48-57 ⟶ Digits 0-9
65-90 ⟶ Uppercase alphabets
97-122 ⟶ Lowercase alphabets
128-255 ⟶ Extended character sets (vary by code page)
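The ASCII ranges above can be checked directly with Python's `ord()` and `chr()` functions, which map between characters and their numeric codes:

```python
# Sketch: inspecting ASCII codes with ord() and chr().
print(ord("A"))   # uppercase letters start at 65
print(ord("a"))   # lowercase letters start at 97
print(chr(48))    # code 48 is the digit character "0"

# Uppercase and lowercase codes differ by exactly 32,
# i.e. a single bit in the 7-bit code:
print(chr(ord("A") + 32))   # yields "a"
```

This also shows why case conversion in ASCII is cheap: it flips one bit.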
- UNICODE
It was originally designed as a 2-byte (16-bit) code, giving a range of 65,536 different characters or symbols. Modern Unicode extends well beyond this, with over a million possible code points covering the scripts of most of the world's languages.
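Python strings are Unicode, so the code point of a character and its byte encodings can be inspected directly. The Euro sign is used here purely as an example of a character outside ASCII's 0-127 range:

```python
# Sketch: a Unicode character, its code point, and its encodings.
ch = "€"                        # Euro sign, not representable in ASCII
print(ord(ch))                  # its Unicode code point (8364)
print(ch.encode("utf-16-le"))   # two bytes: one 16-bit code unit
print(ch.encode("utf-8"))       # three bytes in the UTF-8 encoding
```

The same character thus occupies a different number of bytes depending on the encoding chosen, which is why Unicode separates code points from their byte representations.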