1.2. AMOUNT OF INFORMATION



A message may or may not contain information. For example, suppose person 'A' watches person 'B' throwing stones into the air. Person 'B' tells person 'A' that a stone fell to the ground. This message brings person 'A' no new information about the event, because there is no doubt that a thrown stone falls to the ground. The outcome of the event is 100% known and certain to person 'A', because it cannot be otherwise. Such a message contains no information. The amount of information in it is equal to zero.

But if person 'A' flips a coin into the air and tells person 'B' that the coin fell to the ground on one of its sides, that message also contains no information, because it does not remove person 'B's doubt about the result. It is 50% possible that the coin falls on one side and 50% possible that it falls on the other. However, the message that the coin fell and showed 'number' does describe the outcome of the event and therefore contains information. The amount of information in that message is greater than zero.

A message that describes the exact outcome of an uncertain event contains a certain amount of information.


Example I

When a die is thrown and a roulette ball is spun, the players tell the onlookers the following messages:

A.) the die shows one of the six numbers,

B.) the die shows the number three,

C.) the die shows the number seven,

D.) the ball stopped on one of the numbers,

E.) the ball stopped on the number seven.

The individual messages can be described as follows:


A.) The message contains no information because the thrown die will certainly show one of its numbers. The event described in this way is 100% certain and not uncertain at all. The probability of showing any number is:

        p = 1 (100%)

B.) The message contains information because it removes the uncertainty about the result of the throw. Since the die has six equal sides, the probability of showing this particular number is:

        p = 1/6 (16.66%)

C.) The message describes an impossible event, since a die has no side with the number seven. In this case the message is a DISINFORMATION.


D.) The message contains no information because it is certain that the ball will stop on a field with one of the numbers. By its description the event is identical to event A.).


E.) Since the roulette wheel has a total of 37 fields, the stopping of the ball on the specified number is one of 37 possible outcomes of the event. The probability of showing one particular number is:

        p = 1/37 (2.70%)

This example shows that the uncertainty of an event affects the amount of information contained in the message. The amount of information is larger the more uncertain the event is. Knowing the outcome of an event removes doubt, that is, the UNCERTAINTY of the event - its ENTROPY. Increasing the amount of information reduces the entropy.


Example II

Three politicians walk one behind another in a column. The first politician says: 'There is no politician in front of me, and behind me there are two politicians.' The second politician says: 'In front of me there is one politician and behind me there is one politician.' The third politician says: 'In front of me there are two politicians and behind me there are two politicians.' How is this possible? To what extent do these messages accurately describe the event, and do they contain information? The answer is at the end of the summary.

A message that describes a 100% certain event contains no information, but a message that describes an event with at least TWO possible outcomes, that is, an event that is at most 50% probable, does contain it. An event with two equally possible outcomes separates the state of having information from the state of lacking it, and it is taken as the MEASURE of the amount of information.

The amount of information is calculated by the expression:

        I = log2(1/p)        (p - probability of the event)

The unit of measure for the amount of information is called the BIT (BInary digiT), and it equals the amount of information of a 50% probable event:

I = log2[1 / (1/2)] = log22 = 1 b (one bit)
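
For illustration, the expression above can be checked with a short Python sketch; the function name amount_of_information is an arbitrary choice:

    import math

    def amount_of_information(p):
        # Amount of information I = log2(1/p), in bits, for an event of probability p.
        return math.log2(1.0 / p)

    # A 50% probable event carries exactly one bit of information.
    print(amount_of_information(1/2))   # 1.0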

A BIT is like an electrical switch that can take one of two equally possible states:

        turned on - symbolically marked "1"
        turned off - symbolically marked "0"

The logarithm to base 2 is taken for simplicity, so that the amount of information of a 50% probable result of an event equals exactly one.

The messages under B.) and E.) from the previous example contain the following amounts of information:

B.)  I = log2[1 / (1/6)] = log26 = (log106) / (log102) = 2.585 b


E.)  I = log2[1 / (1/37)] = log237 = (log1037) / (log102) = 5.209 b
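
The same expression can be evaluated with a minimal Python sketch to reproduce both values:

    import math

    for name, p in [("die shows the number three", 1/6),
                    ("ball stopped on the number seven", 1/37)]:
        i = math.log2(1.0 / p)                 # I = log2(1/p)
        print(f"{name}: I = {i:.3f} b")
    # die shows the number three: I = 2.585 b
    # ball stopped on the number seven: I = 5.209 b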


Example III

How much information does one position in the pulse sequence shown in Figure 1.1.3 carry?

At one position, at one moment, a pulse either is or is not present (or it is positive or negative), which symbolically represents the state "1" or "0", so the amount of information at one position in a sequence of pulses is:

I = 1 b (one bit)

The BIT as a unit proved rather impractical for everyday use, so the concept of the BYTE was introduced, a contraction of the term 'BinarY TErm', as a label for the group of bits associated with a single character. In the early days of the digital computing era the number of bits per character was not the same for all types of computer systems, but today it is widely accepted that it is a group of 8 bits.

AGREEMENT:

        8 b (bits) = 1 B = ONE BYTE 
        16 b = 2 B = WORD
        32 b = 4 B = LONG WORD 
        4 b = 1/2 B = HALF BYTE
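
As an illustrative Python sketch, the agreement above can be written as a simple table of bit counts; the dictionary name UNITS_IN_BITS is arbitrary:

    # Agreed grouping of bits, as listed above.
    UNITS_IN_BITS = {
        "HALF BYTE": 4,
        "BYTE": 8,
        "WORD": 16,
        "LONG WORD": 32,
    }

    for name, bits in UNITS_IN_BITS.items():
        print(f"{name}: {bits} b = {bits / 8} B")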

Thus, the two pulse sequences shown in Figure 1.1.3 represent two BYTEs, and according to the arrangement of 'ones' and 'zeros' they can be written as follows:

(Figure: binary sequences of characters)

Each of the sequences can be assigned a 'code' that unambiguously defines it and consists only of the digits "1" and "0", that is, uses the two-digit (binary) number system. Bit 'b0' is the least significant bit and is usually labelled LSB (Least Significant Bit), while bit 'b7' has the greatest weight and is usually labelled MSB (Most Significant Bit).
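
The relation between a byte, its binary code and the LSB/MSB labels can be illustrated with a short Python sketch; the example value 0b01000001 is arbitrary and not taken from Figure 1.1.3:

    value = 0b01000001              # an arbitrary 8-bit example (decimal 65)

    code = format(value, "08b")     # binary code written from b7 down to b0
    lsb = value & 1                 # bit b0, the Least Significant Bit
    msb = (value >> 7) & 1          # bit b7, the Most Significant Bit

    print(code, "LSB =", lsb, "MSB =", msb)   # 01000001 LSB = 1 MSB = 0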

The shorter the duration of a pulse, the more bits can be transmitted per unit of time. The number of bits transmitted in one second (bps = bits per second; bit rate) is the signalling speed, expressed in the unit BAUD (symbol Bd):

        v = 1/T  [Bd]        (T - duration of one pulse in seconds)

Example IV

If the duration of a pulse is 200 microseconds, what is the signalling speed?

v = 1 / (200•10^-6) = 1000000 / 200 = 5000 Bd
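
The same calculation as a minimal Python sketch, with arbitrary variable names:

    pulse_duration = 200e-6             # duration of one pulse in seconds (200 microseconds)
    v = 1.0 / pulse_duration            # signalling speed in baud (Bd)
    print(v)                            # 5000.0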

This definition of transmission speed cannot be applied in the same sense to analog signals. It should also not be confused with the propagation speed of the signal through the medium, which is much higher and is not of importance for the data themselves. The baud rate is only a measure of how fast the data are transferred from the source to the destination.




SUMMARY:

The importance of INFORMATION was very nicely described by a Croatian scientist: 'Without matter there is nothing, without energy nothing happens, without information nothing makes sense.' (J. Božičević, 1995).

The interior of a computer 'bustles' with traffic of bits and bytes, which are used for practically everything. Some never change their value (permanent memory), some change continuously (working memory), and some 'travel' through the computer along specified interconnection paths composed of multiple lines - the BUS.

The exceptional importance of these terms is shown by the fact that magazines of the IT profession are named after them. They represent the fundamental concepts of informatics.

Figure 1.2.1 Binary Switch.

It is surprising that this set of "0" and "1" fundamentally changes human life today and sets new standards of literacy and success. As for the politicians, their messages do not describe the event exactly, because the third politician is lying, but they do contain information, which indicates that any politician can be a liar!


