Information (from Latin informare = to give form, to shape) is a potentially or actually existing, usable or used pattern of matter and/or forms of energy that is relevant to an observer within a certain context. Essential to information are its recognizability and its news content. The pattern changes the state of the observer, in the human context in particular the observer's knowledge. More formally, information is the removal of uncertainty: the removal of an uncertainty about objects and phenomena through notification, report, or knowledge.


Characteristics of the concept of information

Information is today a very broadly used and therefore hard-to-define term. Various sciences regard information as part of their field of work, in particular computer science, information theory, information science, communications engineering, information economics, and semiotics.

Only recently have there been efforts to connect the individual approaches and arrive at a generally valid concept of information. The relevant literature is currently usually filed under philosophy (for instance in the area of epistemology). For the time being, one cannot yet speak of a unified, generally accepted theory of information.

In general usage, as well as in some sciences (semiotics, information science), "information" is equated with "meaning" or "transferred knowledge". A more reduced view of the term, which today is of great practical importance (computer technology), originates in communications engineering. The trailblazing theory there is that of Claude Shannon. He considers the statistical aspects of the characters in a code that represents information. With Shannon, the meaning of the information enters only implicitly, via the probabilities of the characters used, which can ultimately be determined only with the help of humans, since only humans are able to consciously grasp the meaning of a code and to distinguish meaningful from non-meaningful code. The immediate goal of his considerations is the optimal transmission of information over a message channel (telephony, radio).

The term information and other terms from information theory are often used in everyday language, and also in the natural sciences, in a metaphorical way. A direct adoption of the term information into scientific theories, as it is used in the engineering sciences, is, however, generally inadmissible. The reason is that the engineering sciences are ultimately oriented towards humans, so that humans, as users or producers of artificial systems, can themselves be part of what is described; the terms used therefore often contain a purposeful, teleological component oriented towards human consciousness. In contrast, it is a goal of the natural sciences to describe nature as independently of humans as possible. When information-theoretical terms are adopted there, they must therefore first be redefined in a version freed from teleological additions. Thus, for example, the "genetic code" in genetics is understood as a set of rules describing purely physicochemical processes by which DNA structures are translated into protein structures, and not as an agreement between conscious beings about the use of symbols for the exchange of messages, which is how the term "code" is usually understood in information theory. The renunciation of such teleological terms in the natural sciences does not thereby rule out teleological explanations of the world from the outset; rather, it serves to prevent false conclusions in which supposedly new insight is won from a scientific theory that in reality was inserted into the theory beforehand by inadequate use of its terms. This, in particular, is also a method of which some pseudosciences partly avail themselves. Thus, for example, the philosopher of science Wolfgang Stegmüller warned against a revival of neovitalism through inadequate use of information-theoretical terms in biology.

It cannot be ruled out, however, that in the future the scientific concept of structure and the concept of information may be reduced to one another. Neuroinformatics, for instance, examines the relationship between the neural structures of the brain and its ability to process information.

This article attempts to distinguish the different levels of statistics, structure, and meaning, and to deal with the relations between these levels.

Structure and meaning

One aspect proceeds from the information carrier. The question examined is what structure can be determined within this carrier.

Another approach strives to understand what meaning attaches to whatever one has (somehow) extracted from this information carrier.

The first aspect has its roots in communications engineering, the second in cognitive science, linguistics, or more generally in the humanities. A structure recognizable in communications-engineering terms (for example, light pulses that strike individual cells of the retina in a temporal order) must be translated into a meaning in a complex decoding process.

Where pure structure information ends and meaning information begins, that is, where in this decoding process the line to consciousness is to be drawn, is one of the exciting questions of the information and cognitive sciences.

From these views result four levels under which the concept of information is generally considered today. These are

  1. coding
  2. syntax
  3. semantics
  4. pragmatics

These levels increase with regard to the meaning content of the information. They also reflect the theoretical points of departure mentioned above: the coding level comes close to the communications-engineering aspect, the syntax level reflects the aspect of linguistics or of the theory of formal languages, the semantic level integrates approaches from semiotics or semantics, and pragmatics draws rather on concepts of the cognitive sciences.

The four levels are illustrated below using the character sequence "IT IS WARM":

Code level

The character sequence "IT IS WARM" is too short for a statistical analysis. With longer texts, however, it becomes clear that not all elements of the character sequence (letters) occur equally frequently. Certain letters, such as e and t, though in our example s, are more frequent than others. This fact can be exploited during information transmission to save transmission time. Huffman codes are an example: they are a procedure with which information can be conveyed and stored efficiently. Many further procedures exist. At this level, questions about the choice of optimal codes for a certain purpose are also of interest (coding, ASCII code, Unicode, Braille, flag alphabet, genetic code, delta encoding, ...).
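As an illustration added here (not part of the original text), the Huffman procedure can be sketched in a few lines of Python. The dictionary-merging construction below is one common textbook way to build the code table; the example string is the article's own:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code table for the symbols occurring in `text`."""
    freq = Counter(text)
    # Heap entries: (frequency, tiebreaker, partial code table {symbol: code}).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate case: one distinct symbol
        return {sym: "0" for sym in freq}
    tie = len(heap)
    while len(heap) > 1:
        n1, _, t1 = heapq.heappop(heap)     # the two rarest subtrees ...
        n2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}        # ... get prefixed with 0
        merged.update({s: "1" + c for s, c in t2.items()})  # and 1, then merged
        tie += 1
        heapq.heappush(heap, (n1 + n2, tie, merged))
    return heap[0][2]

table = huffman_code("IT IS WARM")
# Frequent symbols ("I" and the space) receive codes no longer than rare ones.
```

The resulting code is prefix-free, so a coded message can be decoded unambiguously without separators between the codewords.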

Syntactic level of information

At the syntactic level, information is seen only as structure whose task is to be conveyed. The content of the information is essentially of no interest here. For example, the problem could consist in transferring the image from a camera to a monitor. The transmission system is, for instance, not interested in whether the image is worth transmitting at all (a burglar tampering with the window) or not (a cat running along the window sill), or whether anything can be recognized at all (the image from a completely out-of-focus camera is also transmitted in full, although there is actually nothing recognizable to see in it). The information content is a measure of the maximum efficiency with which the information can be transmitted without loss.

Distinction and information content

The basic principle of syntactic information is distinguishability: information is carried by what can be distinguished. A distinction, however, presupposes at least two different possibilities.

If there are exactly two possibilities, the distinction can be settled with a single yes/no question. Example: suppose a menu offers only two dishes, Schnitzel and Spaghetti. We know the guest ordered one of the two dishes. To find out which one, we need to ask only a single question: "Did you order Schnitzel?" If the answer is "yes", the guest ordered Schnitzel; if the answer is "no", Spaghetti.

If more than two possibilities are present, one can still find out by means of yes/no questions which alternative applies. A simple approach would be to ask about every dish on the menu in turn. That, however, is a rather inefficient method: if the guest has not yet placed any order, a great many questions are needed to find this out. It is more efficient to first ask, for example, "Have you already ordered?", then to become more concrete ("Was it a dish with meat?", "Was it pork?"), so that in the end only a few alternatives remain ("Was it pork Schnitzel?", "Roast pork?", "Schweinshaxe?"). The order of the questions reflects the significance of the bits in a message coded this way. The information content of a message corresponds to the number of yes/no questions that one needs, with an ideal questioning strategy, to reconstruct it.
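The counting argument above can be sketched in Python (illustrative code added here; the `identify` helper is a hypothetical stand-in for the waiter's questioning). For n equally likely alternatives, an ideal strategy needs ceil(log2 n) yes/no questions, each question halving the remaining candidates:

```python
import math

def questions_needed(n_alternatives):
    """Yes/no questions needed to single out one of n equally likely alternatives."""
    return math.ceil(math.log2(n_alternatives))

assert questions_needed(2) == 1   # two dishes: one question
assert questions_needed(8) == 3   # eight dishes: three questions

def identify(dishes, answer):
    """Binary-search questioning: halve the candidate set with each yes/no question."""
    candidates = sorted(dishes)
    questions = 0
    while len(candidates) > 1:
        mid = len(candidates) // 2
        questions += 1
        # "Is the ordered dish in the first half of the remaining list?"
        if answer in candidates[:mid]:
            candidates = candidates[:mid]
        else:
            candidates = candidates[mid:]
    return candidates[0], questions

dish, q = identify(["Schnitzel", "Spaghetti", "Pizza", "Salad"], "Spaghetti")
# Four alternatives are resolved with two questions.
```

Note that `questions_needed(27)` is 5, which already anticipates the five-bit code used for the 27-character alphabet later in the article.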

Probabilities also play a role in an optimal questioning strategy: if one knows, for example, that half of all guests order pork Schnitzel, it surely makes sense to ask about pork Schnitzel first, before working through the rest of the menu.

It is interesting here that although ostensibly no semantic or pragmatic information is used, it nevertheless enters implicitly in the form of the probabilities. For example, the fact that 50 percent of the guests order pork Schnitzel cannot be read off the menu; it is pragmatic information. And that one normally does not ask about "We wish you a good appetite" follows from the semantic information that this is not a dish, and that it is therefore highly improbable that anyone would order it.

Binarization and character probabilities

The character sequence "IT IS WARM" (in the original German example, "ES IST WARM") contains only capital letters. If for the moment we assume that we have only capital letters available (i.e. 27 characters including the space), then at each of the eleven positions of the above message we can place one of the 27 characters. Each position of the message thus has 27 possible states. The character set we use here accordingly comprises 27 characters.

Of great technical importance, however, is the binary code. Every code is then represented by a sequence of bits. A bit distinguishes only between two possible states, which are represented as one and zero. To represent 27 different states we need several bits, in this case exactly five: with five bits, 2 to the power of 5 = 32 states can be distinguished.

An obvious possible binary code looks as follows:

A = 00001, B = 00010, C = 00011, D = 00100, E = 00101, ..., <space> = 11100

Our message (in the original German example, "ES IST WARM") would then read:

00101 10011 11100 01001 10011 10100 11100 ... 01101
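This fixed five-bit code can be sketched in Python (an illustration added here; the code 11100 for the blank is taken from the article's sample encoding, and the message is the original German example string):

```python
def encode(message):
    """Encode A-Z with five-bit codes A=00001 ... Z=11010; the space gets 11100."""
    out = []
    for ch in message:
        if ch == " ":
            out.append("11100")  # the blank's code, as in the sample encoding above
        else:
            out.append(format(ord(ch) - ord("A") + 1, "05b"))
    return " ".join(out)

print(encode("ES IST WARM"))
# → 00101 10011 11100 01001 10011 10100 11100 10111 00001 10010 01101
# Eleven characters at five bits each: 55 bits in total.
```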

Now the above coding of the letters into five yes/no decisions is not the only valid one. Within the framework of classical information theory, the character sequence is considered from a statistical point of view. In this way one can take into account how frequently a certain character of the character set is used, in other words, how probable its occurrence is. For example, the letter "E" is more frequent in German than the letter "Y".

If one takes this probability of occurrence of the characters in the character set into account, one can make the number of yes/no decisions needed to recognize a character differ from character to character. Such coding is called entropy coding. One needs fewer bits to code a frequently occurring character than for a rarely occurring one. A character thus has a higher information content (requires a higher number of "atomic" decision units, of bits, for its recognition) the more rarely it occurs.
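The quantity behind entropy coding, Shannon's entropy H = -sum of p_i * log2(p_i), can be computed directly; a small illustrative sketch (added here, not part of the original text):

```python
import math
from collections import Counter

def entropy_bits(text):
    """Shannon entropy in bits per symbol: H = -sum p_i * log2(p_i)."""
    freq = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in freq.values())

# Two equally likely symbols carry exactly 1 bit per symbol:
print(entropy_bits("ABABABAB"))  # 1.0
# A skewed distribution carries less per symbol:
print(entropy_bits("AAAAAAAB"))  # ≈ 0.54
```

The entropy is the lower bound on the average number of bits per symbol that any lossless code can achieve, which is exactly why skewed distributions reward variable-length codes.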

See also: Entropy (information theory)

Semantic level of information

Structured, syntactic information becomes usable only by being read and interpreted. That is, the level of meaning must be added to the level of structure. For this, a certain reference system must be applied in order to be able to translate the structures into a meaning. This reference system is called a code. In the above example, one must thus know what "warm" means.

However, the transfer from syntax to semantics is rarely so direct; as a rule, the information is processed through a great many different codes at ever higher semantic levels, with data processing at the structural-syntactic level carried out anew on each semantic level: the light pulses that strike your retina right now are registered there by nerve cells (meaning for the nerve cell), passed on to the brain, brought into a spatial context, recognized as letters, joined into words. During all this time, nerve impulses (i.e. structure information) are "fired" from one brain cell to the next, until in this way the concepts for "warm", "now", and "here", which can be rendered only inadequately by words, begin to form in your consciousness, concepts that then have a meaning in context: you now know that these words are about the statement that it is warm and not cold.


  • Structure information is translated into semantics (meaning) in a decoding process.
  • Structure information is translated step by step, via codes, into other structure information, with meaning for the processing system developing at each of the various semantic stages.

See also: Coding, communication (information theory)

Pragmatic level of information

This comes closest to the colloquial concept of information. The statement that it is warm (which we have now interpreted semantically correctly; we know what this message is trying to tell us) has genuine information character if, at noon, still half drowsy after a night of carousing, we are considering what to wear, and a friend stops us from slipping into the turtleneck sweater with the words "it is warm". The pragmatic information content of the semantically exactly identical statement, however, is zero if we are already sitting on the balcony in a T-shirt, sweating. This message offers us nothing new. Small talk is evidently a kind of information exchange in which the semantic information exchanged via language carries practically no pragmatic information; what matters here are the body signals, whose semantics (friendliness, dislike) we recognize and can use pragmatically (does he or she like me?).

In this pragmatic sense, an essential criterion of information is that it changes the subject that takes it in, which means concretely that it changes the information that can potentially be drawn from that subject.


  • Information leads to a gain in knowledge.
  • Information makes the reduction of uncertainty possible.
  • Information is transferable, in the form of data or signals.
  • Information is an event that changes the state of the receiver or system.

Relations between the levels

If one considers the phenomenon of information, the four levels must be considered in their interconnection. For information to take place, agreements are necessary at all four levels.

Semantic processing (for example, combining letters into words) in turn produces syntactic information (for example, a sequence of word delimiters). Ultimately, the pragmatic level is also defined not least by the fact that it must itself create new information of a syntactic nature (otherwise the information would not have unfolded any effect). Because of the close interaction between the semantic decoding process and the unfolding of effects in pragmatics, both of which in turn generate syntactic information as end products and intermediate products, these two levels are sometimes merged into semantopragmatics.

Communication model of information

For a long time, the understanding of the syntactic level was characterized by the sender-receiver model: a sender wants to communicate information to a receiver. To do so, the sender encodes the information according to certain principles (for example, as a sequence of zeros and ones according to the principle mentioned above) into an information carrier; the receiver evaluates this carrier, since it too knows the code, and thereby receives the information (see also: communication).
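The sender-receiver model can be sketched as a round trip through a shared code. The sketch below (an illustration added here, using 8-bit ASCII as the agreed code) makes the essential point explicit: decoding succeeds only because both sides know the same code.

```python
def sender(message):
    """Encode the message as a bit string: the 'carrier' (8 bits per character)."""
    return "".join(format(ord(ch), "08b") for ch in message)

def receiver(bits):
    """Decode the carrier; this works only because the receiver knows the same code."""
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

carrier = sender("IT IS WARM")
assert receiver(carrier) == "IT IS WARM"  # the round trip recovers the message
```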

However, a human sender who wants to communicate something to us is not always present. A typical example is measurement: figuratively speaking, the physical system does not care at all what humans think of it. The goal of the measurement is a transfer of information from the measured system to the one who carries out the measurement (one measures in order to learn something about the measured system).

An example is speed measurement by radar trap: the car has no intention of betraying its speed (and the driver usually does not either). The police officer nevertheless gains information about the speed through the measurement. To produce the information, a physical law (the Doppler effect) was exploited, taken up by an engineer in order to design the device. The police deploy the device and thus bring it about that information is produced. The immediate production of the information, however, is thereby delegated to an apparatus.


  • For information to become recognizable to humans, matter or energy must exhibit a structure.
  • Syntactically, information corresponds to the probability of occurrence of a certain symbol within a defined decoding scheme.
  • In the communication model, information is a spatial or temporal sequence of physical signals that occur with certain probabilities or frequencies.
  • The information content of a message results from the number of yes/no possibilities for which one of the values is fixed in the message.

See also: Information transfer (physics)

Information transport, emergence and destruction

It is interesting that information bound to matter as its carrier can be transferred onto, or by, electromagnetic waves. Since it is then massless, this information can in principle be transported at the speed of light. Finally, the information can be bound back again to matter structures. An example of such a transmission process is the fax: the information of a particular document is transported at the speed of light over large distances and transferred at the destination onto a second document with exactly the same information content.

More generally: an information carrier is needed in order to transport information.

Can information be passed on without loss? When copying software this is the case, because technical mechanisms (redundant codes / checksums) ensure it. In general, however, information cannot be passed on without becoming less in the process; the extent of the loss depends on the physical boundary conditions. According to Shannon, no more information can be taken out of a channel during a transmission than was put in on the sender side. Moreover, when information is passed on or copied, it is not actually duplicated; it is then only present redundantly.
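The checksum idea can be sketched as follows (an illustration added here, using CRC-32 from Python's zlib as one possible redundant code; the in-memory "channel" is only a stand-in for a real transmission):

```python
import zlib

def copy_with_check(data: bytes) -> bytes:
    """Transmit data together with a CRC-32 checksum and verify it on arrival."""
    checksum = zlib.crc32(data)              # sender computes the redundant check value
    received, received_sum = data, checksum  # stand-in for an actual channel
    if zlib.crc32(received) != received_sum:
        raise IOError("transmission corrupted the data")
    return received

payload = b"IT IS WARM"
assert copy_with_check(payload) == payload
# A corrupted copy would be detected, since its checksum differs:
assert zlib.crc32(b"IT IS COLD") != zlib.crc32(payload)
```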

In a system that can be regarded as thermodynamically closed, information is ultimately destroyed, at the latest with the heat death of the universe. In a thermodynamically open system, information can be passed on, and information-carrying structures can even develop spontaneously. Examples are a multitude of theoretically and experimentally studied dissipative structures. Spin systems in particular (spin = angular momentum of atomic and subatomic particles), especially the so-called Ising glasses, have been studied very often, not least because of their relevance for the theory of neural networks. Many experiments also show that structures can develop spontaneously in Ising glasses which, because of the quantized nature of the spin, can even be interpreted as information available in digitized form, e.g. containing the conditions of the structure's emergence in coded form.

Digital information

Digital information results from the digitization of arbitrary information. The result is data.

Although the bit and the byte exist as fundamental units for measuring digital information capacity, information flow and information storage, information capacity is still often quantified in terms of the respective carrier. For example, the digital information capacity contained in a book can be read off easily and vividly from the number of pages or the number of words.
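For illustration, such a carrier-based estimate might look as follows (all figures here are assumptions for the sake of the example, not data from the original text):

```python
# Assumed figures: a 300-page book, ~350 words per page,
# ~6 characters per word, 1 byte per character.
pages, words_per_page, chars_per_word = 300, 350, 6
capacity_bytes = pages * words_per_page * chars_per_word

print(capacity_bytes)        # → 630000 bytes
print(capacity_bytes / 1e6)  # ≈ 0.63 MB
```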

See also: Binary system, artificial intelligence, symbolism

Definitions of information in different fields

In conclusion, the individual fields and research directions shall have their say here, each with its own understanding of information. In the process, the respective approach on the different levels described above, from pure syntax up to pragmatics, becomes apparent, in some cases also with special emphasis on the transport character of information.


Semiotics

Semiotics defines information as purpose-oriented data that extend knowledge. In older literature, it is often still defined as purpose-oriented knowledge.

Information science

Information science uses the concept of information similarly to the semiotic approach. For it, the concepts of knowledge and information are of central importance. Information is knowledge transfer, or "knowledge in action". In this sense, information arises only punctually, when a human needs knowledge (a certain unit of knowledge) to solve a problem. This unit of knowledge then passes as information from one stock of knowledge into another, for example from a database into the knowledge stock of a person. Knowledge is represented internally; information is presented, in order to assist the understanding of the person seeking it (knowledge representation, information presentation).

See also: Information management

Information theory

Information theory regards information as the opposite of entropy (information entropy). What is considered above all is the information content of an individual message, which according to Claude Shannon is defined by the statistical significance of the individual symbols.

A similar approach to the concept of entropy exists in physics, where it is used in thermodynamics and statistical mechanics. In statistical mechanics it is interpreted as a measure of the order of a system, which here, however, is a purely statistical statement about the structure of the system.

Information as an economic good

Information can be regarded as an economic good, since it can be produced within the enterprise by employing other factors of production (people, computers, software, communication, etc.), or purchased from outside. Information thus has a value that can be traded. The value results from the use of the information and from the costs of its production, provision and forwarding. The problem here is that the potential buyer does not always know the value of the information in advance and in some cases can evaluate it only after acquiring it. The desired trade in information is thus already afflicted with the problem of asymmetric information.

Furthermore, information can also be understood as a factor of production: information is not only consumed, it can also be used productively.

Documentation and Ordnungslehre (classification theory)

Wilhelm Gaus writes in his work Dokumentation und Ordnungslehre (1995) that information can be regarded under different aspects:

  1. structure = structure approach
  2. cognition = knowledge approach
  3. signal = signal approach
  4. message = message approach
  5. understood message = meaning approach
  6. increase of knowledge (Wissensvermehrung) = effect approach
  7. process = procedure approach

Information as change

According to the work of the Berlin computer scientist Peter Rüdiger: "Information is a change of concrete quantity and duration."

A definition of information in terms of change amounts to a description of information in terms of physical effect. If a simple change is regarded as a mathematical element that brings about a change of state, then it can be proven that a set of such elements which bring about changes of state on the same object, and which exhibit properties such as connectedness and repeatability, constitutes a mathematical group that can be defined as information with respect to that object. This group permits a determination of length that can be used for optimizations, for since change is a consequence of physical effect, the variation principle of least action also applies. (Source: The definition of information and the results)

A further mathematical description based on the nature of change is that of Jan Kåhre: The Law of Diminishing Information. (Source: Jan Kåhre: The Mathematical Theory of Information)

Movement, too, is change. A (further) definition of information in terms of change therefore proceeds via movement difference (information movement) and difference movement (rest potentiality): "Information exists only in motion, which is always a complementary, relative movement." (Source: Jerg Haas: Die Kybernetik der Natur: Komplementarität, ISBN 3-8311-1019-0)

Related topics

The concept of information is closely linked to questions in the topic area of knowledge. This includes in particular the problem of defining complexity, which can be described via the algorithmic depth of an information-processing process. Further considerations concern the difference between chance and order, as well as the concepts of distinguishability and relevance.

In algorithmic information theory, a measure was developed for determining the complexity of structures, e.g. specifically the complexity of character strings. Under certain conditions, this can also be used as a measure of information, which in some respects has advantages over Shannon's.

The concept of communication is likewise important in this connection, since communication presupposes the concept of information. Conversely, it is also frequently argued that communicability is an essential property of information.



Textbooks and specialist books

  • Martin Werner, Otto Mildenberger: Information und Codierung, Vieweg, August 2002, ISBN 3528039515
  • Herbert Klimant, Rudi Piotraschke, Dagmar Schönfeld: Informations- und Kodierungstheorie, Teubner, March 2003, ISBN 3519230038
  • Holger Lyre: Informationstheorie, Wilhelm Fink Verlag, Munich 2002, ISBN 3-7705-3446-8 (introduction to information theory with a view to Lyre's current research on the quantum theory of information; knowledge of quantum physics is, however, presupposed)
  • Wilhelm Gaus: Dokumentation und Ordnungslehre. Berlin, Heidelberg: Springer, 1983 (5th ed. 2005, ISBN 3-540-23818-2)

Popular-science books on information

  • Tor Nørretranders: Spüre die Welt, Rowohlt, 1994, ISBN 3-4980-4637-3 (an accessible introduction to the world of information, entropy and consciousness)
  • Bieletzke, S., Grob, H.L.: Aufbruch in die Informationsgesellschaft, 2002, ISBN 3825838447


Web links

Wikibooks: On the nature of information - learning and teaching materials
Wiktionary: Information - word origin, synonyms and translations
Wikiquote: Information - quotations
