
Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information 2009 Edition
Contributor(s): Sommaruga, Giovanni (Editor)
ISBN: 3642006582     ISBN-13: 9783642006586
Publisher: Springer
OUR PRICE:   $52.24  
Product Type: Paperback - Other Formats
Published: April 2009
Annotation: This book presents the scientific outcome of a collaboration of the Computer Science departments of the Universities of Berne, Fribourg and Neuchâtel. Under the title "Information and Knowledge" these research groups collaborated over several years on issues of logic, probability, inference and deduction. The goal of this volume is to examine whether there is some common ground between different approaches to the concept of information.

The structure of this book can be represented by a circular model: the innermost, syntactic circle comprises statistical and algorithmic approaches; the second, larger circle, the semantic one, is where "meaning" enters the stage; and the outermost circle, the pragmatic one, casts light on real-life logical reasoning.

These articles are framed by two philosophical contributions that explore the wider conceptual field and take stock of the articles on the various formal theories of information.

Additional Information
BISAC Categories:
- Computers | Information Theory
- Computers | Computer Science
- Mathematics | Logic
Dewey: 003.54
Series: Lecture Notes in Computer Science
Physical Information: 0.7" H x 6.1" W x 9.3" (0.92 lbs); 269 pages
 
Descriptions, Reviews, Etc.
Publisher Description:
It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially this measure is the reduction of the uncertainty obtained by receiving a message. The uncertainty of a situation of ignorance in turn is measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, information transmission and coding, and still is a very active domain of research.

Shannon's theory has also attracted much interest in a more philosophic look at information, although it was readily remarked that it is only a "syntactic" theory of information and neglects "semantic" issues. Several attempts have been made in philosophy to give information theory a semantic flavor, but still mostly based on or at least linked to Shannon's theory. Approaches to semantic information theory also very often make use of formal logic. Thereby, information is linked to reasoning, deduction and inference, as well as to decision making.

Further, entropy and related measures were soon found to have important connotations with regard to statistical inference. Surely, statistical data and observation represent information, information about unknown, hidden parameters. Thus a whole branch of statistics developed around concepts of Shannon's information theory or derived from them. Also some proper measurements appropriate for statistics, like Fisher's information, were proposed.
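
As a brief point of reference for the measure the description mentions (an illustrative formula from standard Shannon theory, not quoted from this book): the entropy of a discrete source emitting symbols x with probabilities p(x) is

    H(X) = -\sum_{x} p(x) \log_2 p(x)    (measured in bits),

and the information content of a received message is the resulting reduction in uncertainty, H(X) - H(X | message).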