

An introduction to information theory : symbols, signals & noise (original 1961; 1980 edition)

by John Robinson Pierce

Behind the familiar surfaces of the telephone, radio, and television lies a sophisticated and intriguing body of knowledge known as information theory. This is the theory that has permitted the rapid development of all sorts of communication, from color television to the clear transmission of photographs from the vicinity of Jupiter. Even more revolutionary progress is expected in the future. Beginning with the origins of this burgeoning field, Dr. Pierce follows the brilliant formulations of Claude Shannon and describes such aspects of the subject as encoding and binary digits, entropy, language and meaning, efficient encoding, and the noisy channel. He then goes beyond the strict confines of the topic to explore the ways in which information theory relates to physics, cybernetics, psychology, and art. Mathematical formulas are introduced at the appropriate points for the benefit of serious students. J. R. Pierce worked for many years at the Bell Telephone Laboratories, where he became Director of Research in Communications Principles. An Introduction to Information Theory continues to be the most impressive nontechnical account available and a fascinating introduction to the subject for lay listeners.
User: kenf
Title: An introduction to information theory : symbols, signals & noise
Authors: John Robinson Pierce
Info: New York: Dover Publications, 1980. xii, 305 p. : ill. ; 22 cm. 2nd, rev. ed
Collections: Your library
Rating:
Tags: math, computer science, information theory

About the work

An Introduction to Information Theory: Symbols, Signals, and Noise by John R. Pierce (1961)


Showing 2 of 2
(Original Review, 1980-12-05)

Final answer to the question, "How many joules to send a bit?"

The unit of information is determined by the choice of the arbitrary scale factor K in Shannon's entropy formula:

S = -K SUM_i( p_i ln(p_i) )

If K is made equal to 1/ln(2), then S is said to be measured in "bits" of information. A common thermodynamic choice for K is kN, where N is the number of molecules in the system considered and k is 1.38e-23 joule per degree Kelvin, Boltzmann's constant. With that choice, the entropy of statistical mechanics is expressed in joules per degree. The simplest thermodynamic system to which we can apply Shannon's equation is a single molecule that has an equal probability of being in either of two states, for example, an elementary magnet. In this case, p=.5 for both states and thus S=+k ln(2). The removal of that much uncertainty corresponds to one bit of information. Therefore, a bit is equal to k ln(2), or approximately 1e-23 joule per degree K. This is an important figure, the smallest thermodynamic entropy change that can be associated with a measurement yielding one bit of information.
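The two choices of K described above can be checked numerically. The following is a minimal Python sketch (not part of the original review); the function name and the two-state example distribution are mine, but the formula and constants follow the text:

```python
import math

# Boltzmann's constant in joules per kelvin, as quoted in the review
k = 1.380649e-23

def shannon_entropy(probs, K):
    """S = -K * sum(p * ln(p)) over a discrete probability distribution."""
    return -K * sum(p * math.log(p) for p in probs if p > 0)

# Single molecule with equal probability of being in either of two states
probs = [0.5, 0.5]

# K = 1/ln(2): entropy measured in bits -> exactly 1 bit
S_bits = shannon_entropy(probs, 1 / math.log(2))

# K = k: thermodynamic entropy in J/K -> k*ln(2), roughly 1e-23 J/K
S_thermo = shannon_entropy(probs, k)

print(S_bits, S_thermo)
```

Running this confirms the review's figure: removing the two-state uncertainty corresponds to one bit, i.e. k ln(2) ≈ 9.6e-24 J/K of thermodynamic entropy.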

The amount of energy needed to transmit a bit of information when limited by thermal noise of temperature T is:

E = kT ln 2 (Joules/bit)

This is derived in a lucid fashion by Pierce (2) from Shannon's initial work (1) on the capacity of a communications channel, although it is not obvious that he was the first to derive it. This limit is the same as the amount of energy needed to store or read a bit of information in a computer, which Landauer derived (3) from entropy considerations without the use of Shannon's theorems. Pierce's book is reasonably readable. On page 192 he derives the energy-per-bit formula (Eq. 10.6), and on page 200 he describes a Maxwell's Demon engine generating kT ln 2 of energy from a single molecule, showing that the Demon had to use that amount of energy to "read" the position of the molecule. Then on page 177 Pierce points out that one way of approaching this ideal signalling rate is to concentrate the signal power in a single, short, powerful pulse, and to send this pulse in one of many possible time positions, each of which represents a different symbol. This is essentially the concept behind the patent (4) which led me to ask the original question. My thanks to those who helped with their replies.
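For a sense of scale, the E = kT ln 2 limit can be evaluated at a concrete temperature. This small Python sketch is mine, not the review's; the choice of 300 K (roughly room temperature) is an assumed example value:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def min_energy_per_bit(T):
    """Thermal-noise limit on transmitting one bit: E = k*T*ln(2) joules."""
    return k * T * math.log(2)

# At an assumed room temperature of 300 K:
E = min_energy_per_bit(300.0)
print(f"{E:.3e} J/bit")  # about 2.87e-21 J/bit
```

So at room temperature the ideal cost of a bit is on the order of 3e-21 joules, some twenty orders of magnitude below what practical transmitters of the era dissipated.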

REFERENCES

1. C. E. Shannon, "A Mathematical Theory of Communication", Bell System Tech. J., Vol. 27, No. 3, 379-423 and No. 4, 623-656 (1948); reprinted in: C. E. Shannon and W. Weaver, "The Mathematical Theory of Communication", University of Illinois Press, Urbana, Illinois (1949).
2. J. R. Pierce, "Symbols, Signals and Noise", Harper, NY (1961).
3. R. Landauer, "Irreversibility and Heat Generation in the Computing Process," IBM J. Res. & Dev., Vol. 5, 183 (1961).
4. R. L. Forward, "High Power Pulse Time Modulation Communication System with Explosive Power Amplifier Means", U.S. Patent 3,390,334 (25 June 1968).

[2018 EDIT: This review was written at the time I was running my own personal BBS server. Much of the language of this and other reviews written in 1980 reflects a very particular register: what I now call, in retrospect, "BBS language".]
  antao | Nov 6, 2018 |
Brilliant and inspiring book. Enjoyed it immensely. Much use of highlighter.
  jaygheiser | Jul 23, 2008 |

Other authors:
- John R. Pierce (primary author; all editions)
- Cees van Dorland (translator; secondary author; some editions)
- James R. Newman (editor; secondary author; some editions)
Common Knowledge

Dedication: To Claude and Betty Shannon
First words: In 1948 Claude E. Shannon published a paper called "A Mathematical Theory of Communication"; it appeared in book form in 1949.


Ratings

Average: 3.92 (2 stars: 1; 3 stars: 9; 3.5 stars: 3; 4 stars: 9; 5 stars: 10)
