
John R. Pierce > Quotes

 

Quotes are added by the Goodreads community and are not verified by Goodreads.
“The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
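
A minimal Python sketch of this point (not from the book): when all messages from a source are equally likely, the uncertainty, and hence the information per message, is the base-2 logarithm of the number of possible messages.

import math

def information_equally_likely(num_messages):
    # Entropy in bits for a source whose possible messages are all
    # equally likely: the base-2 logarithm of the number of messages.
    return math.log2(num_messages)

print(information_equally_likely(10))         # about 3.32 bits
print(information_equally_likely(1_000_000))  # about 19.93 bits

The one-out-of-a-million message carries far more information than the one-out-of-ten message, exactly because there was more uncertainty about which message would be produced.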
“A branch of electrical theory called network theory deals with the electrical properties of electrical circuits, or networks, made by interconnecting three sorts of idealized electrical structures:”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“communication theory grew out of the study of electrical communication,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“The lesson provided by Morse’s code is that it matters profoundly how one translates a message into electrical signals. This matter is at the very heart of communication theory.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Theories become more mathematical or abstract when they deal with an idealized class of phenomena or with only certain aspects of phenomena.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“In fact, network theory might have been developed to explain the behavior of mechanical systems,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Further, if there is cybernetics, then someone must practice it, and cyberneticist has been anonymously coined to designate such a person.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“The circuits used in the transmission of electrical signals do not change with time, and they behave in what is called a linear fashion.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“He showed that any variation of a quantity with time can be accurately represented as the sum of a number of sinusoidal variations of different amplitudes, phases, and frequencies. The quantity concerned might be the displacement of a vibrating string, the height of the surface of a rough ocean, the temperature of an electric iron, or the current or voltage in a telephone or telegraph wire. All are amenable to Fourier’s analysis.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
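
A short sketch of Fourier analysis and synthesis using NumPy's FFT (the waveform and sample count are arbitrary choices, not Pierce's): decomposing a sampled signal into sinusoids and then summing those sinusoids reproduces the original.

import numpy as np

# Sample an arbitrary waveform over one period.
t = np.linspace(0, 1, 256, endpoint=False)
signal = np.exp(-3 * t) + 0.5 * np.sign(np.sin(7 * 2 * np.pi * t))

# Fourier analysis: the amplitudes and phases of the component sinusoids.
spectrum = np.fft.rfft(signal)

# Fourier synthesis: summing those sinusoids back up reproduces the signal.
rebuilt = np.fft.irfft(spectrum, n=len(signal))
print(np.allclose(signal, rebuilt))   # True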
“Fourier analysis, which makes it possible to represent any signal as a sum of sine waves of various frequencies.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Maxwell’s equations are more general than network theory,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“This is because the phenomena in man’s machines are simplified and ordered in comparison with those occurring naturally, and it is these simplified phenomena that man understands most easily.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“radio waves, which lies outside of the scope of network theory.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Thus, in general the shape of the output signal will be different from the shape of the input signal. However, the difference can be thought of as caused by the changes in the relative delays and amplitudes of the various components, differences associated with their different frequencies. If the attenuation and delay of a circuit is the same for all frequencies, the shape of the output wave will be the same as that of the input wave; such a circuit is distortionless.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“The amounts of the attenuation and delay depend on the frequency of the sine wave.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
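
The last two quotes suggest a small numerical illustration (a sketch with assumed, hypothetical circuit responses, nothing from the book): a response with the same attenuation and delay at every frequency only scales and shifts a pulse, while a frequency-dependent response changes its shape.

import numpy as np

t = np.linspace(0, 1, 512, endpoint=False)
pulse = np.exp(-((t - 0.3) / 0.02) ** 2)          # input signal: a short pulse

freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
spectrum = np.fft.rfft(pulse)

# Distortionless case: the same attenuation (0.5) and the same delay
# (0.1 s) applied to every frequency component; the output keeps the
# input's shape, merely scaled and shifted.
flat_response = 0.5 * np.exp(-2j * np.pi * freqs * 0.1)
undistorted = np.fft.irfft(spectrum * flat_response, n=len(t))

# Frequency-dependent attenuation (a simple low-pass response): the
# components are scaled and delayed unequally, so the output pulse no
# longer has the shape of the input pulse.
lowpass_response = 1.0 / (1.0 + 1j * freqs / 20.0)
distorted = np.fft.irfft(spectrum * lowpass_response, n=len(t))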
“While linearity is a truly astonishing property of nature, it is by no means a rare one. All circuits made up of the resistors, capacitors, and inductors discussed in Chapter I in connection with network theory are linear, and so are telegraph lines and cables. Indeed, usually electrical circuits are linear, except when they include vacuum tubes, or transistors, or diodes, and sometimes even such circuits are substantially linear.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“In a linear electrical circuit or transmission system, signals act as if they were present independently of one another; they do not interact. This is, indeed, the very criterion for a circuit being called a linear circuit.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
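
A minimal sketch of superposition, assuming a simple moving-average filter stands in for the linear circuit (the kernel and signals are arbitrary): the response to a sum of signals equals the sum of the individual responses, so the signals do not interact.

import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal(200)   # one input signal
b = rng.standard_normal(200)   # another input signal

# A simple linear, time-invariant "circuit": a moving-average filter.
kernel = np.array([0.25, 0.5, 0.25])
def circuit(x):
    return np.convolve(x, kernel, mode="same")

# Superposition: the response to the sum of the signals equals the sum
# of the individual responses.
print(np.allclose(circuit(a + b), circuit(a) + circuit(b)))   # True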
“This measure of amount of information is called entropy. If”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“He noted that the line speed, and hence also the speed of transmission, was proportional to the width or extent of the range or band (in the sense of strip) of frequencies used in telegraphy; we now call this range of frequencies the band width of a circuit or of a signal. Finally, in analyzing one”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
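
A back-of-the-envelope illustration of the proportionality (the 3,000 Hz figure and the factor of 2 are assumptions, not part of this quote):

bandwidth_hz = 3000                      # hypothetical telephone-grade band width
symbols_per_second = 2 * bandwidth_hz    # assumed Nyquist-style factor of 2
print(symbols_per_second)                # 6000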
“R. V. L. Hartley, the inventor of the Hartley oscillator, was thinking philosophically about the transmission of information at about this time, and he summarized his reflections in a paper, “Transmission of Information,” which he published in 1928.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Euclidean geometry”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Morse code had been devised by 1838. In this code, letters of the alphabet are represented by spaces, dots, and dashes. The space is the absence of an electric current, the dot is an electric current of short duration, and the dash is an electric current of longer duration.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
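
A tiny Python sketch of the idea, using only a hypothetical handful of letters rather than the full Morse table:

# "." stands for a current of short duration, "-" for one of longer
# duration, and the gaps between letters for the absence of current.
MORSE = {"s": "...", "o": "---", "e": ".", "t": "-"}

def encode(word):
    return " ".join(MORSE[letter] for letter in word.lower())

print(encode("sos"))   # ... --- ...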
“Fourier succeeded in proving a theorem concerning sine waves which astonished his, at first, incredulous contemporaries. He showed that any variation of a quantity with time can be accurately represented as the sum of a number of sinusoidal variations of different amplitudes, phases, and frequencies.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“Finally, Hartley stated, in accord with Nyquist, that the amount of information which can be transmitted is proportional to the band width times the time of transmission. But this makes us wonder about the number of allowable current values, which is also important to speed of transmission. How are we to enumerate them?”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
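
One way to turn Hartley's statement into arithmetic (a sketch; the factor of 2 and the example numbers are assumptions, not Pierce's): information grows as band width times time, times the logarithm of the number of allowable current values.

import math

def hartley_information(bandwidth_hz, seconds, levels):
    # The number of independent symbol slots grows as band width times
    # time (the factor of 2 is assumed here), and each slot holding one
    # of `levels` distinguishable current values contributes
    # log2(levels) bits.
    symbol_slots = 2 * bandwidth_hz * seconds
    return symbol_slots * math.log2(levels)

print(hartley_information(bandwidth_hz=3000, seconds=1, levels=4))   # 12000.0 bits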
“it was useless at the receiver,”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“To talk about information theory without communicating its real mathematical content would be like endlessly telling a man about a wonderful composer yet never letting him hear an example of the composer’s music.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
“information in terms of the number of binary digits rather than in terms of the number of different messages that the binary digits can form. This would mean that amount of information should be measured, not by the number of possible messages, but by the logarithm of this number.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
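
A quick numerical check of the logarithmic measure (illustrative only):

import math

# Eight binary digits can form 2**8 = 256 different messages; measuring
# information by the logarithm of that count gives back the digit count.
print(math.log2(2 ** 8))       # 8.0

# The logarithmic measure also adds where message counts multiply: two
# independent 8-digit messages allow 256 * 256 possibilities, i.e. 16 bits.
print(math.log2(256 * 256))    # 16.0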
“The entropy of communication theory is measured in bits. We may say that the entropy of a message source is so many bits per letter, or per word, or per message. If the source produces symbols at a constant rate, we can say that the source has an entropy of so many bits per second.”
John Robinson Pierce, An Introduction to Information Theory: Symbols, Signals and Noise
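
A minimal sketch of an entropy calculation in these units, for a hypothetical source with made-up letter probabilities:

import math

def entropy_bits_per_symbol(probabilities):
    # Shannon entropy: H = -sum(p * log2(p)), in bits per symbol.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical four-letter source; the probabilities are illustrative only.
H = entropy_bits_per_symbol([0.5, 0.25, 0.125, 0.125])
print(H)        # 1.75 bits per letter

# If this source produced 10 letters per second, its entropy rate would be:
print(H * 10)   # 17.5 bits per second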
