Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American electrical engineer and mathematician. He has been called "the father of information theory" and was the founder of practical digital circuit design theory.
- A few first-rate research papers are preferable to a large number that are poorly conceived or half-finished. The latter are no credit to their writers and a waste of time to their readers.
- IRE Transactions on Information Theory (1956), volume 2, issue 1, page 3
- My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'
- Scientific American (1971), volume 225, page 180
- Explaining why he named his uncertainty function "entropy".
- I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.
- Omni Magazine (1987)
- Shannon's maxim: "The enemy knows the system."
- Reformulation of Kerckhoffs' principle
- Information is the resolution of uncertainty.