Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American electrical engineer and mathematician who has been called "the father of information theory" and the founder of practical digital circuit design theory.
- A few first rate research papers are preferable to a large number that are poorly conceived or half-finished. The latter are no credit to their writers and a waste of time to their readers.
- This duality can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past but cannot control it; we may control the future but have no knowledge of it.
  - Coding theorems for a discrete source with a fidelity criterion. IRE International Convention Record, volume 7, pp. 142–163, 1959.
- My greatest concern was what to call it. I thought of calling it 'information,' but the word was overly used, so I decided to call it 'uncertainty.' When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, 'You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage.'
  - Scientific American (1971), volume 225, page 180.
  - Explaining why he named his uncertainty function "entropy".
- I visualize a time when we will be to robots what dogs are to humans. And I am rooting for the machines.
  - Omni Magazine (1987).