Title from title frames.
Originally produced by The Teaching Company/The Great Courses in 2015.
The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries. Little wonder that an entirely new science has arisen that is devoted to deepening our understanding of information and our ability to use it. Called information theory, this field has been responsible for path-breaking insights such as the following:

- What is information? In 1948, mathematician Claude Shannon boldly captured the essence of information with a definition that doesn't invoke abstract concepts such as meaning or knowledge. In Shannon's revolutionary view, information is simply the ability to distinguish reliably among possible alternatives.
- The bit: Atomic theory has the atom. Information theory has the bit: the basic unit of information. Proposed by Shannon's colleague at Bell Labs, John Tukey, "bit" stands for "binary digit": 0 or 1 in binary notation, which can be implemented with a simple on/off switch. Everything from books to black holes can be measured in bits.
- Redundancy: Redundancy in information may seem like mere inefficiency, but it is a crucial feature of information of all types, including languages and DNA, since it provides built-in error correction for mistakes and noise. Redundancy is also the key to breaking secret codes.
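Shannon's view can be made concrete with a short calculation (a minimal sketch in Python, not drawn from the course itself): distinguishing reliably among N equally likely alternatives takes log2(N) bits, and Shannon's entropy formula generalizes this to unequal probabilities.

```python
import math

def bits_to_distinguish(n_alternatives: int) -> float:
    """Bits needed to distinguish among n equally likely alternatives: log2(n)."""
    return math.log2(n_alternatives)

def shannon_entropy(probs) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip distinguishes 2 alternatives: exactly 1 bit.
print(bits_to_distinguish(2))            # 1.0
# A fair 8-sided die: 3 bits.
print(bits_to_distinguish(8))            # 3.0
# A heavily biased coin carries less than 1 bit per flip.
print(shannon_entropy([0.9, 0.1]))       # about 0.47
```

The last line hints at why redundancy matters: predictable (low-entropy) sources carry fewer bits per symbol than their raw encoding suggests, and that slack is exactly what error correction, compression, and codebreaking all exploit.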
Building on these and other fundamental principles, information theory spawned the digital revolution of today, just as the discoveries of Galileo and Newton laid the foundation for the scientific revolution four centuries ago. Technologies for computing, telecommunication, and encryption are now commonplace, and it's easy to forget that these powerful technologies and techniques had their own Galileos and Newtons. The Science of Information: From Language to Black Holes covers the exciting concepts, history, and applications of information theory in 24 challenging and eye-opening half-hour lectures taught by Professor Benjamin Schumacher of Kenyon College. A prominent physicist and award-winning educator at one of the nation's top liberal arts colleges, Professor Schumacher is also a pioneer in quantum information, the latest development in this dynamic scientific field.
Mode of access: World Wide Web.