Classical information theory paved the way for the Internet, DVDs, iPads, and other innovations of today. Its foundations were laid in 1948, when Claude Shannon introduced a general mathematical theory of the inherent information content of data and of its reliable communication in the presence of noise; the theory also gave rise to a trillion-dollar communications industry.
Yet while Shannon's theory has had profound social and economic impacts, its application beyond storage and point-to-point communication, to the Internet, for example, poses one of the most vexing challenges for scientists and engineers today. To keep pace with rapid advances in networking, biology, and quantum information processing, we need to rethink how we understand and integrate information.
The Center for Science of Information advances the next generation of information theory through collaborative research and teaching. Supported by a National Science Foundation (NSF) grant entitled "Emerging Frontiers of Science of Information," the Center for Science of Information is the first NSF-funded Science and Technology Center in Indiana. By assimilating elements of space, time, structure, semantics, and context, we will deepen our understanding of information and apply these results to critical problems in society.
Through the Center for Science of Information, we will pursue these objectives by sponsoring workshops and seminars that encourage interdisciplinary collaboration and by engaging in multi-institutional, grant-supported investigations that broaden our understanding of information. And, to create a pipeline for the information careers of the future, we will develop new interdisciplinary courses for both undergraduate and graduate students and establish fellowships that make graduate education more accessible.