Information, computation, and logic are defining concepts of the modern era. Shannon laid the foundation of information theory, demonstrating that problems of communication and compression can be precisely modeled, formulated, and analyzed. Turing formalized computation, defined as the transformation of information by means of algorithms. Gödel established the modern foundations of logic, laying the groundwork for modern computer science and the science of information.
Shannon's focus was originally on data recovery in compression and communication, but information is not merely communicated; it is also acquired, represented, inferred, processed, aggregated, managed, valued, secured, and computed. Computational information explores those properties of information that can be feasibly extracted. The existence of an object is of limited utility if no reasonable algorithm can provably generate it. Infeasibility may arise for a number of different reasons: the desired information may be computationally hard to extract; the information may be distributed geographically and not locally extractable; or the information may be encoded in (quantum) physical ways that prevent full extraction. In contrast to the classical theory of information, where precise quantitative limits can be established in most cases, in the computational setting information is not well understood even qualitatively, with exponential gaps between the upper and lower bounds on the amount of feasibly extractable information.
We must add logic to this paradigm. At its most basic, logic is the study of consequence. The core intuition motivating the inclusion of logic in information is that an informational state may be characterized by the range of possibilities or configurations that are compatible with the information available at that state. Logic may restrict this range of possibilities, directly impacting the information itself. Furthermore, logic's ``unusual effectiveness in computer science'', from descriptive complexity to type theory (including Voevodsky's univalence axiom) to reasoning about knowledge, closes the loop from logic to information to computation. Understanding how to harness this effectiveness in order to deepen connections to a theory of information remains very much an open question.
There are plenty of questions with few satisfying answers: