Reconciling Shannon and Scott with a Lattice of Computable Information
Author: | Hunt, Sebastian; Sands, David; Stucki, Sandro |
Publication year: | 2022 |
Subject: | |
Source: | Proc. ACM Program. Lang. 7(POPL), 2023, 68:1-68:30 |
Document type: | Working Paper |
DOI: | 10.1145/3571740 |
Description: | This paper proposes a reconciliation of two different theories of information. The first, originally proposed in a lesser-known work by Claude Shannon, describes how the information content of channels can be described qualitatively, but still abstractly, in terms of information elements, i.e. equivalence relations over the data source domain. Shannon showed that these elements form a complete lattice, with the order expressing when one element is more informative than another. In the context of security and information flow this structure has been independently rediscovered several times, and used as a foundation for reasoning about information flow. The second theory of information is Dana Scott's domain theory, a mathematical framework for giving meaning to programs as continuous functions over a particular topology. Scott's partial ordering also represents when one element is more informative than another, but in the sense of computational progress, i.e. when one element is a more defined or evolved version of another. To give a satisfactory account of information flow in programs it is necessary to consider both theories together, to understand what information is conveyed by a program viewed as a channel (à la Shannon) but also by the definedness of its encoding (à la Scott). We combine these theories by defining the Lattice of Computable Information (LoCI), a lattice of preorders rather than equivalence relations. LoCI retains the rich lattice structure of Shannon's theory, filters out elements that do not make computational sense, and refines the remaining information elements to reflect how Scott's ordering captures the way that information is presented. We show how the new theory facilitates the first general definition of termination-insensitive information flow properties, a weakened form of information flow property commonly targeted by static program analyses. |
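The Shannon-style "information elements" mentioned in the description can be made concrete on a finite domain. The following sketch (not from the paper; names and representation are illustrative assumptions) models an equivalence relation as the partition it induces, with the refinement order capturing "more informative than", and the join given by the common refinement:

```python
# Illustrative sketch: Shannon's information elements over a finite domain,
# modelled as partitions ordered by refinement (finer = more informative).
# All names here are hypothetical, not taken from the paper.

def partition(domain, f):
    """Equivalence relation induced by observing f: x ~ y iff f(x) == f(y)."""
    blocks = {}
    for x in domain:
        blocks.setdefault(f(x), set()).add(x)
    return frozenset(frozenset(b) for b in blocks.values())

def refines(p, q):
    """p is at least as informative as q: every p-block lies in some q-block."""
    return all(any(b <= c for c in q) for b in p)

def join(domain, p, q):
    """Least upper bound: the coarsest common refinement (observe both)."""
    def label(x):
        bp = next(b for b in p if x in b)
        bq = next(b for b in q if x in b)
        return (bp, bq)
    return partition(domain, label)

domain = range(8)
parity = partition(domain, lambda x: x % 2)        # observe the low bit
bit1 = partition(domain, lambda x: (x >> 1) & 1)   # observe the second bit
both = join(domain, parity, bit1)                  # observe both bits
assert refines(both, parity) and refines(both, bit1)
```

The paper's LoCI construction generalises this picture from equivalence relations to preorders, so that the order can also track Scott-style definedness; this sketch only illustrates the classical Shannon lattice.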
Comment: | 30 pages; presented at the 50th ACM SIGPLAN Symposium on Principles of Programming Languages (POPL 2023), 15-21 January 2023 |
Database: | arXiv |
External link: |