Popis: |
We present the logic of Hebbian learning, a dynamic logic whose semantics are expressed in terms of a layered neural network learning via Hebb’s associative learning rule. Its language consists of a modality Tφ (read “typically φ,” formalized as forward propagation), conditionals φ ⇒ ψ (read “typically φ are ψ”), as well as dynamic modalities [φ+]ψ (read “evaluate ψ after performing Hebbian update on φ”). We give axioms and inference rules that are sound with respect to the neural semantics; these axioms characterize Hebbian learning and its interaction with propagation. The upshot is that this logic describes a neuro-symbolic agent that both learns from experience and reasons about what it has learned.
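
As a rough illustration only (not the paper’s exact neural semantics), the two operations mentioned above can be sketched in Python: forward propagation of an activation pattern through a layered network, and a Hebbian update that strengthens connections between co-active neurons. The threshold activation, the learning rate eta, and all names below are illustrative assumptions.

    # Minimal sketch of forward propagation and Hebb's rule on a layered net.
    # This is an illustration under assumed conventions, not the paper's formalization.
    import numpy as np

    def forward(weights, x):
        """Propagate an input activation through the layers (threshold units)."""
        activations = [x]
        for W in weights:
            x = (W @ x > 0).astype(float)  # simple binary threshold activation
            activations.append(x)
        return activations

    def hebbian_update(weights, activations, eta=0.1):
        """Hebb's rule: strengthen w_ij in proportion to pre- and post-synaptic activity."""
        for W, pre, post in zip(weights, activations[:-1], activations[1:]):
            W += eta * np.outer(post, pre)  # delta w_ij = eta * post_i * pre_j
        return weights

Here forward plays the role of the propagation operator (Tφ), while hebbian_update corresponds to the model change performed by [φ+]; the actual paper works with a precise class of networks and an exact update, so this sketch only conveys the general shape of the two operations.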