Multitask Kernel-based Learning with First-Order Logic Constraints
Author: Diligenti, Michelangelo; Gori, Marco; Maggini, Marco; Rigutini, Leonardo
Year of publication: 2023
Source: Proceedings of The 20th International Conference on Inductive Logic Programming (ILP 2010)
Document type: Working Paper
DOI: 10.48550/arXiv.2311.03340
Description: In this paper we propose a general framework to integrate supervised and unsupervised examples with background knowledge, expressed by a collection of first-order logic (FOL) clauses, into kernel machines. In particular, we consider a multi-task learning scheme where multiple predicates defined on a set of objects are to be jointly learned from examples, enforcing a set of FOL constraints on the admissible configurations of their values. The predicates are defined on the feature spaces in which the input objects are represented, and can be either known a priori or approximated by an appropriate kernel-based learner. A general approach is presented to convert the FOL clauses into a continuous implementation that can deal with the outputs computed by the kernel-based predicates. The learning problem is formulated as a semi-supervised task that requires the optimization in the primal of a loss function combining a fitting loss on the supervised examples, a regularization term, and a penalty term that enforces the constraints on both the supervised and unsupervised examples. Unfortunately, the penalty term is not convex and can hinder the optimization process. However, poor solutions can be avoided with a two-stage learning scheme in which the supervised examples are fitted first and the constraints are enforced afterwards, as sketched below this record.
Comment: The 20th International Conference on Inductive Logic Programming (ILP 2010), Florence, Italy, June 27-30, 2010
Database: arXiv
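Note: the following is a minimal sketch, in assumed notation, of the kind of primal objective the description refers to; the symbols (task functions $f_k$, labeled sets $\mathcal{L}_k$, unlabeled pool $\mathcal{U}$, clause penalties $\phi_h$, weights $\lambda_k$ and $\lambda_h$) are illustrative choices, not notation taken from the paper.

% Sketch of a combined objective under assumed notation:
% supervised fitting loss + kernel (RKHS) regularization
% + penalties enforcing FOL-derived constraints on all available examples.
\[
E(f_1,\dots,f_T) \;=\;
\sum_{k=1}^{T} \sum_{(x_i,y_i)\in\mathcal{L}_k} V\bigl(f_k(x_i), y_i\bigr)
\;+\; \sum_{k=1}^{T} \lambda_k \,\lVert f_k \rVert_{\mathcal{H}_k}^{2}
\;+\; \sum_{h=1}^{H} \lambda_h \sum_{x\in\mathcal{L}\cup\mathcal{U}} \phi_h\bigl(f_1(x),\dots,f_T(x)\bigr)
\]

Here $V$ is a supervised fitting loss, the middle term is the kernel regularizer, and each $\phi_h$ is a continuous relaxation of a first-order clause (for instance, a t-norm-based one) evaluated on both supervised and unsupervised examples, which is what makes the overall objective non-convex. Under this reading, the two-stage scheme mentioned in the description would first minimize the first two terms alone and only afterwards switch on the constraint penalties, so the supervised fit provides a reasonable starting point before the non-convex terms are introduced.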