
Learning from Structured Data with Kernels, Neural Networks and Logic

Publication date: 2017-06-23

Author:

Orsini, Francesco

Abstract:

Real-world data often have a complex structure that can be naturally represented with graphs or logic. This thesis has four main contributions that address some challenges that arise when learning from structured data.

First, we introduce graph invariant kernels (GIKs), a framework that upgrades the Weisfeiler-Lehman and other graph kernels to effectively exploit high-dimensional and continuous vertex attributes. Graphs are first decomposed into subgraphs. Vertices of the subgraphs are then compared by a kernel that combines the similarity of their labels and the similarity of their structural roles using a suitable vertex invariant. By changing this invariant we obtain a family of graph kernels that includes generalizations of Weisfeiler-Lehman, NSPDK, and propagation kernels. We demonstrate empirically that these kernels achieve state-of-the-art results on relational datasets.

Second, we introduce shift aggregate extract networks (SAEN), an architecture based on deep hierarchical decompositions to learn effective representations of large graphs. Our framework extends classic R-decompositions used in kernel methods, enabling nested part-of-part relations. Unlike recursive neural networks, which unroll a template on input graphs directly, we unroll a neural network template over the decomposition hierarchy, allowing us to deal with the high variability in vertex degree that characterizes social network graphs. Deep hierarchical decompositions are also amenable to domain compression, a technique that reduces both space and time complexity by exploiting symmetries. We show empirically that our approach is competitive with current state-of-the-art graph classification methods, particularly when dealing with social network datasets.

Third, we introduce kProbLog as a declarative logical language for machine learning. kProbLog is a simple algebraic extension of Prolog with facts and rules annotated by semiring labels.
It allows algebraic expressions to be elegantly combined with logic programs. We present the semantics of kProbLog, its inference algorithm, and its implementation, and provide convergence guarantees for a fragment of the language. We provide several code examples to illustrate its potential for a wide range of machine learning techniques. In particular, we show encodings of state-of-the-art graph kernels such as Weisfeiler-Lehman graph kernels, propagation kernels, and an instance of GIKs. However, kProbLog is not limited to kernel methods: it can concisely express declarative formulations of tensor-based algorithms such as matrix factorization and energy-based models, and it can exploit semirings of dual numbers to perform automatic differentiation. Furthermore, experiments show that kProbLog is not only of theoretical interest, but can also be applied to real-world datasets. At the technical level, kProbLog extends aProbLog (an algebraic Prolog) by allowing multiple semirings to coexist in a single program and by introducing meta-functions for manipulating algebraic values.

Fourth, we provide a mathematical analysis of the preimage problem of the Weisfeiler-Lehman subtree kernel and show how to morph new graphs from graph datasets with a simple technique that employs off-the-shelf solvers. Preliminary results show that this technique is amenable to future constructive machine learning applications such as de novo synthesis of small molecules.
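For readers unfamiliar with the relabeling scheme that contributions one, three, and four build on, the following is a minimal Python sketch of the standard Weisfeiler-Lehman subtree kernel. The function names are illustrative, and the sketch keeps the refined labels as nested tuples rather than compressing them through a shared dictionary as efficient implementations (including those discussed in this thesis) would; identical subtrees still map to identical labels, so the histograms remain comparable across graphs.

```python
from collections import Counter

def wl_histograms(adj, labels, iterations=2):
    """Iteratively refine vertex labels: at each round, a vertex's new
    label is its old label paired with the sorted multiset of its
    neighbours' labels. Returns the label histogram of every round."""
    labels = dict(labels)
    hists = [Counter(labels.values())]
    for _ in range(iterations):
        labels = {v: (labels[v], tuple(sorted(labels[u] for u in adj[v])))
                  for v in adj}
        hists.append(Counter(labels.values()))
    return hists

def wl_subtree_kernel(g1, g2, iterations=2):
    """k(G1, G2): sum over rounds of the dot product between the two
    graphs' label histograms (each graph is an (adjacency, labels) pair)."""
    return sum(h1[lab] * h2[lab]
               for h1, h2 in zip(wl_histograms(*g1, iterations),
                                 wl_histograms(*g2, iterations))
               for lab in h1)
```

For example, comparing a uniformly labeled triangle with a uniformly labeled three-vertex path, the histograms agree at round zero, partially overlap at round one, and diverge afterwards, so the kernel value decays as the refinement distinguishes the two structures.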