GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Publication details

Title of host publication: CEUR Proceedings of the 28th International Conference on Inductive Logic Programming
Date (Accepted/In press): 7 Jul 2018
Date (E-pub ahead of print, current): 2018
Publisher: CEUR Workshop Proceedings
Original language: English

Abstract

ILP learners are commonly implemented to consider each training example sequentially for every hypothesis tested. Computing the cover set of a hypothesis in this way is costly and introduces a major bottleneck in the learning process. This computation can be performed far more efficiently using data-level parallelism. Here we propose a GPU-accelerated approach to this task for propositional logic and for a subset of first-order logic. The approach can be combined with any strategy of choice for exploring the hypothesis space. At present, the hypothesis language is limited to logic formulae using unary and binary predicates, such as those covered by certain types of description logic. The approach is tested on a commodity GPU and on datasets of up to 200 million training examples, achieving run times below 30 ms per cover set computation.
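The data-level parallelism described above lends itself to a one-thread-per-example GPU kernel. The following is a minimal CUDA sketch of the propositional case only, under assumptions of our own: each training example is encoded as a bitmask of the unary predicates that hold for it, and a hypothesis is a conjunction of required predicates. All names (coverKernel, etc.) and the encoding are illustrative, not the paper's actual implementation.

```cuda
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// One thread per training example: an example is covered iff every
// predicate required by the (conjunctive) hypothesis holds for it.
__global__ void coverKernel(const unsigned long long *examples,
                            unsigned long long hypothesis,
                            int n, int *covered)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        covered[i] = ((examples[i] & hypothesis) == hypothesis);
}

int main()
{
    const int n = 1 << 20;                   // 1M toy examples
    const unsigned long long hyp = 0xBULL;   // require predicates 0, 1 and 3

    // Toy predicate masks standing in for real encoded training data.
    std::vector<unsigned long long> h_ex(n);
    for (int i = 0; i < n; ++i) h_ex[i] = i & 0xF;

    unsigned long long *d_ex;
    int *d_cov;
    cudaMalloc(&d_ex, n * sizeof(unsigned long long));
    cudaMalloc(&d_cov, n * sizeof(int));
    cudaMemcpy(d_ex, h_ex.data(), n * sizeof(unsigned long long),
               cudaMemcpyHostToDevice);

    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    coverKernel<<<blocks, threads>>>(d_ex, hyp, n, d_cov);

    // The cover set size is a reduction over the flags; done on the host
    // here for simplicity, but it would normally stay on the device.
    std::vector<int> h_cov(n);
    cudaMemcpy(h_cov.data(), d_cov, n * sizeof(int),
               cudaMemcpyDeviceToHost);
    long covered = 0;
    for (int i = 0; i < n; ++i) covered += h_cov[i];
    printf("cover set size: %ld of %d examples\n", covered, n);

    cudaFree(d_ex);
    cudaFree(d_cov);
    return 0;
}
```

Because each thread touches one example independently, the kernel scales with the number of examples rather than the number of hypothesis evaluations, which is the bottleneck the abstract identifies in sequential cover set testing.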
