GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic. / Algahtani, Eyad; Kazakov, Dimitar Lubomirov.

CEUR Proceedings of the 28th International Conference on Inductive Logic Programming. CEUR Workshop Proceedings, 2018.

Harvard

Algahtani, E & Kazakov, DL 2018, GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic. in CEUR Proceedings of the 28th International Conference on Inductive Logic Programming. CEUR Workshop Proceedings.

APA

Algahtani, E., & Kazakov, D. L. (2018). GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic. In CEUR Proceedings of the 28th International Conference on Inductive Logic Programming. CEUR Workshop Proceedings.

Vancouver

Algahtani E, Kazakov DL. GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic. In CEUR Proceedings of the 28th International Conference on Inductive Logic Programming. CEUR Workshop Proceedings. 2018.

Author

Algahtani, Eyad; Kazakov, Dimitar Lubomirov. / GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic. CEUR Proceedings of the 28th International Conference on Inductive Logic Programming. CEUR Workshop Proceedings, 2018.

BibTeX

@inproceedings{2598f5c61e464891bd20b1b05176c3f4,
title = "GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic",
abstract = "ILP learners are commonly implemented to consider sequentially each training example for each of the hypotheses tested. Computing the cover set of a hypothesis in this way is costly, and introduces a major bottleneck in the learning process. This computation can be implemented more efficiently through the use of data level parallelism. Here we propose a GPU-accelerated approach to this task for propositional logic and for a subset of first order logic. This approach can be used with one’s strategy of choice for the exploration of the hypothesis space. At present, the hypothesis language is limited to logic formulae using unary and binary predicates, such as those covered by certain types of description logic. The approach is tested on a commodity GPU and datasets of up to 200 million training examples, achieving run times of below 30ms per cover set computation.",
author = "Eyad Algahtani and Kazakov, {Dimitar Lubomirov}",
year = "2018",
language = "English",
booktitle = "CEUR Proceedings of the 28th International Conference on Inductive Logic Programming",
publisher = "CEUR Workshop Proceedings",
}

RIS (suitable for import to EndNote)

TY - GEN

T1 - GPU-Accelerated Hypothesis Cover Set Testing for Learning in Logic

AU - Algahtani, Eyad

AU - Kazakov, Dimitar Lubomirov

PY - 2018

Y1 - 2018

N2 - ILP learners are commonly implemented to consider sequentially each training example for each of the hypotheses tested. Computing the cover set of a hypothesis in this way is costly, and introduces a major bottleneck in the learning process. This computation can be implemented more efficiently through the use of data level parallelism. Here we propose a GPU-accelerated approach to this task for propositional logic and for a subset of first order logic. This approach can be used with one’s strategy of choice for the exploration of the hypothesis space. At present, the hypothesis language is limited to logic formulae using unary and binary predicates, such as those covered by certain types of description logic. The approach is tested on a commodity GPU and datasets of up to 200 million training examples, achieving run times of below 30ms per cover set computation.

AB - ILP learners are commonly implemented to consider sequentially each training example for each of the hypotheses tested. Computing the cover set of a hypothesis in this way is costly, and introduces a major bottleneck in the learning process. This computation can be implemented more efficiently through the use of data level parallelism. Here we propose a GPU-accelerated approach to this task for propositional logic and for a subset of first order logic. This approach can be used with one’s strategy of choice for the exploration of the hypothesis space. At present, the hypothesis language is limited to logic formulae using unary and binary predicates, such as those covered by certain types of description logic. The approach is tested on a commodity GPU and datasets of up to 200 million training examples, achieving run times of below 30ms per cover set computation.

M3 - Conference contribution

BT - CEUR Proceedings of the 28th International Conference on Inductive Logic Programming

PB - CEUR Workshop Proceedings

ER -
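
The abstract above describes replacing the sequential per-example cover set test with data-level parallelism. A minimal illustrative sketch of that idea follows, using Python integers as bitsets (one bit per training example) so that a single bitwise AND tests many examples at once; on a GPU the same elementwise operation would be spread across thousands of threads. This is not the authors' implementation, and the predicate names and data are invented for illustration.

```python
def predicate_bitset(truth_values):
    """Pack one boolean per training example into a single integer bitset."""
    bits = 0
    for i, v in enumerate(truth_values):
        if v:
            bits |= 1 << i
    return bits

def cover_set(hypothesis_predicates):
    """Cover set of a conjunctive hypothesis: bitwise AND of the bitsets of
    its predicates. Every example in a machine word is tested in parallel."""
    covered = ~0  # start with all examples covered
    for bits in hypothesis_predicates:
        covered &= bits
    return covered

# Toy data: 8 examples, two unary predicates (names are illustrative only).
p_bird    = predicate_bitset([1, 1, 0, 1, 0, 1, 0, 0])
p_can_fly = predicate_bitset([1, 0, 0, 1, 0, 1, 1, 0])

# Examples covered by the hypothesis bird(X) AND can_fly(X).
covered = cover_set([p_bird, p_can_fly]) & 0xFF
print(bin(covered))             # which examples are covered
print(bin(covered).count("1"))  # size of the cover set
```

The paper's reported scale (up to 200 million examples, under 30 ms per cover set) comes from running this kind of elementwise test on GPU hardware rather than word-at-a-time on a CPU, but the covered/not-covered bookkeeping is the same.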