Title:
Fast learning of relational dependency networks
Source:
Machine Learning. 103:377-406
Publication Status:
Preprint
Publisher Information:
Springer Science and Business Media LLC, 2016.
Publication Year:
2016
Document Type:
Academic Journal Article
File Description:
application/xml
Language:
English
ISSN:
1573-0565
0885-6125
DOI:
10.1007/s10994-016-5557-9
DOI:
10.48550/arxiv.1410.7835
Rights:
Springer TDM
arXiv Non-Exclusive Distribution
Accession Number:
edsair.doi.dedup.....2c68a3e18d341d97b201b8c2085f02dd
Database:
OpenAIRE

Further Information

A Relational Dependency Network (RDN) is a directed graphical model widely used for multi-relational data. These networks allow cyclic dependencies, which are necessary to represent relational autocorrelations. We describe an approach for learning both the RDN's structure and its parameters from an input relational database: first learn a Bayesian network (BN), then transform the BN into an RDN. Fast Bayes net learning thus yields fast RDN learning. The BN-to-RDN transform comprises a simple, local adjustment of the Bayes net structure and a closed-form transform of the Bayes net parameters. This method can learn an RDN for a dataset with a million tuples in minutes. We empirically compare our approach to state-of-the-art RDN learning methods that use functional gradient boosting, on five benchmark datasets. Learning RDNs via BNs scales much better to large datasets than learning RDNs with boosting, and provides competitive predictive accuracy.
17 pages, 2 figures, 3 tables. Accepted as a long paper at ILP 2014, September 14-16, Nancy, France. Added the Appendix: Proof of Consistency Characterization.
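The BN-to-RDN idea in the abstract can be sketched in miniature. A standard way to obtain a cyclic dependency network from a BN is to give each node a conditional distribution over its Markov blanket, computed in closed form from the BN's CPTs via the Gibbs conditional: P(x | mb(x)) is proportional to P(x | pa(x)) times the product of P(c | pa(c)) over x's children. The toy network, variable names, and function below are illustrative assumptions for this sketch, not the paper's actual code or notation.

```python
from itertools import product

# Toy BN over binary variables: A -> B, A -> C.
# CPTs: P(A), P(B|A), P(C|A); dictionaries are keyed by variable values 0/1.
p_A = {0: 0.6, 1: 0.4}
p_B_given_A = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
p_C_given_A = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.5, (1, 1): 0.5}  # key: (c, a)

def rdn_conditional_A(b, c):
    """Closed-form RDN conditional P(A | B=b, C=c).

    A's Markov blanket in this BN is {B, C}, so the RDN's local
    distribution for A is the Gibbs conditional:
        P(a | b, c) ∝ P(a) * P(b | a) * P(c | a).
    """
    unnorm = {a: p_A[a] * p_B_given_A[(b, a)] * p_C_given_A[(c, a)]
              for a in (0, 1)}
    z = sum(unnorm.values())  # local normalization constant
    return {a: unnorm[a] / z for a in (0, 1)}

if __name__ == "__main__":
    for b, c in product((0, 1), repeat=2):
        print(f"P(A | B={b}, C={c}) = {rdn_conditional_A(b, c)}")
```

Because each RDN conditional is normalized locally and derived directly from the BN's CPTs, no global inference or iterative fitting is needed, which is what makes the transform fast.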