Improving over-fitting in ensemble regression by imprecise probabilities

Research output: Contribution to journal › Article › peer-review



Publication details

Journal: Information Sciences
Date: E-pub ahead of print - 23 Apr 2015
Date: Published (current) - 1 Oct 2015
Number of pages: 14
Pages (from-to): 315-328
Early online date: 23/04/15
Original language: English


In this paper, generalized versions of two ensemble methods for regression, based on variants of the original AdaBoost algorithm, are proposed. The generalization consists of restricting the unit simplex of instance weights to a smaller set of weighting probabilities. Various imprecise statistical models can be used to obtain such a restricted set, whose size depends in each case on a single parameter. For particular choices of this parameter, the proposed algorithms reduce to standard AdaBoost-based regression algorithms or to standard regression. The main advantage of the proposed algorithms over the basic AdaBoost-based regression methods is that they are less prone to over-fitting, because the weights of hard instances are bounded. Several simulations and applications furthermore indicate better performance of the proposed regression methods compared with the corresponding standard regression methods.
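The core idea of restricting the weight simplex can be illustrated with a linear-vacuous mixture model: every admissible weighting is a convex combination of the uniform distribution and an arbitrary probability vector, which caps how much mass a hard instance can accumulate during boosting. The sketch below is a minimal illustration under my own assumptions; the function name, the parameter `delta`, and the toy AdaBoost.R2-style update are not taken from the paper.

```python
import numpy as np

def restrict_weights(w, delta):
    """Shrink boosting instance weights toward the uniform distribution,
    i.e. project onto a linear-vacuous set
    {(1 - delta) * uniform + delta * q : q in the unit simplex}.
    delta = 1 keeps the original weights (standard AdaBoost-style boosting);
    delta = 0 forces uniform weights (standard regression)."""
    n = len(w)
    return (1.0 - delta) / n + delta * np.asarray(w, dtype=float)

# Toy AdaBoost.R2-style multiplicative update: hard instances (high loss)
# gain weight, which is what drives over-fitting in the unrestricted method.
n = 5
w = np.full(n, 1.0 / n)                       # initial uniform weights
loss = np.array([0.9, 0.8, 0.1, 0.2, 0.05])   # per-instance losses in [0, 1]
beta = 0.3                                    # example confidence factor
w = w * beta ** (1.0 - loss)                  # hard instances gain weight
w /= w.sum()

# Restricting the simplex pulls the largest weights back toward 1/n.
w_restricted = restrict_weights(w, delta=0.5)
```

With `delta = 0.5`, no instance can hold less than `0.5 / n` or dominate the sample, which is the mechanism the abstract credits for the reduced tendency to over-fit.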

Research areas

  • Regression, AdaBoost algorithm, Over-fitting, Linear-vacuous mixture model, Kolmogorov-Smirnov bounds
