EnsembleBase: Extensible Package for Parallel, Batch Training of Base Learners for Ensemble Modeling

Extensible S4 classes and methods for batch training of regression and classification algorithms such as Random Forest, Gradient Boosting Machine, Neural Network, Support Vector Machine, K-Nearest Neighbors, Penalized Regression (L1/L2), and Bayesian Additive Regression Trees. These algorithms constitute a set of 'base learners', which can subsequently be combined to form ensemble predictions. The package provides cross-validation wrappers to allow for downstream application of ensemble integration techniques, including best-error selection. All base learner estimation objects are retained, allowing for repeated prediction calls without re-training. For large problems, an option is provided to save estimation objects to disk, along with prediction methods that utilize these objects. This allows users to train and predict with large ensembles of base learners without being constrained by system RAM.
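The batch-training workflow described above can be sketched as follows. This is a minimal, hedged example: `make.configs` and `Regression.Batch.Fit` are taken from the package's exported interface, but the argument names shown and the use of `mtcars` as the training data are illustrative assumptions, not a verbatim excerpt from the package documentation.

```r
library(EnsembleBase)

# Assemble default configurations for a subset of base learners
# (here: random forest, neural network, and K-nearest neighbors).
configs <- make.configs(baselearner = c("rf", "nnet", "knn"))

# Batch-train all configurations, optionally in parallel, on a
# regression task. The returned fit object retains each learner's
# estimation object, so predictions can be repeated without re-training.
fit <- Regression.Batch.Fit(configs, mpg ~ ., mtcars, ncores = 2)

# Predict with the full batch of trained base learners; downstream
# ensemble packages combine these per-learner predictions.
pred <- predict(fit, newdata = mtcars)
```

For very large problems, the description above notes a save-to-disk option so that estimation objects need not all reside in RAM; consult the reference manual for the exact flag controlling this behavior.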

Version: 1.0.2
Depends: kknn, methods
Imports: gbm, nnet, e1071, randomForest, doParallel, foreach, glmnet, bartMachine
Published: 2016-09-13
Author: Alireza S. Mahani, Mansour T.A. Sharabiani
Maintainer: Alireza S. Mahani <alireza.s.mahani at gmail.com>
License: GPL-2 | GPL-3 [expanded from: GPL (≥ 2)]
NeedsCompilation: no
Materials: ChangeLog
CRAN checks: EnsembleBase results

Documentation:

Reference manual: EnsembleBase.pdf

Downloads:

Package source: EnsembleBase_1.0.2.tar.gz
Windows binaries: r-devel: EnsembleBase_1.0.2.zip, r-release: EnsembleBase_1.0.2.zip, r-oldrel: EnsembleBase_1.0.2.zip
macOS binaries: r-release (arm64): EnsembleBase_1.0.2.tgz, r-oldrel (arm64): EnsembleBase_1.0.2.tgz, r-release (x86_64): EnsembleBase_1.0.2.tgz, r-oldrel (x86_64): EnsembleBase_1.0.2.tgz
Old sources: EnsembleBase archive

Reverse dependencies:

Reverse depends: EnsembleCV, EnsemblePCReg, EnsemblePenReg

Linking:

Please use the canonical form https://CRAN.R-project.org/package=EnsembleBase to link to this page.