Random Forest is an ensemble learning method for classification, regression and other tasks that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes (classification) or the mean prediction (regression) of the individual trees.
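The aggregation step can be sketched in a few lines. This is a language-agnostic illustration in Python of how per-tree outputs are combined (the vote labels and values below are invented), not the code of any particular random forest implementation:

```python
from statistics import mean, mode

# Hypothetical per-tree outputs from an already-trained forest of 5 trees
# (illustrative values, not from any real dataset).
tree_class_votes = ["spam", "ham", "spam", "spam", "ham"]  # classification
tree_value_preds = [3.1, 2.9, 3.4, 3.0, 3.1]               # regression

# A random forest aggregates its trees: majority vote (mode) for
# classification, average (mean) for regression.
forest_class = mode(tree_class_votes)   # "spam" (3 votes out of 5)
forest_value = mean(tree_value_preds)
print(forest_class, forest_value)
```

The randomness that makes the trees diverse (bootstrap samples, random feature subsets) happens earlier, at training time; prediction is just this aggregation.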
RGP is a simple modular Genetic Programming (GP) system built in pure R. It supports symbolic regression by GP through the familiar R model formula interface. GP individuals are represented as R expressions, an (optional) type system enables domain-specific function sets containing functions of diverse domain and range types, and a basic set of genetic operators for variation (mutation and crossover) and selection is provided.
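To make the idea of expression-valued individuals concrete, here is a bare-bones symbolic-regression sketch in Python (RGP itself represents individuals as R expressions; the tuple encoding, operator set, and fitness function below are invented for illustration). It only shows random initialization and selection; a real GP system like RGP iterates with mutation and crossover as well:

```python
import random

# Individuals are expression trees: nested tuples ("op", left, right)
# over {+, *}, the variable "x", and small integer constants.
OPS = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}

def evaluate(expr, x):
    """Recursively evaluate an expression tree at a point x."""
    if expr == "x":
        return x
    if isinstance(expr, (int, float)):
        return expr
    op, left, right = expr
    return OPS[op](evaluate(left, x), evaluate(right, x))

def random_expr(rng, depth=2):
    """Grow a random expression tree of bounded depth."""
    if depth == 0 or rng.random() < 0.3:
        return "x" if rng.random() < 0.5 else rng.randint(-3, 3)
    op = rng.choice(list(OPS))
    return (op, random_expr(rng, depth - 1), random_expr(rng, depth - 1))

def fitness(expr, xs, ys):
    """Sum of squared errors against the target data (lower is better)."""
    return sum((evaluate(expr, x) - y) ** 2 for x, y in zip(xs, ys))

rng = random.Random(0)
xs = [-2, -1, 0, 1, 2]
ys = [x * x + 1 for x in xs]            # target function: x^2 + 1
pop = [random_expr(rng) for _ in range(200)]
best = min(pop, key=lambda e: fitness(e, xs, ys))
```

Representing individuals as ordinary expressions, as RGP does with R expressions, means evaluation is just interpretation of the host language's syntax trees.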
tgp implements Bayesian nonstationary, semiparametric nonlinear regression and design by treed Gaussian processes (GPs) with jumps to the limiting linear model (LLM) in special cases. Also implemented are Bayesian linear models, CART, treed linear models, stationary separable and isotropic GPs, and GP single-index models.
bst: Gradient Boosting implements a functional gradient descent algorithm for a variety of convex and non-convex loss functions, for both classical and robust regression and classification problems.
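Functional gradient descent can be sketched for the simplest case, squared loss, where the negative gradient is just the residual vector. The following Python toy (decision stumps on 1-D data, invented example values; not the bst package's implementation) shows the loop structure:

```python
def fit_stump(x, residuals):
    """Find the threshold split minimizing squared error on the residuals."""
    best = None
    for t in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        lm = sum(left) / len(left) if left else 0.0
        rm = sum(right) / len(right) if right else 0.0
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi: lm if xi <= t else rm

def boost(x, y, n_rounds=50, learning_rate=0.1):
    """Functional gradient descent: repeatedly fit a base learner to the
    negative gradient of the loss and take a small step in that direction."""
    pred = [0.0] * len(x)
    stumps = []
    for _ in range(n_rounds):
        # For squared loss, the negative gradient equals the residuals.
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + learning_rate * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(learning_rate * s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 4.1, 3.9, 4.2]  # roughly a step function
model = boost(x, y)
```

Swapping in a different loss only changes the residual computation to that loss's negative gradient; that generality over convex and non-convex losses is bst's point.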
CORElearn is a suite of machine learning algorithms written in C++ with an R interface. It contains several model learning techniques for classification and regression; these methods can be used, for example, to discretize numeric attributes. An additional feature is the OrdEval algorithm and its visualization, used for evaluating data sets with ordinal features and class, enabling analysis according to the Kano model of customer satisfaction.
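Discretization of a numeric attribute means mapping continuous values to a small set of intervals. A minimal equal-width sketch in Python (illustrative only; CORElearn's discretization also offers supervised, quality-measure-driven methods, and the ages below are made up):

```python
def equal_width_bins(values, n_bins):
    """Discretize numeric values into n_bins equal-width intervals.
    Assumes the values are not all identical."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    # Map each value to a bin index in [0, n_bins - 1]; the maximum
    # value is clamped into the last bin.
    return [min(int((v - lo) / width), n_bins - 1) for v in values]

ages = [18, 22, 25, 33, 41, 47, 58, 63]
bins = equal_width_bins(ages, 3)   # e.g. young / middle / older intervals
```

Supervised methods instead choose cut points that keep the class distribution within each bin as pure as possible, which is usually what a learner wants.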
evtree implements Evolutionary Learning of Globally Optimal Trees. Commonly used classification and regression tree methods like the CART algorithm are recursive partitioning methods that build the model in a forward stepwise search; evtree instead searches for globally optimal trees with an evolutionary algorithm.
frbs is an implementation of various learning algorithms based on fuzzy rule-based systems (FRBSs) for dealing with classification and regression tasks. It also allows constructing an FRBS model defined by human experts.
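The building block of an FRBS is a membership function that maps a crisp input to a degree of truth in [0, 1], which then determines how strongly each rule fires. A tiny Python sketch (the rule, variable names, and breakpoints are invented for illustration; frbs supports many membership shapes and inference schemes):

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Expert rule: IF temperature IS high THEN fan IS fast.
# The rule fires to the degree that the input belongs to "high".
temperature = 28.0
firing_strength = tri(temperature, 20.0, 30.0, 40.0)  # partial membership
```

Because rules like this are readable, an FRBS can be either learned from data or written directly by domain experts, which is the dual use frbs supports.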
mboost implements a gradient descent algorithm (boosting) for optimizing general risk functions, utilizing component-wise (penalised) least squares estimates or regression trees as base-learners for fitting generalized linear, additive and interaction models to potentially high-dimensional data.
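"Component-wise" means that in every boosting iteration each base-learner (here, one simple least-squares fit per predictor) is fit to the current residuals, but only the single best-fitting component is updated; this is what yields built-in variable selection. A Python toy with invented data, not mboost's R code:

```python
def ls_coef(x, r):
    """Least-squares slope of residuals r on a single (centered) predictor x."""
    return sum(xi * ri for xi, ri in zip(x, r)) / sum(xi * xi for xi in x)

def componentwise_boost(X, y, n_rounds=100, nu=0.1):
    """X is a list of predictor columns; nu is the step length (shrinkage)."""
    p = len(X)
    coefs = [0.0] * p
    pred = [0.0] * len(y)
    for _ in range(n_rounds):
        r = [yi - pi for yi, pi in zip(y, pred)]
        # Fit each component separately; keep only the best one.
        best_j, best_b, best_err = 0, 0.0, float("inf")
        for j in range(p):
            b = ls_coef(X[j], r)
            err = sum((ri - b * xi) ** 2 for xi, ri in zip(X[j], r))
            if err < best_err:
                best_j, best_b, best_err = j, b, err
        coefs[best_j] += nu * best_b
        pred = [pi + nu * best_b * xi for pi, xi in zip(pred, X[best_j])]
    return coefs

# y depends only on x1, so component-wise selection should leave
# the coefficient of x2 untouched.
x1 = [-2.0, -1.0, 0.0, 1.0, 2.0]
x2 = [1.0, -1.0, 2.0, -2.0, 0.0]
y = [-4.0, -2.0, 0.0, 2.0, 4.0]   # y = 2 * x1
coefs = componentwise_boost([x1, x2], y)
```

Stopping the loop early regularizes the fit, which is why the number of boosting iterations is the key tuning parameter in this family of methods.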
partykit: A Toolkit for Recursive Partytioning provides infrastructure for representing, summarizing, and visualizing tree-structured regression and classification models. This unified infrastructure can be used for reading/coercing tree models from different sources ('rpart', 'RWeka', 'PMML'), yielding objects that share functionality for print()/plot()/predict() methods.
rgenoud combines evolutionary search algorithms with derivative-based (Newton or quasi-Newton) methods to solve difficult optimization problems, and it can also be used for optimization problems for which derivatives do not exist.
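The hybrid idea, evolutionary search for global exploration plus derivative-based refinement of each candidate, can be sketched on a toy multimodal function. This Python sketch uses plain gradient descent as a stand-in for the Newton-type local step and an invented objective; it is not rgenoud's algorithm or API:

```python
import math
import random

def f(x):
    """Toy multimodal objective with several local minima."""
    return x * x + 10.0 * math.sin(x)

def grad(x):
    return 2.0 * x + 10.0 * math.cos(x)

def local_refine(x, steps=100, lr=0.01):
    """Derivative-based local step (gradient descent stand-in for Newton)."""
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def hybrid_minimize(pop_size=20, generations=30, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop = [local_refine(x) for x in pop]                 # local search
        pop.sort(key=f)
        survivors = pop[: pop_size // 2]                     # selection
        children = [x + rng.gauss(0.0, 1.0) for x in survivors]  # mutation
        pop = survivors + children
    return min(pop, key=f)

best = hybrid_minimize()
```

The evolutionary layer keeps candidates spread across basins so the population is unlikely to get trapped at a single local minimum, while the derivative-based step makes each candidate converge quickly inside its basin.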
Rmalschains is a package that implements an algorithm family for continuous optimization called memetic algorithms with local search chains (MA-LS-Chains). Memetic algorithms are hybridizations of genetic algorithms with local search methods, well suited for continuous optimization.
RSNNS provides neural networks in R using the Stuttgart Neural Network Simulator (SNNS), a library containing many standard implementations of neural networks; the package wraps the SNNS functionality to make it available from within R. Using the RSNNS low-level interface, all of the algorithmic functionality and flexibility of SNNS can be accessed, and the package also contains a convenient high-level interface, so that the most common neural network topologies and learning algorithms integrate seamlessly into R.