Random Forest

A Random Forest is an ensemble of decision trees, where each tree is trained on a random subset of the training set (sampled with replacement, i.e. a bootstrap sample) and, at each split, considers only a random subset of the available features (chosen without replacement). This variety across samples and features decorrelates the trees, making Random Forests less prone to overfitting than single decision trees. A regression random forest averages the outputs of the component trees, whereas in classification the class with the most votes from the component trees is picked.
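As a minimal sketch of the idea, assuming scikit-learn is available (the dataset and hyperparameter values below are illustrative, not prescriptive):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic classification data.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree fits on a bootstrap sample (bootstrap=True) and evaluates
# only a random subset of features at each split (max_features).
clf = RandomForestClassifier(
    n_estimators=100,      # number of trees in the ensemble
    max_features="sqrt",   # features considered per split
    bootstrap=True,        # sample training rows with replacement
    random_state=0,
)
clf.fit(X_train, y_train)

# Classification: predict() returns the majority vote over the trees.
predictions = clf.predict(X_test)
```

For regression, `RandomForestRegressor` follows the same pattern but averages the trees' numeric outputs instead of taking a vote.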
Related concepts:
Decision Tree, Ensemble, Bagging