sklears-ensemble

Ensemble methods for sklears: Random Forest, Gradient Boosting, AdaBoost (no-std compatible)

Releases: 0.1.0-alpha.2 (Dec 23, 2025), 0.1.0-alpha.1 (Oct 13, 2025)
License: MIT/Apache · 6 MB · 137K SLoC · Used in sklears

Latest release: 0.1.0-alpha.2 (December 22, 2025). See the workspace release notes for highlights and upgrade guidance.

Overview

sklears-ensemble delivers bagging, boosting, stacking, voting, and random forest implementations with scikit-learn parity and Rust-first performance.
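To make the boosting side of this concrete, here is an illustrative, dependency-free sketch (not the crate's API) of the sample-reweighting step at the heart of AdaBoost: after each weak learner, misclassified samples gain weight and correctly classified ones lose it.

```rust
// One round of AdaBoost-style sample reweighting (illustrative sketch only).
// Assumes the weak learner's weighted error is strictly between 0 and 1.
fn adaboost_round(weights: &mut [f64], correct: &[bool]) -> f64 {
    // Weighted error of the current weak learner.
    let err: f64 = weights
        .iter()
        .zip(correct)
        .map(|(w, &c)| if c { 0.0 } else { *w })
        .sum();
    // Learner weight: alpha = 0.5 * ln((1 - err) / err).
    let alpha = 0.5 * ((1.0 - err) / err).ln();
    // Up-weight misclassified samples, down-weight correct ones, renormalize.
    for (w, &c) in weights.iter_mut().zip(correct) {
        *w *= if c { (-alpha).exp() } else { alpha.exp() };
    }
    let total: f64 = weights.iter().sum();
    for w in weights.iter_mut() {
        *w /= total;
    }
    alpha
}

fn main() {
    let mut weights = vec![0.25; 4];
    // Suppose the weak learner got only the last sample wrong.
    let correct = [true, true, true, false];
    let alpha = adaboost_round(&mut weights, &correct);
    println!("alpha = {alpha:.3}, weights = {weights:?}");
}
```

With this toy input the misclassified sample's weight grows from 0.25 to 0.5 while alpha ≈ 0.549, which is why later learners concentrate on the hard samples.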

Key Features

  • Tree Ensembles: RandomForest, ExtraTrees, GradientBoosting, Histogram-based boosting, IsolationForest.
  • Linear/Stochastic Ensembles: Bagging, AdaBoost, Stacking, Voting, Snapshot ensembles, warm-starting.
  • GPU + SIMD: Accelerated split finding, batched inference, and quantized histograms.
  • AutoML Integration: Works with feature selection, model selection, and inspection crates for end-to-end workflows.
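The voting and bagging estimators listed above all reduce, at prediction time, to aggregating per-model votes. A minimal hard-voting sketch (illustrative only, not the crate's API; `majority_vote` and its parameters are hypothetical names) shows the idea:

```rust
// Hard-voting aggregation: each fitted model casts one class vote per sample
// and the majority class wins. `n_classes` is assumed to be at least 1.
fn majority_vote(votes_per_model: &[Vec<usize>], n_classes: usize) -> Vec<usize> {
    let n_samples = votes_per_model[0].len();
    (0..n_samples)
        .map(|i| {
            // Tally votes for sample i across all models.
            let mut counts = vec![0usize; n_classes];
            for votes in votes_per_model {
                counts[votes[i]] += 1;
            }
            // Argmax over class counts; ties go to the lower class index.
            counts
                .iter()
                .enumerate()
                .max_by_key(|&(idx, &c)| (c, std::cmp::Reverse(idx)))
                .map(|(idx, _)| idx)
                .unwrap()
        })
        .collect()
}

fn main() {
    // Three models predict classes for two samples.
    let votes = vec![vec![0, 1], vec![1, 1], vec![0, 0]];
    println!("{:?}", majority_vote(&votes, 2)); // → [0, 1]
}
```

Soft voting works the same way but averages per-class probabilities instead of counting hard labels.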

Quick Start

```rust
use scirs2_core::ndarray::{array, Array1};
use sklears_ensemble::RandomForestClassifier;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Toy training data: three samples with three features each.
    let x = array![
        [0.0, 1.0, 2.0],
        [1.0, 0.5, 2.1],
        [0.5, 2.0, 1.5],
    ];
    let y = Array1::from(vec![0, 1, 0]);

    // Configure the forest via the builder: 500 trees, depth capped at 10,
    // bootstrap resampling on, and all available cores (`n_jobs(-1)`).
    let forest = RandomForestClassifier::builder()
        .n_estimators(500)
        .max_depth(Some(10))
        .n_jobs(-1)
        .bootstrap(true)
        .build();

    let fitted = forest.fit(&x, &y)?;
    let predictions = fitted.predict(&x)?;
    println!("{predictions:?}");
    Ok(())
}
```
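Under the hood, `bootstrap(true)` means each tree trains on n indices drawn from the n training samples with replacement. A dependency-free sketch of that resampling step (illustrative only, not the crate's implementation; a tiny xorshift RNG stands in for a proper one):

```rust
// Draw `n` bootstrap indices with replacement from 0..n.
// xorshift64 is used purely to keep the example self-contained; the modulo
// introduces a slight bias that is irrelevant for illustration.
fn bootstrap_indices(n: usize, seed: u64) -> Vec<usize> {
    let mut state = seed.max(1); // xorshift state must be nonzero
    (0..n)
        .map(|_| {
            state ^= state << 13;
            state ^= state >> 7;
            state ^= state << 17;
            (state % n as u64) as usize
        })
        .collect()
}

fn main() {
    let idx = bootstrap_indices(6, 42);
    // Same length as the training set; duplicates are expected, and the
    // samples left out of a given draw form that tree's out-of-bag set.
    println!("{idx:?}");
    assert_eq!(idx.len(), 6);
    assert!(idx.iter().all(|&i| i < 6));
}
```

Because each tree sees a different resample, averaging their predictions reduces variance, which is the point of bagging.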

Status

  • Covered by the 11,292 passing workspace tests for 0.1.0-alpha.2.
  • Benchmarks show 5–30× faster training than scikit-learn on medium-to-large datasets.
  • Roadmap items (GPU GradientBoosting, federated ensembles) live in this crate’s TODO.md.

Dependencies

~97 MB, ~1.5M SLoC across the dependency tree