This crate contains reasonably effective
implementations of a number of common machine learning algorithms.
At the moment, rustlearn uses its own basic dense and sparse array types, but I will be happy
to use something more robust once a clear winner in that space emerges.
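For illustration, here is a minimal sketch of constructing both array types. It assumes the prelude re-exports Array and SparseRowArray and that they provide from, zeros, get and set methods as shown; treat the exact constructors as assumptions rather than a definitive API reference.

```rust
use rustlearn::prelude::*;

// A dense 2 x 2 array built from nested vectors
// (assumes Array implements From<Vec<Vec<f32>>>).
let dense = Array::from(vec![vec![1.0, 2.0],
                             vec![3.0, 4.0]]);
assert_eq!((dense.rows(), dense.cols()), (2, 2));

// A sparse 2 x 2 array: start from all zeros, then set nonzero entries
// (assumes SparseRowArray::zeros and element-wise get/set).
let mut sparse = SparseRowArray::zeros(2, 2);
sparse.set(0, 1, 2.0);
sparse.set(1, 0, 3.0);
assert_eq!(sparse.get(0, 1), 2.0);
```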
All the models support fitting and prediction on both dense and sparse data, and the implementations should be roughly competitive with Python's scikit-learn in both accuracy and performance. The example below trains a random forest on the iris dataset:
```rust
use rustlearn::prelude::*;

use rustlearn::ensemble::random_forest::Hyperparameters;
use rustlearn::datasets::iris;
use rustlearn::trees::decision_tree;

let (data, target) = iris::load_data();

let mut tree_params = decision_tree::Hyperparameters::new(data.cols());
tree_params.min_samples_split(10)
           .max_features(4);

let mut model = Hyperparameters::new(tree_params, 10).one_vs_rest();

model.fit(&data, &target).unwrap();

// Optionally serialize and deserialize the model

// let encoded = bincode::serialize(&model).unwrap();
// let decoded: OneVsRestWrapper<RandomForest> = bincode::deserialize(&encoded).unwrap();

let prediction = model.predict(&data).unwrap();
```
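Because fit and predict accept both array types, the same workflow applies to sparse feature matrices. Below is a hedged sketch using the sgdclassifier module from linear_models; the SparseRowArray::from(&Array) conversion and the exact hyperparameter methods are assumptions based on the crate's builder-style API, not verbatim documentation.

```rust
use rustlearn::prelude::*;
use rustlearn::datasets::iris;
use rustlearn::linear_models::sgdclassifier::Hyperparameters;
use rustlearn::metrics::accuracy_score;

let (dense_data, target) = iris::load_data();

// Convert the dense feature matrix into a sparse representation
// (assumes a From<&Array> conversion for SparseRowArray).
let sparse_data = SparseRowArray::from(&dense_data);

// A one-vs-rest SGD classifier; hyperparameter values are placeholders.
let mut model = Hyperparameters::new(dense_data.cols())
    .learning_rate(0.5)
    .l2_penalty(0.0)
    .one_vs_rest();

// The same fit/predict calls accept the sparse matrix.
model.fit(&sparse_data, &target).unwrap();
let prediction = model.predict(&sparse_data).unwrap();

println!("training accuracy: {}", accuracy_score(&target, &prediction));
```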
Contributing
Pull requests are welcome.
To run basic tests, run cargo test.
Running cargo test --features "all_tests" --release runs all tests, including generated and slow tests.
Running cargo bench --features bench (nightly toolchain only) runs benchmarks.