# 🔥 Furnace 0.3.5

Blazingly fast ML inference server powered by Rust and the Burn framework.

## 📊 Performance
- Binary Size: 2.3MB
- Inference Time: ~0.5ms
- Memory Usage: <50MB
- Startup Time: <100ms
## 🚀 Quick Start

### Download Binary
Download the appropriate binary for your platform from the assets below.
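As an illustration, a minimal sketch for Linux x86_64 (the asset file name below is a hypothetical placeholder; substitute the exact name listed in the release assets):

```bash
# Hypothetical asset name -- replace with the actual file from the release assets
curl -LO https://github.com/Gilfeather/furnace/releases/download/v0.3.5/furnace-x86_64-unknown-linux-gnu
chmod +x furnace-x86_64-unknown-linux-gnu
# Start the server (any required arguments depend on the release)
./furnace-x86_64-unknown-linux-gnu
```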
### Using Cargo

```bash
cargo install furnace
```

### Docker

```bash
docker pull ghcr.io/Gilfeather/furnace:0.3.5
```
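To start the server from the pulled image, a minimal sketch; the container port (3000 here) is an assumption rather than something documented in these notes, so check the image for the actual port and any required flags:

```bash
# Hypothetical run command; the exposed port 3000 is an assumption
docker run --rm -p 3000:3000 ghcr.io/Gilfeather/furnace:0.3.5
```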
## 📋 Full Changelog

See CHANGELOG.md for detailed changes.

## What's Changed
- feat: Remove ONNX file loading functionality and simplify to built-in models by @Gilfeather in #26
- fix: add ResNet18 model download and create_sample_model binary to Docker build by @Gilfeather in #27
**Full Changelog**: v0.3.0...v0.3.5