onnxruntime

High-performance ML model inference engine for multiple platforms


About

Cross-platform, high-performance scoring engine for ML models

Commands

onnxruntime

Examples

Run inference on an ONNX model with input data:
$ onnxruntime model.onnx --input input.pb

Convert a model to ONNX format for inference:
$ onnxruntime convert --model model.pb --output model.onnx

Benchmark model performance:
$ onnxruntime benchmark model.onnx --warmup 5 --runs 100