Open Source Blog

Now available: ONNX Runtime 0.5 with support for edge hardware acceleration

ONNX Runtime 0.5, the latest update to the open-source, high-performance inference engine for ONNX models, is now available. This release improves the customer experience and supports inferencing optimizations across hardware platforms. Since the last release in May, Microsoft teams have deployed an additional 45+ models that leverage ONNX Runtime for inferencing. These models...
