ONNX Runtime now supports building mobile applications in C# with Xamarin. Support for Android and iOS is included in the ONNX Runtime 1.10 release NuGet package, enabling C# developers to build Android and iOS applications that execute ONNX models on device.
ONNX Runtime is an open source project designed to accelerate machine learning across a wide range of frameworks, operating systems, and hardware platforms. We previously introduced ONNX Runtime Mobile, the product targeting smartphones and other devices with limited storage.
In today’s release 1.10 we are enabling developers to build cross-platform applications targeting Android and iOS using Xamarin.Forms. This package supports the execution of ONNX models on the device’s CPU using the default CPU Execution Provider. Some phones also contain specialized processors (NPUs) for efficient ML execution, exposed via the NNAPI (Android) and Core ML (iOS) interfaces. With the ONNX Runtime Mobile package, developers can instead choose the NNAPI (Android) or Core ML (iOS) Execution Providers that are included in the package. Developers should test and compare the performance of the provider options to determine the optimal choice for their application and model. See NNAPI Options and Core ML Options for more details on tuning performance and accuracy when using the platform-specific accelerators.
Xamarin is an open source app platform for building modern, performant iOS and Android apps with C# and .NET. With support for Android and iOS in the NuGet feed, developers can infuse AI into their applications with ONNX Runtime.
We have added a new sample for building a mobile application in Xamarin. The example runs an image classifier that uses ResNet to classify well-known images. It is part of the onnxruntime-inference-examples repo and uses the official ONNX Runtime NuGet package.
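The core of such a classifier can be sketched in a few lines of C#. This is a minimal sketch, not the sample's actual code: the model path `resnet.onnx`, the input name `input`, and the 1×3×224×224 shape are assumptions, and the real values depend on the specific model file.

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Load the model; "resnet.onnx" is a placeholder for the model bundled with the app.
using var session = new InferenceSession("resnet.onnx");

// Build an input tensor; the shape and input name depend on the model.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", input)
};

// Run inference; for a classifier, the output tensor holds the class scores.
using var results = session.Run(inputs);
```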
Add the ONNX Runtime package to your project:
PM> Install-Package Microsoft.ML.OnnxRuntime -Version 1.10.0
Include ONNX Runtime package in your code:
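A typical set of using directives looks like the following (the Tensors namespace is only needed if you construct input tensors directly):

```csharp
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;
```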
To use the NNAPI Execution Provider in Android set:
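For example, a session configured with NNAPI might look like this sketch (`modelPath` is a placeholder for your model's location):

```csharp
var options = new SessionOptions();
// Append NNAPI before creating the session; ONNX Runtime falls back
// to the CPU provider for any operators NNAPI cannot handle.
options.AppendExecutionProvider_Nnapi();
using var session = new InferenceSession(modelPath, options);
```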
To use the CoreML Execution Provider in iOS set:
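Similarly, a Core ML session can be sketched as follows (`modelPath` is a placeholder; see Core ML Options for the flags that can be passed to tune behavior):

```csharp
var options = new SessionOptions();
// Append Core ML before creating the session; unsupported operators
// fall back to the CPU provider.
options.AppendExecutionProvider_CoreML();
using var session = new InferenceSession(modelPath, options);
```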
Getting started: This Xamarin blog post demonstrates the basic steps for building Xamarin.Forms applications integrating with ONNX Runtime to build AI solutions.
ONNX Runtime 1.10
Version 1.10 is the latest release of the high-performance and cross-platform inference engine. Other notable updates include:
- Performance improvements: new quantized kernels on X64 and ARM64 amongst other optimizations
- Hardware flexibility: updates for TensorRT, DirectML, OpenVINO, and DNNL Execution Providers
- For NVIDIA GPUs, the ORT Python GPU package now includes both CUDA and TensorRT providers, making it easier for users to test or use either
- Mac universal2 build: simplifies deployments targeting Macs, as universal2 allows a single binary to run on both Apple silicon (M1) and Intel-based Macs
- Linux ARM64 now included in the NuGet package for .NET users
- ONNX Runtime Web: support for WebAssembly SIMD for improved performance for quantized models
About ONNX Runtime Mobile
ONNX Runtime Mobile is a build of the ONNX Runtime inference engine targeting Android and iOS devices. With this package, developers can build smartphone applications optimized for a smaller disk footprint. Smaller application packages also help comply with the size requirements of the Android and iOS app stores. There are two flavors of the ONNX Runtime Mobile package: a prebuilt package containing a subset of the ONNX operator set focused on mobile AI scenarios, and a custom package that can be generated based on the specific models used by the AI application. Refer to these steps to generate custom packages for Android and iOS. The prebuilt packages are available for various platforms and language bindings; refer to the matrix of options for the package corresponding to a specific platform-language binding.