
Model Management and Optimization on Android

When building and optimizing machine learning models for Android deployment, there are a number of unique factors to consider. From resource constraints and inference speed to model conversion and access to advanced functionality, there can be quite a bit of legwork involved in getting an ML model working effectively inside an Android app.

Given the inherent difficulties in managing and optimizing Android-ready ML models, we’ve curated an authoritative collection of articles and tutorials that explore this range of concerns.

If you’d like to get started on your own journey with machine learning on Android, we’re here to help. Fritz AI is the machine learning platform for mobile developers, helping you teach your devices to see, hear, sense, and think. With pre-trained models, a powerful suite of developer tools, and SDKs for iOS, Android, and Unity, we make it easy to start building incredible mobile experiences with ML.

8-Bit Quantization and TensorFlow Lite: Speeding up mobile inference with low precision

Deploying efficient neural nets on mobile is becoming increasingly important. This post explores the concept of quantized inference, and how it works in TensorFlow Lite.

— by Manas Sahni

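For a quick taste of the technique before diving into the full post, here’s a minimal sketch of post-training quantization with the TensorFlow Lite converter. The model file and input shape are placeholders rather than the article’s own code:

```python
import numpy as np
import tensorflow as tf

# Placeholder: load your own trained Keras model.
model = tf.keras.models.load_model("my_model.h5")

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Optional: a representative dataset lets the converter calibrate
# activation ranges for full 8-bit integer quantization.
def representative_dataset():
    for _ in range(100):
        # Placeholder input shape; match your model's real inputs.
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter.representative_dataset = representative_dataset

tflite_model = converter.convert()
with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```

Storing weights (and, with calibration, activations) in 8 bits instead of 32-bit floats typically shrinks the model by roughly 4x and speeds up inference on mobile CPUs.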

Deploying PyTorch and Keras Models to Android with TensorFlow Mobile

Learn how to convert trained models to TensorFlow, add TensorFlow Mobile as a dependency in an Android app, and perform inference in your app with the TensorFlow model.

— by John Olafenwa

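As a rough illustration of the Keras-side export step, here’s a condensed sketch using TensorFlow 1.x-era APIs from the TensorFlow Mobile days. The model filename is a placeholder, and this isn’t the article’s exact code:

```python
import tensorflow as tf  # TensorFlow 1.x, as used with TensorFlow Mobile

# Placeholder: a trained Keras model; disable training-phase ops before export.
tf.keras.backend.set_learning_phase(0)
model = tf.keras.models.load_model("my_model.h5")

sess = tf.keras.backend.get_session()
output_node = model.output.op.name

# Freeze variables into constants and write a GraphDef that
# TensorFlow Mobile's inference interface can load from the app's assets.
frozen_graph = tf.graph_util.convert_variables_to_constants(
    sess, sess.graph_def, [output_node])
tf.train.write_graph(frozen_graph, ".", "frozen_model.pb", as_text=False)
```

The article then walks through adding the TensorFlow Mobile dependency to the Android project and running inference against the frozen graph from your app code.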

Machine learning on iOS and Android

Exploring machine learning on mobile: benefits, use cases, and developer environments

— by Austin Kodra

Benchmarking TensorFlow Mobile on Android devices in production

Measuring TensorFlow Mobile runtime speeds across Android devices.

— by Jameson Toole

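The article benchmarks TensorFlow Mobile on-device, but the core measurement loop is simple. Here’s a minimal sketch of the same timing pattern using the TensorFlow Lite Python interpreter (the model path is a placeholder); on Android you’d wrap the equivalent inference call with a system clock in the same way:

```python
import time
import numpy as np
import tensorflow as tf

# Placeholder model file; assumes a float-input model.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()

# Warm-up run so one-time setup costs don't skew the numbers.
dummy = np.random.rand(*input_details[0]["shape"]).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

# Average over many runs to smooth out scheduling noise.
runs = 50
start = time.perf_counter()
for _ in range(runs):
    interpreter.invoke()
avg_ms = (time.perf_counter() - start) / runs * 1000
print(f"Average inference time: {avg_ms:.2f} ms")
```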

Profiling TensorFlow Lite models for Android

A look at profiling tools included in the recent release of TensorFlow Lite 1.0.

— by A Naveen Kumar

Running Artificial Neural Networks in Android using OpenCV

A step-by-step guide for building an artificial neural network (ANN) using OpenCV on Android.

— by Ahmed Gad

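The article targets the OpenCV Android SDK, but the ml module’s API maps closely to the Python bindings. Here’s a minimal, self-contained sketch of training an ANN_MLP on a toy XOR problem; the data and layer sizes are illustrative, not taken from the article:

```python
import cv2
import numpy as np

# Toy XOR dataset; OpenCV's ML module expects float32 row samples.
samples = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
responses = np.array([[0], [1], [1], [0]], dtype=np.float32)

ann = cv2.ml.ANN_MLP_create()
ann.setLayerSizes(np.array([2, 4, 1]))  # input, hidden, output layer sizes
ann.setActivationFunction(cv2.ml.ANN_MLP_SIGMOID_SYM)
ann.setTrainMethod(cv2.ml.ANN_MLP_BACKPROP)
ann.setTermCriteria((cv2.TERM_CRITERIA_MAX_ITER | cv2.TERM_CRITERIA_EPS, 1000, 1e-6))

ann.train(samples, cv2.ml.ROW_SAMPLE, responses)

_, predictions = ann.predict(samples)
print(predictions)  # one output value per input row
```

The same create/configure/train/predict flow applies in the Java bindings used on Android.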

Creating a 17 KB style transfer model with layer pruning and quantization

Learn how to drastically shrink the size of your machine learning models using pruning and quantization.

— by Jameson Toole

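The article uses layer pruning; as a rough sketch of a closely related workflow, here’s magnitude-based weight pruning plus quantization with the TensorFlow Model Optimization toolkit. The model file, sparsity targets, and training data are placeholders, and the article’s exact recipe differs:

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

# Placeholder: a trained Keras model to shrink.
model = tf.keras.models.load_model("style_transfer.h5")

# Wrap layers for magnitude-based weight pruning, ramping up to 75% sparsity.
schedule = tfmot.sparsity.keras.PolynomialDecay(
    initial_sparsity=0.0, final_sparsity=0.75, begin_step=0, end_step=2000)
pruned = tfmot.sparsity.keras.prune_low_magnitude(model, pruning_schedule=schedule)

pruned.compile(optimizer="adam", loss="mse")
# Fine-tune with the pruning callback so sparsity masks are updated each step:
# pruned.fit(train_ds, epochs=2,
#            callbacks=[tfmot.sparsity.keras.UpdatePruningStep()])

# Strip the pruning wrappers, then quantize weights during TFLite conversion.
final_model = tfmot.sparsity.keras.strip_pruning(pruned)
converter = tf.lite.TFLiteConverter.from_keras_model(final_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
with open("style_transfer_small.tflite", "wb") as f:
    f.write(converter.convert())
```

Pruning zeroes out weights so the model compresses well, and 8-bit quantization cuts the remaining weight storage further; together, those two ideas drive the dramatic size reductions described in the article.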

Editor’s Note: Heartbeat is a contributor-driven online publication and community dedicated to exploring the emerging intersection of mobile app development and machine learning. We’re committed to supporting and inspiring developers and engineers from all walks of life.

Editorially independent, Heartbeat is sponsored and published by Fritz AI, the machine learning platform that helps developers teach devices to see, hear, sense, and think. We pay our contributors, and we don’t sell ads.

If you’d like to contribute, head on over to our call for contributors. You can also sign up to receive our weekly newsletters (Deep Learning Weekly and Heartbeat), join us on Slack, and follow Fritz AI on Twitter for all the latest in mobile machine learning.