Edging Ahead When Learning On The Edge

“With the power of edge AI in the palm of your hand, your business will be unstoppable.”

That’s what the marketing for artificial intelligence companies seems to read like. Everyone seems to have cloud-scale AI-powered business-intelligence analytics at the edge. It sounds impressive, but we’re not convinced the marketing mumbo jumbo means anything. So what does AI on edge devices actually look like these days?

Being on the edge just means that the actual AI evaluation, and maybe even fine-tuning, runs locally on a user’s device rather than in some cloud environment. This is a double win, both for the business and for the user. Privacy is easier to preserve, since less information is transmitted back to a central location. Additionally, the AI can work in scenarios where a server somewhere might not be reachable or might not respond quickly enough.

Google and Apple have their own AI libraries: ML Kit and Core ML, respectively. There are tools to convert TensorFlow, PyTorch, XGBoost, and LibSVM models into formats that Core ML and ML Kit understand. Other solutions try to provide a platform-agnostic layer for training and evaluation. We’ve also previously covered TensorFlow Lite (TFL), a trimmed-down version of TensorFlow that has matured considerably since 2017.

For this article, we’ll be looking at PyTorch Live (PTL), a slimmed-down framework for adding PyTorch models to smartphone apps. Unlike TFL (which can run on a Raspberry Pi and in a browser), PTL is focused entirely on Android and iOS and offers tight integration there. It uses a React Native-backed environment, which means it is heavily geared toward the Node.js world.

No Cloud Required

Right now, PTL is very early. It runs on macOS (though without Apple Silicon support yet), and Windows and Linux compatibility is apparently forthcoming. It comes with a handy CLI that makes starting a new project relatively painless: after installation, a few commands take care of everything. The tutorial was straightforward, and soon we had a demo that could recognize numbers.
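As a rough sketch, the setup flow looks something like the following. The command names are taken from the PyTorch Live getting-started tutorial as it stood at the time of writing, and the project name is ours; the CLI is young and these may well change:

```shell
# Install the PyTorch Live CLI globally (requires Node.js and Yarn)
npm install -g torchlive-cli

# One-time setup of build dependencies (Android SDK, emulator, etc.)
npx torchlive-cli setup-dev

# Scaffold a new project and run it on an emulator
npx torchlive-cli init MyFirstProject
cd MyFirstProject
npx torchlive-cli run-android    # or run-ios on macOS
```

After `run-android` (or `run-ios`), the demo app appears on the emulator, and edits to the JavaScript live-reload without a rebuild.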

It was time to take the tutorial further and create a custom model. Using the letters split of the EMNIST dataset, we trained a ResNet-9 model with help from a handy GitHub repo. Once we had the model, it was simple enough to use the PyTorch utilities to export it for the lite interpreter. With some tweaks to the code (which live-reloads on the simulator), it recognized characters instead of numbers.
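The export step can be sketched with standard PyTorch mobile utilities. We’ve swapped in a tiny stand-in network here so the snippet is self-contained; in practice `model` would be the trained ResNet-9, and the file name is our own choice:

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# Stand-in for the trained ResNet-9 (EMNIST letters has 26 classes)
model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 26),
)
model.eval()

# Script the model, optimize it for mobile CPUs, and save it in the
# format the PyTorch Lite interpreter on the phone expects
scripted = torch.jit.script(model)
optimized = optimize_for_mobile(scripted)
optimized._save_for_lite_interpreter("emnist_letters.ptl")
```

The resulting `.ptl` file is what gets bundled into the app and loaded on-device.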

We suspect someone a little more steeped in the machine learning world would be able to take this further than we did. PTL has other exciting demos, such as on-device speech recognition and live video segmentation and recognition. Overall, the experience was smooth, and the scenarios we tried were relatively easy to implement.

If you’re already in the smartphone React Native world, PTL seems simple to integrate and use. Outside of that, a lot is left unsupported. TensorFlow Lite was similarly constrained when we first covered it and has since matured, gaining new platforms and features and becoming a powerful library. Ultimately, we’ll see what PyTorch Live grows into. There’s already support for GPUs and neural engines in the beta branch.

10 thoughts on “Edging Ahead When Learning On The Edge”

  1. If the animations are any indication, there’s something horribly wrong with the training or evaluation… Even a resnet9 should be enough to be pretty much at human level on just numbers… Nothing should be confusing that 4 with a 7 or 2…

  2. I bought a couple of “Edge Computing” boards/capes/shields/hats/whatever for the Raspberry Pi when Seeed introduced them a few years ago. I have yet to connect them.

  3. A win for users? It’s like having a fridge bot that drinks a pint of your milk every day to test that it’s still good and you haven’t run out yet. i.e. consumes your phone’s resources greedily with probably zero actual benefit.

    “But it’s better than having a milk inspector knock at your door twice a day to check it.” No, how about you butt the F out of my milk situation altogether.

    1. I got those fridge bots, not drinking milk but telling me I put it in the wrong way. And nagging me not to drink directly from the bottle. Then the bots multiplied and the milk requirements tripled. I don’t mind, only because they carry my surname.
