Machine Learning On Tiny Platforms Like Raspberry Pi And Arduino

Machine learning is showing up in all kinds of arenas lately, and the trend is likely to continue for the foreseeable future. What was once available only to operators of supercomputers has found use among anyone with a reasonably powerful desktop computer. The downsizing isn’t stopping there, though, as Microsoft is now pushing development of machine learning for embedded systems.

The Embedded Learning Library (ELL) is a set of tools that lets Arduinos, Raspberry Pis, and the like take advantage of machine learning algorithms despite their small size and reduced capability. Microsoft intends the library to be useful to anyone, and has examples available for things like computer vision, audio keyword recognition, and a small handful of other tasks. The library should be expandable to any application where machine learning would benefit a small embedded system, though, so it isn’t limited to these example applications.

There is one small speed bump to running a machine learning algorithm on your Raspberry Pi, though: the high processor load tends to make small SoCs overheat. But adding a heatsink and fan is something we’ve certainly seen before. Don’t let your lack of a supercomputer keep you from exploring machine learning if you see a benefit to it, and if you need more power than a single Raspberry Pi can offer, you can always build a cluster to get your task done just a little bit faster, too.

Thanks to [Baldpower] for the tip!

25 thoughts on “Machine Learning On Tiny Platforms Like Raspberry Pi And Arduino”

        1. I installed the OpenVINO update on 20th Dec – have you any idea how to get it going on the Raspberry Pi? Maybe your friend knows? I can’t see any relevant documentation yet.

  1. This is LITERALLY the opposite of the “Promise of Machine Learning”.
    We are SUPPOSED to be training on something powerful, so that we can run the result on something with as little power as possible.

    WHY would I want to train on an Arduino, when it takes hours or DAYS to train things on a multi-GPU CUDA setup?

    1. This is Micro$oft we are talking about, who are *literally* suing companies that don’t pay them ‘protection’ for using Linux. Whatever they do, it is to benefit M$ *AND* to screw someone else over. They have always been very efficient that way.

  2. Machine learning is a big umbrella, covering everything from computationally intensive methods like deep convolutional neural networks down to simple linear regression, which can be solved in one step. There are other algorithms that also aren’t computationally intensive. Any computer or phone can do “machine learning”; it’s the type that matters. And there is zero chance a Raspberry Pi can train deep neural networks effectively.
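
As a quick illustration of the lightweight end of that spectrum, here is a sketch of ordinary least-squares regression solved in one step via the normal equation; the data is made up for the example, and any Pi (or phone) handles this instantly:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus a little noise (made-up example data)
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=x.shape)

# Design matrix with a bias column, then the normal-equation solution
# theta = (X^T X)^-1 X^T y, computed in one step via lstsq for stability
X = np.column_stack([x, np.ones_like(x)])
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

slope, intercept = theta
print(f"fit: y = {slope:.2f}x + {intercept:.2f}")
```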

    1. I’m guessing this is just transfer learning, where they take a pre-trained neural network model and then only train the last layer on your particular data set. There are lots of limitations with this.
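
If that is the case, the usual recipe looks roughly like the sketch below, using Keras with MobileNetV2 as the frozen base. This is only an assumption about the approach, nothing specific to ELL, and the input size and class count are placeholders:

```python
import tensorflow as tf

# Pre-trained base network with its classifier head removed, weights frozen
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

# Only this small head gets trained on the new data set
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),  # 5 classes is a placeholder
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # your own data set here
```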

  3. We can already use OpenCV / TensorFlow / Caffe and other standard industry tools to run inference on a Raspberry Pi.
    The Movidius hardware is useful if you want faster inference performance on something like a Raspberry Pi.

    So what has Microsoft brought to the table that is useful and novel?
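
For comparison, inference with the existing tools on a Pi is already only a few lines. A rough sketch using OpenCV’s dnn module with a Caffe classification model (the file names and input size are placeholders for whatever model you actually use):

```python
import cv2

# Load a pre-trained Caffe model (paths are placeholders)
net = cv2.dnn.readNetFromCaffe("deploy.prototxt", "model.caffemodel")

# Prepare one frame: resize to the network's input size and subtract the mean
image = cv2.imread("frame.jpg")
blob = cv2.dnn.blobFromImage(image, scalefactor=1.0, size=(224, 224),
                             mean=(104, 117, 123))

net.setInput(blob)
scores = net.forward()              # one forward pass; a vector of class scores
print("top class:", scores.argmax())
```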

    1. Maybe it’s just an educational tool? Maybe it can create filters, like a Kalman filter, for combining data from different sources? There’s got to be some reason why this thing was created.
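
On the Kalman point, fusing a couple of noisy sources is cheap enough for any of this hardware. A minimal one-dimensional sketch (measurement update only, no motion model, and the noise figures are made up):

```python
# Minimal 1-D Kalman filter fusing noisy readings of the same quantity
def kalman_update(estimate, variance, measurement, meas_variance):
    gain = variance / (variance + meas_variance)   # how much to trust the new reading
    new_estimate = estimate + gain * (measurement - estimate)
    new_variance = (1 - gain) * variance
    return new_estimate, new_variance

estimate, variance = 0.0, 1.0
# Two sensors with different (made-up) noise levels reporting the same value
readings = [(5.2, 0.4), (4.9, 0.1), (5.4, 0.4), (5.0, 0.1)]
for measurement, meas_variance in readings:
    estimate, variance = kalman_update(estimate, variance, measurement, meas_variance)
    print(f"estimate = {estimate:.2f}, variance = {variance:.3f}")
```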

  4. Are there efficient ways to run a pre-trained network on a microcontroller? From my point of view the network could be represented as arrays, and that shouldn’t be too complicated to transfer to a microcontroller?
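
That intuition is basically right: once trained, a small network is just arrays of weights plus a few loops. Below is a sketch of a forward pass through one dense layer in plain Python; the weights are made up, and on a real microcontroller they would typically be baked in as const arrays in C, but the structure is the same:

```python
import math

# Made-up weights for a 3-input, 2-output dense layer (would come from training)
WEIGHTS = [[0.5, -0.2, 0.1],
           [0.3,  0.8, -0.5]]
BIASES = [0.1, -0.3]

def dense_layer(inputs, weights, biases):
    """One fully connected layer with a sigmoid activation."""
    outputs = []
    for row, bias in zip(weights, biases):
        total = bias + sum(w * x for w, x in zip(row, inputs))
        outputs.append(1.0 / (1.0 + math.exp(-total)))  # sigmoid
    return outputs

print(dense_layer([0.2, 0.7, 1.0], WEIGHTS, BIASES))
```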
