Machine learning is coming online in all kinds of arenas lately, and the trend is likely to continue for the foreseeable future. What was once available only to operators of supercomputers is now within reach of anyone with a reasonably powerful desktop computer. The downsizing isn’t stopping there, though: Microsoft is now pushing machine learning development for embedded systems.
The Embedded Learning Library (ELL) is a set of tools that lets Arduinos, Raspberry Pis, and the like take advantage of machine learning algorithms despite their small size and reduced capability. Microsoft intends the library to be useful to anyone, and provides examples for computer vision, audio keyword recognition, and a handful of other applications. The library should be expandable to any application where machine learning would benefit a small embedded system, though, so it isn’t limited to these examples.
There is one small speed bump to running a machine learning algorithm on your Raspberry Pi, though: the high processor load tends to make small SoCs overheat. But adding a heatsink and fan is something we’ve certainly seen before. Don’t let your lack of a supercomputer keep you from exploring machine learning if you see a benefit to it, and if you need more power than a single Raspberry Pi you can always build a cluster to get your task done just a little bit faster, too.
Thanks to [Baldpower] for the tip!
There’s also the AIY Vision kit.
Movidius is old; now it’s the Neural Compute Stick 2.
True, hopefully stick 2 will be ported to Raspberry Pi very soon.
It was actually ported to run on the Pi on December 19th, to my surprise. So you can already use it. I know PINTO already has. I’m planning on it soon.
I installed the OpenVINO update on 20th Dec – have you any idea how to get it going on the Raspberry Pi? Maybe your friend knows? I can’t see any relevant documentation yet.
Is it machine learning, or just inference?
It’s PR.
Proportional representation?
Just inference because the actual learning requires a lot of processing power. Inference is an outcome of learning though.
I know why I would bother to do this. $10 RPi Zero W > $79 Neural Compute Stick 2 for my budget. Game on!
It’s not at all clear to me how to use Arduino with this.
This is LITERALLY the opposite of the “Promise of Machine Learning”.
We are SUPPOSED to be training on something powerful, so that we can run the result on something with as little power as possible.
WHY would I want to train on an Arduino, when it takes hours or DAYS to train things on a multi-GPU CUDA setup?
I would even ask, why would they talk about Arduinos and Pis in the same sentence when they are in completely different classes?! I mean really, run a CNN on an Arduino??
To impress girls of course ;)
This is Micro$oft we are talking about, who are *literally* suing companies that don’t pay them ‘protection’ for using Linux. Whatever they do, it is to benefit M$ *AND* to screw someone else over. They have always been very efficient that way.
Machine learning is a big umbrella, covering everything from computationally intensive methods like deep convolutional neural networks down to simple linear regression, which can be solved in one step. There are other algorithms that aren’t computationally intensive either. Any computer or phone can do “machine learning” – it’s the type that matters. And there is zero chance a Raspberry Pi can train deep neural networks effectively.
I’m guessing this is just transfer learning, where they take a pre-trained neural network model and only train the last layer on your particular data set. There are lots of limitations with this.
I used some early neural network programs on an Amiga’s 8 MHz 68000, a 16-bit machine, so it should be possible to train anything. However, I’ll be using a Neural Compute Stick 2, thanks.
We can already use OpenCV / TensorFlow / Caffe etc standard industry tools and run inference on a Raspberry Pi.
The Movidius hardware is useful if you want faster inference performance on something like a Raspberry Pi.
So what has Microsoft brought to the table that is useful and novel?
Maybe it’s just an educational tool? Maybe it can create filters, like a Kalman filter, for combining data from different sources? There’s got to be some reason why this thing was created.
Are there efficient ways to run a pre-trained network on a microcontroller? From my point of view the network could be represented as arrays, so it shouldn’t be too complicated to transfer to a microcontroller.
An efficient algorithm on an Arduino will run faster than an inefficient one on an Intel multicore.
Did anyone really try it?
There’s a chip called Akida, due for release in 2020, that may be excellent for this. It consumes minuscule power. A software simulation has just been released to everyone. https://www.brainchipinc.com/products/akida-development-environment
https://www.brainchipinc.com/technology