I’ve always been fascinated by AI and machine learning. Google’s TensorFlow, with its wealth of tutorials, has been on my ‘to-learn’ list since it was first released, although I always seem to neglect it in favor of the shiniest new embedded platform.
Last July, I took note when Intel released the Neural Compute Stick. It looked like an oversized USB stick, and acted as an accelerator for local AI applications, especially machine vision. I thought it was a pretty neat idea: it allowed me to test out AI applications on embedded systems at a power cost of about 1W. It requires pre-trained models, but there are enough of them available now to do some interesting things.
I wasn’t convinced I would get great performance out of it, and forgot about it until last November when they released an improved version. Unambiguously named the ‘Neural Compute Stick 2’ (NCS2), it was reasonably priced and promised a 6-8x performance increase over its predecessor, so I decided to give it a try.
I took a few days off work around Christmas to set up Intel’s OpenVINO toolkit on my laptop. The installation script provided by Intel wasn’t particularly user-friendly, but it worked well enough and included several example applications I could use to test performance. I found that face detection with my webcam ran in near real-time (around 19 FPS), and pose detection at about 3 FPS. So in accordance with the holiday spirit, it knows when I am sleeping, and knows when I’m awake.
That was promising, but the NCS2 is marketed for AI processing on edge computing devices, so I set about installing the toolkit on a Raspberry Pi 3 Model B+ and compiling the application samples to see how it compared with previous methods. This turned out to be more difficult than I expected, and the main goal of this article is to share the process I followed and save some of you a little frustration.
Even though it might appear to be pretend Internet money, by design, there are a finite number of Bitcoins available. In the same way that the limited amount of gold on the planet and the effort required to extract it from the ground keeps prices high, the scarcity of Bitcoin is intended to make sure it remains valuable. As of right now, over 80% of all the Bitcoins that will ever exist have already been put into circulation. That sounds like a lot, but it’s expected to take another 100+ years to free up the remaining ones, so we’ve still got a way to go.
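The arithmetic behind that scarcity is easy to sketch. The constants below are the actual Bitcoin consensus parameters (a 50 BTC initial block reward, halving every 210,000 blocks); the script itself is just an illustration of why the supply converges, not part of this project.

```python
# Bitcoin's supply schedule: the block reward starts at 50 BTC and halves
# every 210,000 blocks, so total issuance is a geometric series that
# converges to (just under) 21 million coins.
BLOCKS_PER_HALVING = 210_000
INITIAL_REWARD = 50.0

def coins_after(eras: int) -> float:
    """Total BTC issued after `eras` complete halving periods."""
    return sum(BLOCKS_PER_HALVING * INITIAL_REWARD / 2**i for i in range(eras))

cap = coins_after(64)  # the reward effectively rounds to zero after ~64 halvings
print(f"hard cap: {cap:,.0f} BTC")

# Each era adds half as much as the one before, which is why most of the
# supply appears early while the tail takes another century to mine out:
for eras in (1, 2, 3):
    print(f"after {eras} era(s): {coins_after(eras) / cap:.1%} of all coins")
```

Note how the first two eras alone account for 75% of all coins ever, which is how the circulating share can already be past 80% while the final coins are still more than a century away.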
On the hardware side, this is a pretty simple project. The enclosure is laser cut 5 mm MDF, and it holds a Raspberry Pi 3, a MAX7219 32×8 LED dot matrix display, and a 10 mm white LED with accompanying resistor. The white LED is placed behind an acrylic diffuser to give the Bitcoin logo on the side of the display a soft pleasing glow when the device is powered up. There are no buttons or other controls on the ticker; once the software has been configured, it just gets plugged in and away it goes.
As for the software, it takes the form of a Python script [Jonty] has created which uses Requests and Beautiful Soup to scrape the relevant data from bitcoinblockhalf.com. The script can pull any of the 19 variables listed on the site and display it on the LED matrix; these range from truly nerdy stats like daily block generation to legitimately useful data points that anyone with some Bitcoin in their digital wallets might like to have ticking away on their desks.
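The scraping step is the heart of the script. [Jonty]’s version uses Requests and Beautiful Soup against the live site; as a rough stand-in, the sketch below parses a made-up HTML snippet with only the standard library, since I don’t know bitcoinblockhalf.com’s real markup and the sample values here are invented for illustration.

```python
from html.parser import HTMLParser

# Made-up snippet standing in for the site's stats table; the real page's
# structure and values will differ.
SAMPLE_HTML = """
<table>
  <tr><td>Total Bitcoins in circulation:</td><td>17,518,000</td></tr>
  <tr><td>Bitcoin price (USD):</td><td>$3,500.00</td></tr>
</table>
"""

class StatExtractor(HTMLParser):
    """Collects the text of every <td> cell, in document order."""
    def __init__(self):
        super().__init__()
        self.cells = []
        self.in_td = False

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_td = False

    def handle_data(self, data):
        if self.in_td and data.strip():
            self.cells.append(data.strip())

parser = StatExtractor()
parser.feed(SAMPLE_HTML)
# Pair up label cells with their neighboring value cells.
stats = dict(zip(parser.cells[0::2], parser.cells[1::2]))
print(stats["Total Bitcoins in circulation:"])  # -> 17,518,000
```

From there it’s just a matter of picking one entry from the dictionary and handing the string off to whatever drives the MAX7219 matrix.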
Vantablack is the darkest pigment ever created, capable of absorbing 99.96% of visible light. If you cover something in Vantablack, it turns into a visual black hole. No detail is visible, and physical objects become silhouettes. Objects covered in Vantablack are outside the human experience. The mammalian mind cannot comprehend a Vantablack object.
Vantablack is cool, but it’s also expensive, and its use in art is exclusively licensed to [Anish Kapoor]’s studio. Understandably, artists have rebelled, and they’re making their own Vantablack-like pigments. Now, the World’s Blackest Black is on Kickstarter. You can get a 150 ml bottle of Black 3.0, something that’s almost as black as Vantablack, for £10.
The pigment for Black 3.0 is called Black Magick, and yes, there was a version 2.0. The problem with the earlier version was that although the pigment was blacker than almost anything else, paint isn’t just pigment. You need binders. The new formulation uses a new acrylic polymer to hold the pigment, and ‘nano-mattifiers’ to make the paint none more matte.
What can you do with the blackest black paint you’ve ever seen? Well, taking pictures of an object covered in the blackest black is a tiny bit dumb. This is something that must be experienced in person. You could paint a car with it, which is something I really want to see. You could follow [Anish Kapoor] around in the shadows. Use it as a calibration target. Who knows what we’ll do with the almost-Vantablack when everyone has it.