But Can Your AI Recognize Slugs?

The common garden slug is a mystery. Observing these creatures as they slowly emerge from their slimy lairs each evening, it’s hard to imagine how much damage they can do. With paradoxical speed, they can mow down row after row of tender seedlings, leaving nothing but misery in their mucusy wake.

To combat this slug menace, [Tegwyn☠Twmffat] (the [☠] is silent) is developing this AI-powered slug-busting system. The squeamish, or those challenged by the ethics of slug eradication, can relax: no slugs have been harmed yet. So far [Tegwyn] has concentrated on the detection of slugs, a decidedly non-trivial problem since few existing AI models are trained on slugs.

So far, [Tegwyn] has acquired 5,712 images of slugs in their natural environment – no mean feat, as they only come out at night, they blend into their background, and their slimy surface makes for challenging reflections. The video below shows moderate success of the trained model using a static image of a slug; it also gives a glimpse at the hardware used, which includes an Nvidia Jetson TX2. [Tegwyn] plans to capture even more images to refine the model and boost it from the 50 to 60% confidence level to something that will allow for the remediation phase of the project, which apparently involves lasers. He's willing to entertain other methods of disposal, though; perhaps a salt-shooting turret gun?

This isn’t the first garden-tending project [Tegwyn] has tackled. You may recall The Weedinator, his 2018 Hackaday Prize entry. This slug buster is one of his entries for the 2019 Hackaday Prize, which was just announced. We’re looking forward to seeing the onslaught of cool new projects everyone will be coming up with.

Continue reading “But Can Your AI Recognize Slugs?”

Hands-On: New Nvidia Jetson Nano Is More Power In A Smaller Form Factor

Today, Nvidia released their next generation of small but powerful modules for embedded AI. It’s the Nvidia Jetson Nano, and it’s smaller, cheaper, and more maker-friendly than anything they’ve put out before.

The Jetson Nano follows the Jetson TX1, the TX2, and the Jetson AGX Xavier, all very capable platforms, but out of reach in physical size, price, and cost of implementation for many product designers and nearly all hobbyist embedded enthusiasts.

The Nvidia Jetson Nano Developer Kit clocks in at $99 USD, available right now, while the production-ready module will be available in June for $129. It’s the size of a stick of laptop RAM, and it only needs five watts. Let’s take a closer look with a hands-on review of the hardware.

Continue reading “Hands-On: New Nvidia Jetson Nano Is More Power In A Smaller Form Factor”

Uncertain Future Of Orphaned Jibo Robots Presents Opportunities

In our modern connected age, our devices have become far more powerful and useful now that they can draw upon the resources of a global data network. The downside of a cloud-connected device is the risk of becoming over-reliant on computers outside of our own control. The people who brought a Jibo into their home got a stark reminder of this fact when some (but not all) Jibo robots gave their owners a farewell message as their servers were shut down, leaving behind little more than a piece of desktop sculpture.

Jibo launched their Indiegogo crowdfunding campaign with the tagline “The World’s First Social Robot For The Home.” Full of promises of how Jibo would be an intelligent addition to a high-tech household, it always struggled to justify its price tag. It cost as much as a high-end robot vacuum, but without the house-cleaning utility. Many demonstrations of a Jibo’s capabilities centered around its voice control, which an Amazon Echo or Google Home could match at a fraction of the price.

By the end of 2018, all assets and intellectual property had been sold to SQN Venture Partners, who have said little about what they plan to do with their acquisition. Some Jibo owners still hold out hope for a bright future, both on the official forums (for however long those stay running) and on unofficial channels like Reddit. Other owners have given up and unplugged their participation in this social home robotics experiment.

If you see one of these orphans in your local thrift store for a few bucks, consider adopting it. You could join the group hoping for something interesting down the line, but you’re probably more interested in its hacking potential: there is an Nvidia Jetson inside, good for running neural networks. It's probably a Tegra K1 variant, because Jibo used the Jetson TK1 to develop the robot before launch. Jibo always promised a developer SDK for the rest of us to extend Jibo’s capabilities, but it never really materialized. The inactive GitHub repo mainly consists of code talking to servers that are now offline, with not much dealing directly with the hardware.

Jibo claimed thousands were sold and, if they start becoming widely available inexpensively, we look forward to a community working to give new purpose to these poor abandoned robots. If you know of anyone who has done a teardown to see exactly what’s inside, or if someone has examined upgrade files to create custom Jibo firmware, feel free to put a link in the comments and help keep these robots out of e-waste.

If you want to experiment with power-efficient neural network accelerators but would rather work with an officially supported development platform, we’ve looked at the Jetson TK1’s successors, the TX1 and TX2. More recently, Google has launched an accelerator of their own, as have our friends at Beaglebone.

NVIDIA’s A.I. Thinks It Knows What Games Are Supposed To Look Like

Videogames have always existed in a weird place between high art and cutting-edge technology. Their consumer-facing nature has always forced them to be both eye-catching and affordable, while remaining tasteful enough to sit on retail shelves (both physical and digital). Running in real-time is a necessity, so it’s not as if game creators are able to pre-render the incredibly complex visuals found in feature films. These pieces of software constantly ride the line between exploiting the hardware of the future and supporting the past, where their true user base resides. Each pixel formed and every polygon assembled comes out of the finite supply of floating-point operations that today’s silicon can deliver. Compromises must be made.

Environmental model textures are often among the first things in a game to fall victim to compromise. Maintaining a viable framerate is paramount to a game’s playability, and elements of the background can end up getting pushed to “the background”. The resulting environments look somewhat blurrier than they would have if artists had been given more time, or more computing resources, to optimize their creations. But what if you could update that ten-year-old game to take advantage of today’s processing capabilities and screen resolutions?

NVIDIA is currently using artificial intelligence to revise textures in many classic videogames to bring them up to spec with today’s monitors. Their neural network is able to fundamentally alter how a game looks without any human intervention. Is this a good thing?

Continue reading “NVIDIA’s A.I. Thinks It Knows What Games Are Supposed To Look Like”

Nvidia Transforms Standard Video Into Slow Motion Using AI

Nvidia is back at it again with another awesome demo of applied machine learning: artificially transforming standard video into slow motion – they’re so good at showing off what AI can do that anyone would think they were trying to sell hardware for it.

Though most modern phones and cameras have an option to record in slow motion, it often comes at the expense of resolution, and always at the expense of storage space. For really high frame rates you’ll need a specialist camera, and you often don’t know that you should be filming in slow motion until after an event has occurred. Wouldn’t it be nice if we could just convert standard video to slow motion after it was recorded?

That’s just what Nvidia has done, all nicely documented in a paper. At its heart, the algorithm must take two frames and artificially create one or more frames in between. This is not a manual algorithm that interpolates frames; this is a fully fledged deep-learning system. The Convolutional Neural Network (CNN) was trained on over a thousand videos – roughly 300k individual frames.

Since none of the parameters of the CNN are time-dependent, it’s possible to generate as many intermediate frames as required, something which sets this solution apart from previous approaches.  In some of the shots in their demo video, 30fps video is converted to 240fps; this requires the creation of 7 additional frames for every pair of consecutive frames.
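To make the mechanics concrete, here's a deliberately naive sketch, and emphatically not Nvidia's method: their network warps pixels along CNN-estimated optical flow, while this CUDA kernel just does a linear cross-fade between two frames. What it does show is why a time-parameterized interpolator can emit arbitrarily many in-between frames: generating the seven intermediate frames for a 30 fps to 240 fps conversion is just a loop over fractional timesteps t = 1/8 through 7/8.

```cuda
#include <cuda_runtime.h>

// Naive linear cross-fade between two frames at fractional time t in (0, 1).
// Illustration only: Nvidia's CNN instead *moves* pixels along estimated
// optical flow, avoiding the ghosting a cross-fade produces on moving objects.
__global__ void blendFrames(const unsigned char *f0, const unsigned char *f1,
                            unsigned char *out, int n, float t) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = (unsigned char)((1.0f - t) * f0[i] + t * f1[i] + 0.5f);
}

int main() {
    const int n = 1920 * 1080 * 3;  // one 1080p RGB frame, 8 bits per channel
    unsigned char *f0, *f1, *out;
    cudaMallocManaged(&f0, n);      // unified memory: visible to CPU and GPU
    cudaMallocManaged(&f1, n);
    cudaMallocManaged(&out, n);
    for (int i = 0; i < n; ++i) { f0[i] = 0; f1[i] = 255; }  // stand-in frames

    // 30 fps -> 240 fps: seven new frames between each consecutive pair.
    for (int k = 1; k <= 7; ++k) {
        blendFrames<<<(n + 255) / 256, 256>>>(f0, f1, out, n, k / 8.0f);
        cudaDeviceSynchronize();
        // ... hand 'out' off as the k-th intermediate frame ...
    }
    cudaFree(f0); cudaFree(f1); cudaFree(out);
    return 0;
}
```

The contrast with the paper's approach is the point: a cross-fade makes moving objects dissolve from one position into the next, whereas the learned model estimates where each pixel is headed and places it along that path, which is why its output looks like footage from a real high-speed camera.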

The video after the break is seriously impressive, though if you look carefully you can see the odd imperfection, like the hockey player’s skate or the dancer’s arm. Deep learning is as much an art as a science, and if you understood all of the research paper then you’re doing pretty darn well. For the rest of us, get up to speed by wrapping your head around neural networks and trying out the simplest TensorFlow example.

Continue reading “Nvidia Transforms Standard Video Into Slow Motion Using AI”

CUDA Is Like Owning A Supercomputer

The word supercomputer gets thrown around quite a bit. The original Cray-1, for example, operated at about 150 MIPS and had about eight megabytes of memory. A modern Intel i7 CPU can hit almost 250,000 MIPS and is unlikely to have less than eight gigabytes of memory, and probably has quite a bit more. Sure, MIPS isn’t a great performance number, but clearly, a top-end PC is way more powerful than the old Cray. The problem is, it’s never enough.

Today’s computers have to process huge numbers of pixels, video data, audio data, neural networks, and long-key encryption. Because of this, video cards have become what in the old days would have been called vector processors. That is, they are optimized to do operations on multiple data items in parallel. There are a few standards for using the video card’s processing power for computation, and today I’m going to show you how simple it is to use CUDA — the NVIDIA proprietary library for this task. You can also use OpenCL, which works with many different kinds of hardware, but I’ll show you that it is a bit more verbose.
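As a taste of how simple it can be, here's a minimal, self-contained CUDA vector addition. It's our own illustrative sketch rather than code from the article, but it shows the core pattern: write a kernel that handles one data element, then launch enough threads to cover them all.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread adds exactly one pair of elements: the "operations on
// multiple data items in parallel" model described above.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;  // a million elements
    float *a, *b, *c;
    // Unified ("managed") memory is visible to both CPU and GPU, so no
    // explicit cudaMemcpy calls are needed for this simple example.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = i; b[i] = 2.0f * i; }

    // Launch enough 256-thread blocks to cover all n elements.
    vectorAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();  // wait for the GPU before reading results

    printf("c[42] = %f\n", c[42]);  // expect 126.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Build it with nvcc (for example, nvcc add.cu -o add) and all million additions happen on the GPU; the equivalent OpenCL version needs noticeably more ceremony around platform, device, context, and queue setup, which is the verbosity mentioned above.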
Continue reading “CUDA Is Like Owning A Supercomputer”

Hackaday Links: Not A Creature Was Stirring, Except For A Trackball

Hey, did you know Hackaday is starting an Open Access, peer-reviewed journal? The Hackaday Journal of What You Don’t Know (HJWYDK) is looking for submissions detailing the tools, techniques, and skills that we don’t know, but should. Want to teach everyone how to make sand think? Write a paper and tell us about it! Send in your submissions here.

Have you noticed OSH Park updated their website?

The MSP430 line of microcontrollers is super cool, low power, and cheap. Occasionally, TI pumps out a few MSP430 dev boards and sells them for the rock-bottom price of $4.30. Here ya go, fam. This one is loaded up with the MSP430FR2433.

lol, Bitcoin this week.

Noisebridge, the San Francisco hackerspace and one of the first hackerspaces in the US, is now looking for a new place. Why, you may ask? Because San Francisco real estate. The going price per square foot is triple what their current lease provides. While we hope Noisebridge will find a new home, we’re really looking forward to the hipster restaurant that’s only open for brunch that will take its place.

The coolest soundcards, filled with DOS blips and bloops, were based on the OPL2 and OPL3 sound chips. If you want one of these things, you’re probably going to be digging up an old ISA SoundBlaster soundcard. The OPL2LPT is the classic sound card for computers that don’t have an easily-accessible ISA bus, like those cool vintage laptops. The 8-Bit Guy recently took a look at this neat piece of hardware, and apart from requiring a driver to work with any OPL2-compatible game, this thing actually works.

NVIDIA just did something amazing. They created a piece of hardware that everyone wants but isn’t used to turn electricity into heat and Bitcoin. This fantastic device, that is completely original and not at all derivative, is sold in the NVIDIA company store for under five dollars. Actually, the green logo silk/art on this PCB ruler is kinda cool, and I’d like to know how they did that. Also, and completely unrelated: does anyone want ten pounds of Digikey PCB rulers?