Jetson Nano Robot

[Stevej52] likes to build things you can’t buy, and this Jetson Nano robot falls well within that category. Reading the project details, you might think [Stevej52] drinks too much coffee. But we think he is just excited to have successfully pulled off the Herculean task of integrating over a dozen hardware and software modules. Very briefly, he is running Ubuntu and ROS on both the PC and the Nano. It is all tied together with Python code, and it uses Modbus over IP to solve a problem getting joystick data to the Nano. We like it when existing, standard protocols can be used, because it frees the designer to focus more on the application. Modbus has been around for 40 years and has widespread support across many languages and platforms.
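If you’re wondering what that joystick link might look like in code, here’s a minimal sketch of the idea, assuming the pymodbus library; the Nano’s address and the register layout are hypothetical stand-ins, not details from [Stevej52]’s build.

```python
# Minimal sketch: pushing joystick axes from the PC to the Nano as Modbus/TCP
# holding registers. Assumes pymodbus 3.x and a Modbus server already running
# on the Nano; the IP address and register map (0-1 for X/Y) are hypothetical.
from pymodbus.client import ModbusTcpClient

def axes_to_registers(x, y):
    """Scale joystick axes from [-1.0, 1.0] into unsigned 16-bit registers."""
    return [int((v + 1.0) * 32767.5) for v in (x, y)]

client = ModbusTcpClient("192.168.1.50", port=502)
client.connect()

# In the real robot these values would come from the PC's joystick driver.
x_axis, y_axis = 0.25, -0.5
client.write_registers(0, axes_to_registers(x_axis, y_axis))

client.close()
```

On the Nano side, the Python control loop would simply poll those registers and translate them into motor commands, which is exactly the kind of plumbing a 40-year-old protocol handles without fuss.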

This is an ongoing project, and we look forward to seeing more updates and especially more video of it in action like the one found below. With the recent release of a price-reduced Jetson Nano, which we covered last week, this might be an excellent project to take on.

Continue reading “Jetson Nano Robot”

NVIDIA Announces $59 Jetson Nano 2GB, A Single Board Computer With Makers In Mind

NVIDIA kicked off their line of GPU-accelerated single board computers back in 2014 with the Jetson TK1, a $200 development system for those looking to get involved with the burgeoning world of so-called “edge computing”. It was designed to put high-performance computing in a package small and energy-efficient enough that it could be integrated directly into products, rather than connecting to a data center halfway across the world.

The TK1 was an impressive piece of hardware, but not something the hacker and maker community was necessarily interested in. For one thing, it was fairly expensive. But perhaps more importantly, it was clearly geared more towards industry types than consumers. We did see the occasional project using the TK1 and the subsequent TX1 and TX2 boards, but they were few and far between.

Then came the Jetson Nano. Its 128-core Maxwell GPU still packed plenty of power and was fully compatible with NVIDIA’s CUDA architecture, but its smaller size and $99 price tag made it far more attractive for hobbyists. According to the company’s own figures, the number of active Jetson developers has more than tripled since the Nano’s introduction in March of 2019. With the platform accessible to a larger and more diverse group of users, new and innovative applications for machine learning started pouring in.

Cutting the price of the entry level Jetson hardware in half was clearly a step in the right direction, but NVIDIA wanted to bring even more developers into the fray. So why not see if lightning can strike twice? Today they’ve officially announced that the new Jetson Nano 2GB will go on sale later this month for just $59. Let’s take a close look at this new iteration of the Nano to see what’s changed (and what hasn’t) from last year’s model.

Continue reading “NVIDIA Announces $59 Jetson Nano 2GB, A Single Board Computer With Makers In Mind”

Identifying Creatures That Go Chirp In The Night

It’s common knowledge that bats navigate and search for their prey using echolocation, but did you know that the ultrasonic chirps made by different species of bats are distinct enough to be used for identification? [Tegwyn☠Twmffat] did, which is why he came up with this impressive device capable of cataloging the different bats flying around at night.

Now this might seem like an odd gadget to have, but if you’re in the business of wildlife conservation, it’s not hard to imagine how this sort of capability might be useful. This device could be used to easily estimate the size and diversity of bat populations in a particular area. [Tegwyn☠Twmffat] also mentions that, at least in theory, the core concept should work with other types of noisy critters like rodents or dolphins.

Powered by the NVIDIA Jetson Nano, the unit listens with a high-end ultrasonic microphone for the telltale chirps of bats. These are then processed by the software and compared to a database of samples that [Tegwyn☠Twmffat] personally collected in local nature reserves. In the video after the break, you can also see how he uses a set of house keys jingling as a control to make sure the system is running properly.
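[Tegwyn☠Twmffat]’s exact classifier isn’t spelled out, but a minimal sketch of the general approach (fingerprint each recording, then match it against the reference database) might look like this. The library choice (librosa), the species list, and the file names are all illustrative assumptions.

```python
# Sketch: matching a recorded chirp against a database of reference calls
# using MFCC fingerprints and nearest-neighbour matching. Assumes librosa
# and clips recorded at an ultrasound-capable sample rate (e.g. 384 kHz).
import numpy as np
import librosa

def chirp_fingerprint(path):
    """Average MFCC vector for one recording, keeping its native sample rate."""
    audio, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

# Hypothetical reference database: one representative clip per species.
database = {
    "common pipistrelle": chirp_fingerprint("pipistrelle.wav"),
    "noctule": chirp_fingerprint("noctule.wav"),
}

def identify(path):
    """Return the species whose fingerprint lies closest to the recording."""
    query = chirp_fingerprint(path)
    return min(database, key=lambda s: np.linalg.norm(database[s] - query))

print(identify("unknown_chirp.wav"))
```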

The Intelligent Wildlife Species Detector won the Train All the Things contest back in April, and we’re eager to see how it will fare as the competition heats up in the 2020 Hackaday Prize.

Continue reading “Identifying Creatures That Go Chirp In The Night”

Checking In On The Damn Linux Tablet One

Tablets, slates, phones, and phablets: there is no shortage of electronics that take the Star-Trek-ish form factor of a handheld rectangle of glass that connects you to everything. This is the world we live in, but unfortunately it’s not currently a world with many Linux options, and certainly not one that includes modular design concepts. This is what motivated [Timon] to design the Damn Linux Tablet one, a “Proper Linux Tablet” built around the NVIDIA Jetson Nano board.

The design really took off, because who isn’t interested in the ability to upgrade and customize a tablet? During last year’s Hackaday Supercon we caught up with [Timon] for an interview the morning after he won the Best Design prize for DLT one. Check out that video below, then join us after the break for an update on the latest from the project.

There’s only one week left to get your project entered in the 2020 Hackaday Prize. We won’t know this year’s winners until the Hackaday Remoticon rolls around this November. The Call for Proposals for that virtual conference is still open!

[Timon] is realistic about the limits of modular design. He readily admits you’re not going to upgrade a graphics card on a mobile device, but when it comes to the peripherals, why not? You might want to choose between micro-USB, USB-C, barrel-jack, or do something completely custom. One hacker’s NFC equipment might be replaced by another’s SDR or LoRa. This tablet design sees a world where connecting PCIe components to your mobile devices is completely doable. The point is to make a base model that works great, but has the potential to be what each different user wants their device to be.

Continue reading “Checking In On The Damn Linux Tablet One”

Machine Learning Takes The Embarrassment Out Of Videoconference Wardrobe Malfunctions

Telecommuters: tired of the constant embarrassment of showing up to video conferences wearing nothing but your underwear? Save the humiliation and all those pesky trips down to HR with Safe Meeting, the new system that uses the power of artificial intelligence to turn off your camera if you forget that casual Friday isn’t supposed to be that casual.

The following infomercial is brought to you by [Nick Bild], who says the whole thing is tongue-in-cheek but we sense a certain degree of “necessity is the mother of invention” here. It’s true that the sudden throng of remote-work newbies certainly increases the chance of videoconference mishaps and the resulting mortification, so whatever the impetus, Safe Meeting seems like a great idea. It uses a Pi cam connected to a Jetson Nano to capture images of you during videoconferences, which are conducted over another camera. The stream is classified by a convolutional neural net (CNN) that determines whether it can see your underwear. If it can, it makes a REST API call to the conferencing app to turn off the camera. The video below shows it in action, and that it douses the camera quickly enough to spare your modesty.
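The write-up boils down to a simple loop: grab a frame, run it through the CNN, and call the conferencing app if the classifier trips. Here’s a rough sketch of that loop; the model file, input size, threshold, and REST endpoint are all hypothetical placeholders rather than [Nick]’s actual code.

```python
# Sketch of the Safe Meeting loop: grab a frame from the Pi cam, classify it,
# and hit the conferencing app's REST API if underwear is detected. The model
# file, class threshold, and API endpoint are hypothetical stand-ins.
import cv2
import torch
import requests

model = torch.jit.load("underwear_cnn.pt").eval()  # hypothetical trained CNN
camera = cv2.VideoCapture(0)  # the Pi cam watching the user

while True:
    ok, frame = camera.read()
    if not ok:
        continue
    # Resize and normalize the frame to match the network's expected input.
    resized = cv2.resize(frame, (224, 224))
    tensor = torch.from_numpy(resized).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        underwear_prob = torch.sigmoid(model(tensor)).item()
    if underwear_prob > 0.9:
        # Hypothetical endpoint; the real call depends on the conferencing app.
        requests.post("http://localhost:8000/api/camera", json={"enabled": False})
```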

We shudder to think about how [Nick] developed an underwear-specific training set, but we applaud him for doing so and coming up with a neat application for machine learning. He’s been doing some fun work in this space lately, from monitoring where surfaces have been touched to a 6502-based gesture recognition system.

Continue reading “Machine Learning Takes The Embarrassment Out Of Videoconference Wardrobe Malfunctions”

Robotic Skin Sees When (and How) You’re Touching It

Cameras are getting less and less conspicuous. Now they’re hiding under the skin of robots.

A team of researchers from ETH Zurich in Switzerland has recently created a multi-camera optical tactile sensor that is able to monitor the space around it based on contact force distribution. The sensor uses a stack-up involving a camera, LEDs, and three layers of silicone to optically detect any disturbance of the skin.

The scheme is modular and in this example uses four cameras, but it can be scaled up from there. During manufacture, the camera and LED circuit boards are placed and a layer of firm silicone is poured to about 5 mm in thickness. Next, a 2 mm layer doped with spherical particles is poured, followed by a final 1.5 mm layer of black silicone. The cameras track the particles as they move and use that information to infer the deformation of the material and the force applied to it. The sensor is also able to reconstruct the forces causing the deformation and create a contact force distribution. The demo uses fairly inexpensive cameras — Raspberry Pi cameras monitored by an NVIDIA Jetson Nano Developer Kit — that in total provide about 65,000 pixels of resolution.
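The core tracking idea is easy to prototype on a single camera: compare each frame against an undeformed reference and treat the particles’ apparent motion as a proxy for deformation. The sketch below does exactly that with OpenCV’s dense optical flow; the real sensor goes further and maps the motion to a force distribution with a neural network.

```python
# Sketch: per-pixel particle displacement relative to an undeformed reference
# frame, using dense optical flow. This only recovers the raw displacement
# field; mapping it to forces is the job of the sensor's trained network.
import cv2
import numpy as np

camera = cv2.VideoCapture(0)  # stand-in for one of the four internal cameras

ok, reference = camera.read()
reference_gray = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = camera.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Dense optical flow: per-pixel (dx, dy) motion of the embedded particles.
    flow = cv2.calcOpticalFlowFarneback(reference_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    displacement = np.linalg.norm(flow, axis=2)
    print(f"peak particle displacement: {displacement.max():.2f} px")
```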

Apart from providing more information about the forces applied to a surface, the sensor also has a larger contact surface and is thinner than other camera-based systems, since it doesn’t require the use of reflective components. It regularly recalibrates itself using a convolutional neural network pre-trained with data from three cameras and updated with data from all four. Possible future applications include soft robotics and improving touch-based sensing with the aid of computer vision algorithms.

While self-aware robotic skins may not be on the market quite so soon, this certainly opens up the possibility of robots that can detect when too much force is being applied to their structures — the machine equivalent of pain.

Continue reading “Robotic Skin Sees When (and How) You’re Touching It”

Name Stone Helps You Greet Coworkers

When starting a new job, learning your coworkers’ names can be a daunting task, and getting it right is key to forming strong professional relationships. [Ahad] noted that [Marcos] was struggling with this, so he built the Name Stone to help.

The Name Stone consists of some powerful hardware, wrapped up in a 3D printed case reminiscent of the Eye of Agamotto from Doctor Strange. Inside, there’s a Jetson Nano – an excellent platform for any project built around machine learning tasks. This is combined with a microphone and camera to collect data from the environment.

[Ahad] then went about training neural networks to help with the basic identification tasks. Video was taken of the coworkers, and the frames were used to train a convolutional neural network using PyTorch. Similarly, a series of audio clips was used to train a second network to identify individuals by the sound of their voice, using MFCC (mel-frequency cepstral coefficient) features. Upon activating the stone, the device captures an image or a short sound clip, then processes the data to identify the target coworker and remind [Marcos] of their name.
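The write-up doesn’t publish the exact networks, but the voice side might plausibly look something like the following PyTorch sketch: MFCC features feeding a small classifier with one output per coworker. The layer sizes, sample rate, and label set are illustrative assumptions.

```python
# Sketch of a voice-identification path: MFCC features into a small classifier,
# one output per coworker. Sizes and names are illustrative; in practice the
# weights would come from training on [Ahad]'s recorded clips.
import torch
import torch.nn as nn
import torchaudio

COWORKERS = ["Marcos", "Ahad", "Dana"]  # hypothetical label set

mfcc = torchaudio.transforms.MFCC(sample_rate=16000, n_mfcc=40)

classifier = nn.Sequential(
    nn.Flatten(),
    nn.Linear(40 * 81, 128),  # 81 MFCC frames covers one second at 16 kHz
    nn.ReLU(),
    nn.Linear(128, len(COWORKERS)),
)

def identify(wav_path):
    """Name the coworker whose voice best matches a one-second clip."""
    waveform, sr = torchaudio.load(wav_path)
    # Resample and trim to exactly one second (clips are assumed >= 1 s long).
    waveform = torchaudio.functional.resample(waveform, sr, 16000)[:, :16000]
    features = mfcc(waveform).unsqueeze(0)  # (batch, channel, n_mfcc, frames)
    with torch.no_grad():
        logits = classifier(features)
    return COWORKERS[logits.argmax().item()]
```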

It’s a project that could be quite useful, given to new employees to help them transition into the new workplace. Of course, pervasive facial recognition technology does have some drawbacks. Video after the break.

Continue reading “Name Stone Helps You Greet Coworkers”