Every appliance business wants to be the one that invents the patented, license-able, and profitable standard that all the other companies have to use. Open Source Kitchen wants to beat them to it.
Every nascent standard needs a test case, and OSK's is a simple one: a bowl that tracks what you eat. The bowl itself is a simple concept; the real goal is the way its data is shared, tracked, logged, and communicated.
The current demo uses an Nvidia Jetson Nano as its processing center. This $100 US board packs a bit of a punch in its weight class. It processes video from a camera held above the bowl of fruit, which hangs from a scale in a squirrel-shaped hanger, to determine calories in and calories out.
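OSK hasn't published the demo code, but the vision half of such a bowl might be sketched roughly like this, with a stock pretrained classifier standing in for whatever model the demo actually runs. The calorie table and camera index here are purely illustrative.

```python
# Hypothetical sketch of the bowl's vision side: classify what the
# camera sees, then look up calories. The model, labels, and calorie
# table are stand-ins, not OSK's actual pipeline.
import cv2
import torch
from torchvision import models, transforms

CALORIES_PER_100G = {"banana": 89, "orange": 47}  # illustrative table

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

model = models.mobilenet_v2(pretrained=True).eval()

cap = cv2.VideoCapture(0)            # USB camera looking down at the bowl
ok, frame = cap.read()
cap.release()
if ok:
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(rgb).unsqueeze(0))
    class_idx = int(logits.argmax())
    # Map class_idx to an ImageNet label, then to CALORIES_PER_100G,
    # and multiply by the weight reported by the scale.
```

Combine the classification with the weight from the scale, and calories in versus calories out becomes simple bookkeeping.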
It’s an interesting idea. One wonders how the IoT boom might have played out if there had been a widespread standard ready to go before people started walling their gardens.
At first sight, [Kyle]'s Elroy lamp is simply an attractive piece of modern-styled interior furnishing, its clean lines, wood grain, and contemporary patterning an asset to the room. But when he pulls out his phone, things change, because this lamp hides a secret: at its heart may be a standard LED bulb, but the shade conceals four LCD screens driven by an Nvidia Jetson. These can be controlled through a web app to display a variety of textures, completing the effect.
This is not, however, simply a set of laptop screens bolted to a lampshade. The screens started life in laptops sure enough, but have since had their reflective backing removed to create transparent LCD panels. Then an appropriate diffuser had to be found, which after much experimentation became a composite of more than one textured paper. Finally, the whole assembly was enclosed in an attractive wooden lamp frame and became part of the furniture. We like it, both as an aesthetically pleasing lamp and as a genuine departure from the norm.
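[Kyle] hasn't shared his control code, but the web-app side could be as small as this hypothetical Flask sketch, where display_texture() and the texture names stand in for whatever actually pushes an image to the four panels.

```python
# Hypothetical sketch of the lamp's web control: a tiny Flask app that
# swaps which texture the Jetson draws on the LCD panels.
# display_texture() and the texture names are stand-ins, not [Kyle]'s code.
from flask import Flask, abort

app = Flask(__name__)
TEXTURES = {"linen", "walnut", "frosted"}   # illustrative names

def display_texture(name):
    # Stand-in for whatever renders the image full-screen on the panels.
    print(f"now showing: {name}")

@app.route("/texture/<name>", methods=["POST"])
def set_texture(name):
    if name not in TEXTURES:
        abort(404)
    display_texture(name)
    return {"texture": name}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```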
Thanks to the wonders of neural networks and machine learning algorithms, it’s now possible to do things that were once thought to be inordinately difficult to achieve with computers. It’s a combination of the right techniques and piles of computing power that make such feats doable, and [Robert Bond’s] ant zapping project is a great example.
The project is based around an NVIDIA Jetson TK1, a system that brings the processing power of a modern GPU to an embedded platform. It's fitted with a USB camera that scans its field of view for ants. Once an ant is detected, thanks to a little OpenCV magic, its coordinates are passed to the laser system. Twin stepper motors spin mirrors that direct the light from a 5 mW red laser onto the target. If you're thinking of working on something like this, we highly recommend using galvos to direct the laser.
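The write-up doesn't walk through the detection code, but spotting small dark blobs on a lighter floor is classic OpenCV territory. A minimal sketch might look like this, with point_laser() standing in for the stepper-and-mirror steering and the thresholds picked purely for illustration.

```python
# Hypothetical sketch of the ant-spotting step: find small dark blobs
# in each camera frame and report their centroids. The real project
# runs OpenCV on a Jetson TK1; all thresholds here are illustrative.
import cv2

def point_laser(x, y):
    # Stand-in: in the real build, twin steppers spin mirrors to aim the laser.
    print(f"ant at ({x}, {y})")

def find_ants(frame, min_area=5, max_area=200):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Ants read as dark specks on a lighter floor, so invert the threshold.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    # OpenCV 4 signature; OpenCV 3 returns an extra value first.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    ants = []
    for c in contours:
        if min_area < cv2.contourArea(c) < max_area:
            m = cv2.moments(c)
            ants.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return ants

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    for (x, y) in find_ants(frame):
        point_laser(x, y)
cap.release()
```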
Such a system could readily vaporize ants if fitted with a more powerful laser, but [Robert] decided to avoid this for safety reasons. Plus, the smell wouldn’t be great, and nobody wants charred insect residue all over the kitchen floor anyway. We’ve seen AIs do similar work, too – like detecting naughty cats for security reasons.
Telepresence is one of those futuristic buzzwords that's popped up a few times over the decades, promising the ability to attend a meeting in New York City and another in Tokyo an hour later, all without having to leave the comfort of your home or office. This is the premise of the Double 3, Double Robotics' most recent entry in this market segment and the commercial counterpoint to more DIY offerings.
More than just a glorified tablet screen.
The robot looks like a tablet perched on top of a Segway. Its dual built-in 13-megapixel cameras allow the operator to get a good look at their surroundings, while the six beamforming microphones should theoretically pick up any conversation in a meeting or on the work floor.
Battery life is limited to 4 hours, and it takes 2 hours to recharge the built-in battery. Fortunately one can just hop over to another, freshly charged Double 3 if the battery runs out. Assuming the $3,999 price tag doesn’t get in the way of building up a fleet of them, anyway.
Probably the most interesting aspect of the product is its self-driving feature, which brings with it a whole range of sensors and cameras, including Intel RealSense D430 stereo depth sensors. To process all of this sensor data, the system is equipped with an Nvidia Jetson TX2 ARM board running Ubuntu Linux, which also renders the mixed-reality UI for the user, complete with waypoints and other information.
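Double Robotics' firmware is closed, but reading one of those RealSense depth sensors from Python is straightforward with Intel's official pyrealsense2 bindings. A rough sketch follows; the resolution and frame rate are illustrative, not necessarily what the Double 3 uses.

```python
# Rough sketch of reading an Intel RealSense depth stream with the
# official pyrealsense2 bindings. Stream settings are illustrative.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    # Distance (in metres) to whatever is directly ahead of the sensor.
    centre = depth.get_distance(320, 240)
    print(f"obstacle at {centre:.2f} m")
finally:
    pipeline.stop()
```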
Currently, Double Robotics is accepting sign-ups for the private beta of the Double 3 API, which gives developers access to the sensor data and various autonomous features of the Double 3's hardware. Double Robotics co-founder [Marc DeVidts] told Hackaday that he is looking forward to seeing what people build with it. Hopefully this time people will not simply take the thing for a joyride, as happened with a predecessor of the Double 3.
Found yourself with a shiny new NVIDIA Jetson Nano, but tired of having it slide around your desk whenever cables get yanked? You need a stand! If only there were a convenient repository of options that anyone could print out to attach this hefty single-board computer to nearly anything. But wait, there is! [Madeline Gannon]'s accurately named jetson-nano-accessories repository supports a wider range of mounting options than you might expect, with modular interconnect-ability to boot!
A device like the Jetson Nano is a pretty incredible little System on Module (SOM), more so when you consider that it can be powered by a boring USB battery. Mounted to NVIDIA's default carrier board, the entire assembly is quite a bit bigger than something like a Raspberry Pi. With a huge amount of computing power and an obvious proclivity for real-time computer vision, the Nano is a device that wants to go out into the world! Enter these accessories.
At their core is an easily printable slot-and-tab modular interlock system which facilitates a wide range of attachments. Some bolt the carrier board to a backplate (like the gardening spike). Others incorporate clips to hold everything together and hang onto a battery and bicycle. And yes, there are boring mounts for desks, tripods, and more. Have we mentioned we love good documentation? Click into any of the mount types to find more detailed descriptions, assembly directions, and even dimensioned drawings. This is a seriously professional collection of useful kit.
We live in an exciting time of machine intelligence. Over the past few months, several products have been launched offering neural network processors at a price within hobbyist reach. But as exciting as the hardware might be, they still need software to be useful. Nvidia was not content to rest on their impressive Jetson hardware and has created a software framework to accelerate building robots around them. Anyone willing to create a Nvidia developer account may now play with the Isaac Robot Engine framework.
Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware, but the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with the Robot Operating System (ROS), Isaac will look very familiar: both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMs in Isaac, are tailored to the strengths of Nvidia Jetson hardware. In addition to the modules and the ways for them to work together, Isaac also includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.
While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful of the two, a commercially built machine rolling on Segway motors, with LIDAR environmental sensors and a Jetson Xavier. More interesting to us is the Kaya (pictured), a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses its environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings are a capable and functional package for exploring intelligent autonomous robots.
It is somewhat disappointing that Nvidia decided to create their own proprietary software framework, reinventing many wheels instead of contributing to ROS. While there are some very appealing features, like WebSight (a browser-based inspect-and-debug tool), at first glance Isaac doesn't seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem, or who face a time-to-market deadline, will appreciate having the option of a pre-packaged solution like Isaac.
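For comparison, the "connecting common modules" model on the ROS side looks roughly like this minimal rospy node, a self-contained module that talks to the rest of the robot over a named topic. The topic name and message here are illustrative.

```python
# Minimal rospy publisher, roughly the ROS counterpart of an Isaac GEM:
# a self-contained module wired to the rest of the robot via a topic.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("jetson_demo_node")
    pub = rospy.Publisher("/status", String, queue_size=10)
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="jetson module alive"))
        rate.sleep()

if __name__ == "__main__":
    main()
```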
Today, Nvidia released their next generation of small but powerful modules for embedded AI. It’s the Nvidia Jetson Nano, and it’s smaller, cheaper, and more maker-friendly than anything they’ve put out before.
The Jetson Nano follows the Jetson TX1, the TX2, and the Jetson AGX Xavier, all very capable platforms, but just out of reach in physical size, price, and cost of implementation for many product designers and nearly all hobbyist embedded enthusiasts.
The Nvidia Jetson Nano Developer Kit clocks in at $99 USD and is available right now, while the production-ready module will be available in June for $129. It's the size of a stick of laptop RAM, and it only needs five watts. Let's take a closer look with a hands-on review of the hardware.