Open Source Kitchen Helps You Watch What You Eat

Every appliance business wants to be the one that invents the patented, licensable, and profitable standard that all the other companies have to use. Open Source Kitchen wants to beat them to it.

Every new standard needs a test case, and OSK's is a simple one: a bowl that tracks what you eat. The bowl itself is a simple concept; the real goal is the way the data is shared, tracked, logged, and communicated.

The current demo uses an Nvidia Jetson Nano as its processing center. This $100 US board packs a bit of a punch in its weight class. It processes video from a camera positioned above the bowl of fruit, which hangs from a scale inside a squirrel-shaped hanger, to determine calories in and calories out.
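The project's actual code isn't shown here, but the core loop is easy to imagine: weigh the bowl, identify what's in it, and log the difference. Here's a minimal Python sketch of that idea, with a placeholder scale reader and fruit classifier standing in for the real load cell driver and the vision model running on the Jetson:

```python
# Conceptual sketch only: the scale driver and fruit classifier below are
# placeholders, not the actual Open Source Kitchen code.

# Rough calorie densities (kcal per gram) for a few fruits.
KCAL_PER_GRAM = {"apple": 0.52, "banana": 0.89, "orange": 0.47}

def read_scale_grams():
    """Placeholder for the real load-cell driver; returns bowl weight in grams."""
    return 850.0

def classify_fruit(frame):
    """Placeholder for the Jetson's vision model; returns a fruit label."""
    return "apple"

def log_change(previous_grams, frame):
    """Compare the current weight to the last reading and log calories in or out."""
    current = read_scale_grams()
    delta = current - previous_grams          # negative means something was removed
    fruit = classify_fruit(frame)
    kcal = abs(delta) * KCAL_PER_GRAM.get(fruit, 0.5)
    direction = "out" if delta < 0 else "in"
    print(f"{fruit}: {abs(delta):.0f} g, ~{kcal:.0f} kcal {direction}")
    return current

if __name__ == "__main__":
    weight = log_change(1000.0, frame=None)   # frame would come from the camera
```

The interesting part, as the project points out, isn't this loop at all but how the resulting log gets shared and communicated between devices.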

It’s an interesting idea. One wonders how the IoT boom might have played out if there had been a widespread standard ready to go before people started walling their gardens.

STEP Up Your Jetson Nano Game With These Printable Accessories

Found yourself with a shiny new NVIDIA Jetson Nano but tired of having it slide around your desk whenever cables get yanked? You need a stand! If only there were a convenient repository of options that anyone could print out to attach this hefty single-board computer to nearly anything. But wait, there is! [Madeline Gannon]'s accurately named jetson-nano-accessories repository supports a wider range of mounting options than you might expect, with modular interconnectability to boot!

A device like the Jetson Nano is a pretty incredible little System On Module (SOM), more so when you consider that it can be powered by a boring USB battery. Mounted to NVIDIA's default carrier board, the entire assembly is quite a bit bigger than something like a Raspberry Pi. With a huge amount of computing power and an obvious proclivity for real-time computer vision, the Nano is a device that wants to go out into the world! Enter these accessories.

At their core is an easily printable slot-and-tab modular interlock system which facilitates a wide range of attachments. Some bolt the carrier board to a backplate (like the gardening spike). Others incorporate clips to hold everything together and hang onto a battery or a bicycle. And yes, there are boring mounts for desks, tripods, and more. Have we mentioned we love good documentation? Click into any of the mount types to find more detailed descriptions, assembly directions, and even dimensioned drawings. This is a seriously professional collection of useful kit.

Fake Graphics Cards And How To Fix Them

When shopping online, there are plenty of great deals out there on modern graphics hardware. Of course, if you're like [Dawid] and bought a GTX1050 Ti for $48 from Wish, you probably suspect it's too good to be true. And you'd be correct.

[Dawid] notes from the outset that the packaging the card ships in is unusual. While it's covered in NVIDIA and GeForce branding, there's no mention of the model number or even the overarching series. The card is loosely packed in bubble wrap, free to bounce around in transit. Upon installation, the card reports itself as a GTX1050 Ti, but refuses to work properly with NVIDIA drivers and routinely causes a Blue Screen of Death.

Upon disassembly, it becomes apparent that the card is merely a poorly manufactured GTS450 Revision 2, over five generations older than the card it was advertised as. Thanks to the mismatch between the actual hardware and what the card reports itself to be, the drivers are unable to work properly with the card.

For those who have been scammed, there is some hope. [Phil] has had experience with several of these cards, which similarly misreport their actual hardware. To correct this, the cards need to have their BIOS flashed to reflect reality, but the fake cards don't work with NVIDIA's NVFlash tool. Instead, they must be flashed manually using an EEPROM programmer. Once the cards are flashed with an appropriate BIOS, they can be used with the correct drivers and will function properly, albeit with much less performance than was advertised.
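If you've dumped a suspect card's ROM with an EEPROM programmer, it's worth checking which silicon the BIOS actually targets before reflashing anything. The PCI expansion ROM format stores the vendor and device IDs in its PCI data structure, so a short Python script can pull them out of the dump; the script below is our own sketch, not part of [Phil]'s toolchain:

```python
import struct
import sys

def rom_ids(path):
    """Read the PCI vendor and device IDs from a dumped expansion ROM image."""
    with open(path, "rb") as f:
        rom = f.read()
    if rom[0:2] != b"\x55\xAA":
        raise ValueError("missing 0x55AA expansion ROM signature")
    # Offset 0x18 holds a little-endian pointer to the PCI data structure.
    pcir_offset = struct.unpack_from("<H", rom, 0x18)[0]
    if rom[pcir_offset:pcir_offset + 4] != b"PCIR":
        raise ValueError("PCI data structure not found")
    vendor_id, device_id = struct.unpack_from("<HH", rom, pcir_offset + 4)
    return vendor_id, device_id

if __name__ == "__main__":
    vid, did = rom_ids(sys.argv[1])   # e.g. python rom_ids.py dump.rom
    print(f"vendor 0x{vid:04x}, device 0x{did:04x}")
    # 0x10de is NVIDIA; look the device ID up to see which GPU the BIOS really targets.
```

A device ID that doesn't match the GPU the card claims to be is a strong hint you're looking at one of these relabeled boards.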

It’s an interesting insight into the state of online shopping platforms, and the old adage remains true: if it seems too good to be true, it probably is. Plus, hacking GPUs can often have great results. Video after the break.

This Week In Security: Nvidia, Ransomware Retirement, And A TOCTOU Bug In Docker

Nvidia’s GeForce Experience (GFE) is the companion application for the Nvidia drivers, keeping said drivers up to date as well as adding features around live streaming and media capture. The application runs as two parts, a GUI and a system service, which communicate over an HTTP API. [David Yesland] from Rhino Security Labs decided to look into this API, searching for interesting, undocumented behavior, and shared the results on Sunday the 2nd.

The first interesting finding was that the service was written in Javascript and run using Node.js. Javascript is a scripting language, not a compiled one, so the source code of the service was open for studying. This led to the revelation that API requests would be accepted from any origin, so long as the request included the proper security token. The application includes an update mechanism, which allows an authorized API call to execute an arbitrary system command. So long as the authentication token isn’t leaked to an attacker, this still isn’t a problem, right?
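To see why the missing origin check matters, consider how you might probe a local service for this class of bug. The sketch below is generic, not GFE's real API: the port, endpoint, and token header are placeholders, and the actual details are in Rhino Security Labs' write-up. It simply sends the same authenticated request with and without a foreign Origin header to see whether the service cares where the request came from:

```python
import requests

# Placeholder values: substitute the service's real port, endpoint, and token
# header when testing a service you are authorized to poke at.
BASE_URL = "http://localhost:3000"
ENDPOINT = "/api/v1/example"
TOKEN_HEADER = "X-Local-Token"
TOKEN = "secret-token-read-from-disk"

def probe(origin=None):
    """Send an authenticated request, optionally spoofing the Origin header."""
    headers = {TOKEN_HEADER: TOKEN}
    if origin:
        headers["Origin"] = origin
    resp = requests.get(BASE_URL + ENDPOINT, headers=headers, timeout=5)
    return resp.status_code

if __name__ == "__main__":
    print("same-origin style request:", probe())
    print("foreign-origin request:   ", probe("http://evil.example"))
    # If both return 200, the service trusts the token alone, and any page
    # that can obtain the token can drive the API.
```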

Nvidia Jetson Robots Get A Head Start With Isaac Software Tools

We live in an exciting time of machine intelligence. Over the past few months, several products have been launched offering neural network processors at a price within hobbyist reach. But as exciting as the hardware might be, it still needs software to be useful. Nvidia was not content to rest on its impressive Jetson hardware and has created a software framework to accelerate building robots around it. Anyone willing to create an Nvidia developer account may now play with the Isaac Robot Engine framework.

Isaac initially launched about a year ago as part of a bundle with Jetson Xavier hardware, but the $1,299 developer kit price tag pushed it out of reach for many of us. Now we can buy a Jetson Nano for about a hundred bucks. For those familiar with Robot Operating System (ROS), Isaac will look very familiar: both aim to make robotic software as easy as connecting common modules together. Many of these modules, called GEMS in Isaac, are tailored to the strengths of Nvidia Jetson hardware. In addition to the modules and the plumbing that lets them work together, Isaac also includes a simulator for testing robot code in a virtual world, similar to Gazebo for ROS.
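For anyone who hasn't used ROS, "connecting common modules together" means small programs exchanging messages over named channels. Isaac's GEMS have their own API, but the flavor is similar; purely as a point of reference, here's what the minimal version of that pattern looks like in ROS using rospy (the node and topic names are arbitrary):

```python
#!/usr/bin/env python
# A minimal ROS node for comparison: one module publishing on a named topic.
# This is plain rospy, not Isaac; it just illustrates the "connect modules by
# wiring up messages" idea the two frameworks share.
import rospy
from std_msgs.msg import String

def talker():
    pub = rospy.Publisher("chatter", String, queue_size=10)
    rospy.init_node("demo_module")
    rate = rospy.Rate(1)  # publish once per second
    while not rospy.is_shutdown():
        pub.publish(String(data="hello from one module"))
        rate.sleep()

if __name__ == "__main__":
    try:
        talker()
    except rospy.ROSInterruptException:
        pass
```

Any other node subscribed to the same topic receives those messages; building a robot is largely a matter of composing such modules, whichever framework provides them.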

While Isaac can run on any robot with an Nvidia Jetson brain, there are two reference robot designs. Carter is the more expensive and powerful option: a commercially built machine rolling on Segway motors, with LIDAR environmental sensors and a Jetson Xavier. More interesting to us is the Kaya, a 3D-printed DIY robot rolling on Dynamixel serial bus servos. Kaya senses the environment with an Intel RealSense D435 depth camera and has a Jetson Nano for a brain. Taken together, the hardware and software offerings are a capable and functional package for exploring intelligent autonomous robots.
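Kaya's depth camera is off-the-shelf hardware you can experiment with even without Isaac. Intel's pyrealsense2 library will hand you depth frames from a D435 in a handful of lines; the snippet below is a rough sketch of that, not Isaac's own camera module:

```python
import pyrealsense2 as rs

# Open the D435 and ask for a 640x480 depth stream at 30 frames per second.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    # Distance (in meters) to whatever is at the center of the image.
    print(f"center distance: {depth_frame.get_distance(320, 240):.2f} m")
finally:
    pipeline.stop()
```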

It is somewhat disappointing that Nvidia decided to create its own proprietary software framework, reinventing many wheels instead of contributing to ROS. While there are some very appealing features, like WebSight (a browser-based inspection and debugging tool), at first glance Isaac doesn't seem fundamentally different from ROS. The open source community has already started creating ROS nodes for Jetson hardware, but people who work exclusively in the Nvidia ecosystem or face a time-to-market deadline would appreciate having the option of a pre-packaged solution like Isaac.

Nvidia Teaching Robots To Master IKEA Kitchens

The current wave of excitement around machine learning kicked off when graphics processors were repurposed to make training deep neural networks practical. Nvidia found itself the engine of a new revolution and seized the opportunity to help push the frontiers of research. Its research lab in Seattle will focus on one such field: making robots smart enough to work alongside humans in an IKEA kitchen.

Today’s robots are mostly industrial machines that require workspaces designed for robots. They run day and night, performing repetitive tasks, usually inside cages to keep squishy humans out of harm’s way. Robots will need to be a lot smarter about their surroundings before we can safely dismantle those cages. While some industrial robots are making a start in this arena, they have a hard time justifying their price premium (see, for example, the financial difficulties of Rethink Robotics, maker of the Baxter and Sawyer robots).

So there’s a lot of room for improvement in this field, and this evolution will need a training environment offering tasks of varying difficulty for robots, anywhere from the rigorously structured environments where robots work well today to the dynamic, unstructured environments where robots are hopelessly lost. Lab lead Dr. Dieter Fox explained how a kitchen is ideal. A meticulously cleaned and organized kitchen is very similar to an industrial setting, and from there we can gradually make the kitchen more challenging for a robot. For example, today’s robots can easily pick up a can with its rigid, regular shape, but what about a half-full bag of flour? And from there, how about picking up a piece of fresh fruit without bruising it? These challenges are shared by many other tasks outside of a kitchen.

This isn’t about building a must-have home cooking robot; it’s about working through the range of challenges shared with common kitchen tasks. The lab has a lot of neat hardware, but its success will be measured by the software, and like all research, published results should be reproducible by other labs. You don’t have a high-end robotics lab in your house, but you do have a kitchen. That’s why it’s not just any kitchen, but an IKEA kitchen: they are standardized, affordable, and available around the world, so other robot researchers can benchmark against the same setup.

Most of us can experiment in a kitchen, IKEA or not. We have access to all the other tools we need: affordable AI hardware from Google, from Beaglebone, and from Nvidia. And we certainly have no shortage of robot arms and manipulators on these pages, ranging from a small laser-cut MeArm to our 2018 Hackaday Prize winner Dexter.

AI At The Edge Hack Chat

Join us Wednesday at noon Pacific time for the AI at the Edge Hack Chat with John Welsh from NVIDIA!

Machine learning was once the business of big iron like IBM’s Watson or the nearly limitless computing power of the cloud. But the power in AI is moving away from data centers to the edge, where IoT devices are doing things once unheard of. Embedded systems capable of running modern AI workloads are now cheap enough for almost any hacker to afford, opening the door to applications and capabilities that were once only science fiction dreams.

John Welsh is a Developer Technology Engineer with NVIDIA, a leading company in the Edge computing space. He’ll be dropping by the Hack Chat to discuss NVIDIA’s Edge offerings, like the Jetson Nano we recently reviewed. Join us as we discuss NVIDIA’s complete Jetson embedded AI product lineup, getting started with Edge AI, and where Edge AI is headed.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, May 1 at noon Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.