Reachy The Open Source Robot Says Bonjour

Humanoid robots always attract attention, but anyone who tries to build one quickly learns respect for a form factor we take for granted because we were born with it. Pollen Robotics wants to help move the field forward with Reachy: a robot platform available both as a product and as a wealth of information shared online.

This French team has released open source robots before. We’ve looked at their Poppy robot and see a strong family resemblance with Reachy. Poppy was a very ambitious design with both arms and legs, but it could only ever walk with assistance. In contrast, Reachy focuses on just the upper body. One of the most interesting innovations is found in Reachy’s neck: a cleverly designed 3-DOF mechanism they call Orbita. Combined with two moving antennae at the top of the head, Reachy can emote a wide range of expressions despite not having much of a face. The remainder of Reachy’s joints are articulated with Dynamixel serial bus servos, though we see an optional Orbita-based hand attachment in the demo video (embedded below).

Reachy’s €19,990 price tag may be affordable relative to industrial robots, but it’s pretty steep for the home hacker. No need to fret: those of us with smaller bank accounts can still join the fun, because Pollen Robotics has open sourced a lot of Reachy’s details. Digging into this information, we see Reachy has a Google Coral for accelerating TensorFlow and a Raspberry Pi 4 for general computation. Mechanical designs are released via web-based Onshape CAD. Reachy’s software suite on GitHub is primarily focused on Python, which allows us to experiment within a Jupyter notebook. Simulation can be done within the Unity 3D game engine, which can optionally be compiled to run in a browser, like the simulation playground. Academic robotics researchers are not excluded from the fun either: ROS1 integration is also available, though ROS2 support is still on the to-do list.

Reachy might not be as sophisticated as some humanoid designs we’ve seen, and without a lower body there’s no way for it to dance. But we are very appreciative of a company willing to share knowledge with the world. May it spark new ideas for the future.

[via Engadget]

Continue reading “Reachy The Open Source Robot Says Bonjour”

You’re The Only One Not Playing With Unity

It wasn’t too long ago that one could conjecture that most hackers are not avid video game players. We spend most of our free time taking things apart, tinkering with microcontrollers, and reading the latest [Jenny List] article on Hackaday.com. When we do think of video games, our neurons generally fire in the direction of emulating a console on a single-board computer, such as a Raspberry Pi or a BeagleBone. Or even emulating the actual console processor on an FPGA. Rarely do we venture off into 3D programs meant to make modern video games. If we can’t export an .STL with it, we’re not interested. It’s just not our bag.

Oculus Rift changed this. The VR headset was originally invented for 3D video games, but quickly became a darling to hackers the world over. Virtual Reality technology is far bigger than just video games, and brings opportunity to many fields such as real estate, construction, product visualization, education, social interaction… the list goes on and on.

The Oculus team got together with the folks over at Unity in the early days to make it easy for video game makers to create content for the Rift. Unity is a game engine designed with a shallow learning curve, and it is available for free for non-commercial use. The Oculus Rift can be integrated into a Unity environment by checking a setting and importing a small package, available on the Oculus site. This makes it easy for anyone interested in VR technology to get a Rift and start pumping out content.

Hackers have taken things a step further and have written scripts that allow Unity to communicate with an Arduino. VR is fun. But VR plus physical reality is just downright exciting! In this article, we’re going to walk you through setting up your Oculus Rift and the Unity game engine to communicate with the outside world via an Arduino.
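
To give a flavor of the Unity side of that conversation, here’s a minimal sketch of a MonoBehaviour that opens a serial port to an Arduino. It’s not the exact script from the walkthrough: the class name, port name (“COM3”), and baud rate are our own placeholder assumptions, and it presumes your project’s API Compatibility Level exposes System.IO.Ports (.NET 4.x rather than the trimmed subset).

```csharp
// A minimal sketch, not the article's exact script: a Unity MonoBehaviour
// talking to an Arduino over serial. Port name and baud rate are assumptions;
// on Windows you'd use something like "COM3", on Linux "/dev/ttyACM0".
using UnityEngine;
using System.IO.Ports;

public class ArduinoLink : MonoBehaviour
{
    SerialPort port = new SerialPort("COM3", 9600); // adjust for your setup

    void Start()
    {
        port.ReadTimeout = 50; // give up quickly so rendering isn't blocked for long
        port.Open();
    }

    void Update()
    {
        try
        {
            // e.g. the Arduino prints a sensor reading every loop()
            string line = port.ReadLine();
            Debug.Log("Arduino says: " + line);
        }
        catch (System.TimeoutException) { /* no data this frame, carry on */ }
    }

    // Called from game logic to fire something in the physical world
    public void SendToArduino(string msg)
    {
        if (port.IsOpen) port.WriteLine(msg);
    }

    void OnApplicationQuit()
    {
        if (port.IsOpen) port.Close();
    }
}
```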

Continue reading “You’re The Only One Not Playing With Unity”

The Ninja Run: A VR Movement Experiment

VR is an area seeing plenty of DIY experimentation, and [FultonX] has an interesting hack of sorts: he’s discovered something that meshes well with how we perceive motion and movement. It’s an experimental movement system for VR he calls the Ninja Run, and it somewhat resembles skiing.

Even room-scale VR suffers from the fact that the player is more or less stuck in one place. Moving the player from one spot to another isn’t currently a gracefully solved problem, and many existing methods are not immersive or have other drawbacks. One solution in use is a sort of teleportation; another “slides” the player to another area on command (like gliding across ice). [FultonX] found these existing solutions lacking, and prototyped the Ninja Run concept, which he found was surprisingly intuitive and effective. Video demo embedded below.

Continue reading “The Ninja Run: A VR Movement Experiment”

Hexapod Tank From Ghost In The Shell Brought To Life

Every now and then someone gets seriously inspired, and that urge just doesn’t go away until something gets created. For [Paulius Liekis], it led to creating a roughly 1:20 scale version of the T08A2 Hexapod “Spider” Tank from the movie Ghost in the Shell. As he puts it, “[T]his was something that I wanted to build for a long time and I just had to get it out of my system.” It uses two Raspberry Pi computers and 28 servo motors, and required over 250 hours of 3D printing for all the meticulously modeled pieces – and even more than that for polishing, filing, painting, and other finishing work on the pieces after they were printed. The paint job is spectacular, with great-looking wear and tear. It’s even better seeing it in motion — see the video embedded below.

Continue reading “Hexapod Tank From Ghost In The Shell Brought To Life”

Real-time Driving Of RGB LED Cube Using Unity3D

RGB LED cubes are great, but building the cube is only half the battle – they also need to be driven. The larger the cube, the bigger the canvas you have to exercise your performance art, and the more intense the data visualization headache. This project solves the problem by using Unity to drive an RGB LED cube in real-time.

We’re not just talking about driving the LEDs themselves at a low level, but about deciding what you want to display in each of those 512 pixels.

In the video, you can see [TylerTimoJ]’s demo of an 8x8x8 cube being driven in real-time using the Unity engine. A variety of methods are demonstrated, from turning individual LEDs on and off, to coloring swaths of the cube as though with a paintbrush, and even having the cube display source image data in real time.
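
We don’t have [TylerTimoJ]’s actual pipeline in front of us, but a minimal sketch of the general idea might look like this: fill an 8x8x8 voxel buffer in Unity each frame and stream the 512 RGB triplets out over serial to whatever hardware is driving the LEDs. The class name, port, baud rate, frame format, and animation here are all our own assumptions for illustration.

```csharp
// A sketch of the general idea, not [TylerTimoJ]'s actual code: build an
// 8x8x8 voxel frame (512 voxels * RGB = 1536 bytes) in Unity each Update()
// and push it out the serial port for the cube's driver hardware to latch.
using UnityEngine;
using System.IO.Ports;

public class CubeStreamer : MonoBehaviour
{
    const int N = 8; // cube edge length
    SerialPort port = new SerialPort("/dev/ttyUSB0", 115200); // assumption
    byte[] frame = new byte[N * N * N * 3];

    void Start()
    {
        port.Open();
    }

    void Update()
    {
        int i = 0;
        for (int z = 0; z < N; z++)
            for (int y = 0; y < N; y++)
                for (int x = 0; x < N; x++)
                {
                    // Whatever your "performance art" is lives here; this
                    // just sweeps a diagonal rainbow wave through the cube.
                    float hue = Mathf.Repeat((x + y + z) / (3f * N) + Time.time * 0.1f, 1f);
                    Color c = Color.HSVToRGB(hue, 1f, 1f);
                    frame[i++] = (byte)(c.r * 255);
                    frame[i++] = (byte)(c.g * 255);
                    frame[i++] = (byte)(c.b * 255);
                }
        port.Write(frame, 0, frame.Length); // one complete frame per render tick
    }
}
```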

Continue reading “Real-time Driving Of RGB LED Cube Using Unity3D”

Amazing Oscilloscope Graphics

From what we can understand, [ompuco] has built a 2D audio output on top of the Unity game engine, enabling him to output X and Y values from his stereo soundcard straight to an oscilloscope in XY mode. His code simply scans through all the vertices in the scene and outputs the right voltages into the left and right audio streams. He’s using this to create some pretty incredible animations. Check out the video “additives” below for an example. (See if you can figure out what’s being “added”.)
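
We’re guessing at the details, but the core trick probably resembles this sketch: Unity’s OnAudioFilterRead callback lets a script fill the stereo buffer directly, so walking a list of scene-derived points and writing X to the left channel and Y to the right traces the shape on a scope in XY mode. The ScopeRenderer name and the point-feeding scheme are our own assumptions.

```csharp
// A guess at the technique, not [ompuco]'s code: fill the stereo audio
// buffer by walking a list of 2D points, so left = scope X and right =
// scope Y. Attach next to an AudioSource; Unity calls OnAudioFilterRead
// on the audio thread for every buffer it mixes.
using UnityEngine;
using System.Collections.Generic;

public class ScopeRenderer : MonoBehaviour
{
    public List<Vector2> points = new List<Vector2>(); // filled elsewhere from scene vertices
    int cursor;

    void OnAudioFilterRead(float[] data, int channels)
    {
        if (points.Count == 0 || channels < 2) return;
        for (int i = 0; i < data.Length; i += channels)
        {
            Vector2 p = points[cursor];
            data[i]     = Mathf.Clamp(p.x, -1f, 1f); // left channel  -> scope X
            data[i + 1] = Mathf.Clamp(p.y, -1f, 1f); // right channel -> scope Y
            cursor = (cursor + 1) % points.Count;    // loop the path fast enough to look solid
        }
    }
}
```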

Continue reading “Amazing Oscilloscope Graphics”

Here Be Dragons, And VR…and Sheep.


This may qualify less as a hack and more as a clever combination of video game input devices, but we thought it was well worth showing off. [Jack] and his team built Dragon Eyes from scratch at the 2013 Dundee Dare Jam. If you’re unfamiliar with “Game Jams” and have any aspirations of working in the video game industry, we highly recommend that you find one and participate. With only 48 hours to design, code, build assets, and test, many teams struggle to finish their entry. Dragon Eyes, however, uses the indie-favorite game engine Unity3D to smoothly coordinate its input devices, allowing players to experience dragon flight. The Kinect reads the player’s arm positions (including flapping) to direct the wings for travel, while the Oculus Rift performs its usual job as immersive VR headgear.

Combining a Kinect and a Rift isn’t particularly uncommon, but the function of the microphone is. By blowing into a headset microphone, players activate the dragon’s fire-breathing. How’s that for interactivity? You can see [Jack] roasting some sheep in a demonstration video below. If you have a Kinect and Rift lying around and want some first-person dragon action, [Jack] has kindly provided a download of the build in the project link above.
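
We haven’t seen [Jack]’s source for the breath input, but one plausible way to do it in Unity is to sample the headset microphone and treat a sustained loudness spike as a blow, as in this hedged sketch (the class name, window size, and threshold are all our assumptions, to be tuned per headset):

```csharp
// A hedged sketch of one way to detect "blowing": record the default mic
// into a looping AudioClip with Unity's Microphone API, then compute the
// RMS level of the most recent window and compare it to a threshold.
using UnityEngine;

public class BreathDetector : MonoBehaviour
{
    const int SampleRate = 44100;
    const int Window = 1024;       // ~23 ms of audio per check
    public float threshold = 0.2f; // RMS level that counts as a blow (tune this)
    AudioClip micClip;
    float[] buf = new float[Window];

    void Start()
    {
        // null = default microphone; record into a looping 1-second clip
        micClip = Microphone.Start(null, true, 1, SampleRate);
    }

    void Update()
    {
        // Read the newest full window behind the mic's write position
        // (wraparound at the clip boundary is ignored for this sketch).
        int pos = Microphone.GetPosition(null) - Window;
        if (pos < 0) return; // not enough audio captured yet
        micClip.GetData(buf, pos);

        float sum = 0f;
        foreach (float s in buf) sum += s * s;
        float rms = Mathf.Sqrt(sum / Window);

        if (rms > threshold)
            Debug.Log("Whoosh -- breathe fire!"); // hook the dragon's flame here
    }
}
```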

We’re looking forward to more implementations of the Rift; we haven’t seen many just yet. You can, however, check out a Rift used as an aerial camera on a drone.

Continue reading “Here Be Dragons, And VR…and Sheep.”