Robot Maps Rooms With Help From iPhone

The Unity engine has been around since Apple started using Intel chips, and has made quite a splash in the gaming world. Unity allows developers to create 2D and 3D games, but there are some other interesting applications of this gaming engine as well. For example, [matthewhallberg] used it to build a robot that can map rooms in 3D.

The impetus for this project was a robotics company that used a series of robots around their business. The robots navigate using computer vision, but couldn’t map the rooms from scratch. They hired [matthewhallberg] to tackle this problem, and this robot is a preliminary result. Using the Unity engine and an iPhone, the robot can operate in one of three modes. The first is a user-controlled mode, the second is object following, and the third is 3D mapping.

The robot seems fairly easy to construct and carries only an iPhone, a NodeMCU, some motors, and a battery. Most of the computational work is done remotely, with the robot simply receiving its movement commands from another computer. There’s a lot going on here software-wise, with a number of toolkits and software packages to install and get talking to one another, but the video below does a good job of showing what you’ll need and how it all works together. If that’s all too much, there are other robots with a form of computer vision that can get you started in the world of machine vision and mapping.
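The video doesn’t pin down the exact wire protocol between the computer and the robot, but the architecture is easy to sketch: the desktop does the heavy lifting and streams simple motor commands to the NodeMCU over Wi-Fi. Below is a minimal Python sketch of the control side, assuming a made-up plain-text UDP protocol and a hypothetical robot address; the actual project may do this differently.

```python
import socket

# Hypothetical address of the NodeMCU on the local network.
ROBOT_ADDR = ("192.168.1.50", 4210)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_command(left: int, right: int) -> None:
    """Send differential-drive motor speeds (-255..255) as a text datagram."""
    sock.sendto(f"M {left} {right}\n".encode("ascii"), ROBOT_ADDR)

send_command(200, 200)   # drive forward
send_command(150, -150)  # spin in place
send_command(0, 0)       # stop
```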

Continue reading “Robot Maps Rooms With Help From iPhone”

Leap Motion Announces Open Source Augmented Reality Headset

Leap Motion just dropped what may be the biggest tease in Augmented and Virtual Reality since Google Cardboard. The North Star is an augmented reality head-mounted display that boasts some impressive specs:

  • Dual 1600×1440 LCDs
  • 120Hz refresh rate
  • 100 degree FOV
  • Cost under $100 (in volume)
  • Open Source Hardware
  • Built-in Leap Motion camera for precise hand tracking

Yes, you read that last line correctly. The North Star will be open source hardware. Leap Motion is planning to drop all the hardware information next week.

Now that we’ve got you excited, let’s mention what the North Star is not — it’s not a consumer device. Leap Motion’s idea here was to create a platform for developing Augmented Reality experiences — the user interface and interaction aspects. To that end, they built the best head-mounted display they could on a budget. The company started with standard 5.5″ cell phone displays, which made for an incredibly high-resolution but low-framerate (50 Hz) device. It was also large and completely impractical.

The current iteration of the North Star uses much smaller displays, which results in a higher frame rate and a better overall experience. The secret sauce seems to be Leap’s use of ellipsoidal mirrors to achieve a large FOV while maintaining focus.

We’re excited, but also a bit wary of the $100 price point — Leap Motion is quick to note that the price is “in volume”. They also mention using diamond-tipped tooling on a vibration-isolated lathe to grind the mirrors down. If Leap hasn’t invested in some injection molding, those parts are going to make the whole thing expensive. Keep your eyes on the blog here for more information as soon as we have it!

HandHolo: A Homebrew ARG

Taking a dive into VR or augmented reality — once dreamed-of science fiction — is not only possible for the average consumer, but crafting those experiences is as well! Hackaday.io user [kvtoet]’s HandHolo is a homebrew method to cut your teeth on peeking into a virtual world.

This project requires a smartphone running Android Oreo as its backbone, a Bluetooth mouse, a piece of cardboard and a small mirror or highly reflective surface. The phone is slotted into the cardboard housing — prototype with what you have! — above the mouse, and the mirror angled opposite the screen reflects the image back to the user as they explore the virtual scene.

Within Unity, [kvtoet] used a few scripts that access phone functions — namely the gyroscope, which is synchronized to the mouse’s movements. That movement is translated into exploration of the virtual space built in Unity and projected onto the portal-like mirror. Check it out!
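The scripts themselves live in Unity, but the underlying idea — gyroscope for where the camera looks, mouse for where it sits — is simple enough to sketch outside the engine. Here’s a conceptual Python sketch (not [kvtoet]’s actual code) of fusing the two inputs into a camera pose; the scale factor and axis conventions are assumptions.

```python
import math

class CameraPose:
    """Toy camera pose: yaw from a gyroscope, position from mouse deltas."""

    def __init__(self):
        self.x = 0.0
        self.z = 0.0
        self.yaw = 0.0  # radians, supplied by the phone's gyroscope

    def update(self, gyro_yaw: float, mouse_dx: float, mouse_dy: float,
               scale: float = 0.001) -> None:
        self.yaw = gyro_yaw
        # Rotate the mouse delta by the current yaw so that pushing the
        # mouse "forward" always moves the camera along its view direction.
        self.x += scale * (mouse_dx * math.cos(self.yaw) - mouse_dy * math.sin(self.yaw))
        self.z += scale * (mouse_dx * math.sin(self.yaw) + mouse_dy * math.cos(self.yaw))

pose = CameraPose()
pose.update(gyro_yaw=math.pi / 2, mouse_dx=10, mouse_dy=0)
print(pose.x, pose.z)  # camera slid sideways relative to its new heading
```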

Continue reading “HandHolo: A Homebrew ARG”

Magic Leap Finally Announced; Remains Mysterious

Yesterday Magic Leap announced that it will ship developer edition hardware in 2018. The company is best known for raising a lot of money. That’s only partially a joke: the teased hardware has remained a mystery, never shown publicly, yet the company has managed to raise nearly $2 billion through four rounds of funding (three of them raising more than $500 million each).

The announcement launched Magic Leap One — subtitled the Creator Edition — with a mailing-list sign-up for “designers, developers and creatives”. The gist is that the first round of hardware will be offered for sale to people who will write applications and create uses for the Magic Leap One.

We’ve gathered some info about the hardware, but we’ll certainly begin the guessing game on the specifics below. The one mystery that has been solved is how this technology is delivered: as a pair of goggles that attach to a dedicated processing unit. How does it stack up to current offerings?

Continue reading “Magic Leap Finally Announced; Remains Mysterious”

Home Brew Augmented Reality

In July of 2016 a game was released that quickly spread to every corner of the planet. Pokemon Go was an Augmented Reality game that used a smartphone’s GPS location and camera to place virtual creatures in the player’s real location. The game was praised for its creativity and was one of the most popular and profitable apps of 2016. It has been downloaded over 500 million times since.

Most of its users were probably unaware that they were flirting with a new and upcoming technology called Augmented Reality. A few days ago, [floz] submitted a blog post to us from a student who is clearly very aware of what this technology is and what it can do. So aware, in fact, that they made their own Augmented Reality system with Python and OpenCV.

In the first part of a multi-part series, the student (we don’t know their name) walks you through the basic structure of making a virtual object appear on a real-world object through a camera. They get into some fairly dense math, so you might want to wait until you have a spare hour or two before digging into this one.
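The heart of this style of AR is recognizing a known planar reference in the camera frame and recovering the homography that maps one onto the other; everything virtual is then drawn relative to that mapping. As a rough illustration — not the student’s actual code, and with placeholder file names — OpenCV can do the feature matching and pose recovery in a handful of lines:

```python
import cv2
import numpy as np

# Placeholder inputs: a known planar reference image and a camera frame.
reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)

# ORB gives binary descriptors, so a brute-force Hamming matcher fits.
orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frm, des_frm = orb.detectAndCompute(frame, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)

# Estimate the homography from the best matches, rejecting outliers.
src = np.float32([kp_ref[m.queryIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
dst = np.float32([kp_frm[m.trainIdx].pt for m in matches[:50]]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Project the reference's corners into the frame; a renderer would anchor
# its virtual object to this quadrilateral.
h, w = reference.shape
corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
projected = cv2.perspectiveTransform(corners, H)
out = cv2.polylines(cv2.cvtColor(frame, cv2.COLOR_GRAY2BGR),
                    [np.int32(projected)], True, (0, 255, 0), 3)
cv2.imwrite("overlay.jpg", out)
```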

Thanks to [floz] for the tip!

Hackaday Prize Entry: Telepresence With The Black Mirror Project

The future is VR, or at least that’s what it was two years ago. Until then, there’s still plenty of time to experiment with virtual worlds, the Metaverse, and other high-concept sci-fi tropes from the 80s and 90s. Interactive telepresence is what the Black Mirror Project is all about. The plan is to build interactive software on top of the JanusVR platform for creating immersive VR experiences.

The Black Mirror project makes use of glTF for runtime 3D asset delivery to create environments ranging from simple telepresence to the mind-bending realities the team unabashedly compares to [Neal Stephenson]’s Metaverse.
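glTF is a JSON-plus-binary format built to be parsed quickly at load time, which is what makes it attractive for runtime asset delivery. As a quick aside (not part of the project itself), the pygltflib package shows how little code it takes to crack one open in Python; the file name here is a placeholder.

```python
from pygltflib import GLTF2  # pip install pygltflib

# Load a glTF asset and walk its scene graph.
gltf = GLTF2().load("model.gltf")
print(f"scenes: {len(gltf.scenes)}, nodes: {len(gltf.nodes)}, "
      f"meshes: {len(gltf.meshes)}")
for node in gltf.nodes:
    # translation/rotation may be None when a node sits at the origin.
    print(node.name, node.translation, node.rotation)
```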

For their hardware implementation, the team is looking at UDOO X86 single-board computers, with SSDs for data storage as well as a bevy of sensors — gesture, light, accelerometer, magnetometer — supplying the computer with data. There’s an Intel RealSense camera in the build, and the display is unlike any other VR setup we’ve seen before. It’s a tensor display with multiple projection planes and variable backlighting that has a greater depth of field and wider field of view than almost any other display.

Hackaday Prize Entry: SNAP Is Almost Geordi La Forge’s Visor

Echolocation projects typically rely on inexpensive distance sensors and the human brain to do most of the processing. The team creating SNAP: Augmented Echolocation is using far more computational power to translate robotic vision into a 3D soundscape.

The SNAP team starts with an Intel RealSense R200. The first part of the processing happens here, because the camera outputs a depth map, which takes the heavy lifting out of robotic vision. From there, an AAEON UP board, packaged with the RealSense, takes the depth map and associates sound with the objects in the field of view.

Binaural sound generation is a feat in itself and works on the principle that our brains process incoming sound from both ears to understand where a sound originates. Our eyes do the same thing. We are bilateral creatures, so using two ears or two eyes to understand our environment is already part of the human operating system.
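The project doesn’t publish its exact mapping from depth to audio, but the general recipe is easy to demonstrate: carve the depth map into vertical strips, then give each strip a tone whose volume rises as obstacles get closer and whose left/right balance follows the strip’s horizontal position, mimicking the interaural level difference our brains use. Here is a toy Python sketch along those lines; strip count, frequencies, and the loudness curve are all invented for illustration.

```python
import numpy as np

SAMPLE_RATE = 44100

def sonify(depth: np.ndarray, n_strips: int = 8, duration: float = 0.5) -> np.ndarray:
    """depth: 2-D array of distances in meters. Returns (samples, 2) stereo audio."""
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    left = np.zeros_like(t)
    right = np.zeros_like(t)
    for i, strip in enumerate(np.array_split(depth, n_strips, axis=1)):
        nearest = float(np.nanmin(strip))      # closest obstacle in this strip
        loudness = 1.0 / max(nearest, 0.3)     # closer -> louder
        pan = i / (n_strips - 1)               # 0 = far left, 1 = far right
        tone = loudness * np.sin(2 * np.pi * (300 + 50 * i) * t)
        left += (1.0 - pan) * tone             # interaural level difference
        right += pan * tone
    stereo = np.stack([left, right], axis=1)
    return stereo / np.abs(stereo).max()       # normalize to [-1, 1]

demo = sonify(np.full((120, 160), 2.0))        # a flat wall two meters away
print(demo.shape)
```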

In the video after the break, we see a demonstration where the wearer doesn’t need to move his head to realize what is happening in front of him. Instead of a single distance reading, where the wearer must systematically scan the area, the wearer simply has to be pointed the right way.

Another Assistive Technology entry used a traditional ultrasonic distance sensor instead of robotic vision. There is even a version for augmented humans with magnet implants, called Bottlenose, which we covered in Cyberpunk Yourself.

Continue reading “Hackaday Prize Entry: SNAP Is Almost Geordi La Forge’s Visor”