A PCB with several points highlighted by a projection system

Augmented Reality Workbench Helps You To Debug Your Boards

No matter how advanced your design skills, the chances are you’ll need to spend some time chasing bugs in your boards after they come back from the assembly house. Testing and debugging a PCB typically involves a lot of cross-checking between the board, the layout and the schematic, which quickly becomes tiresome even for mildly complex designs. To make this task a bit easier, [Ishan Chatterjee] and colleagues at the University of Washington have designed the Augmented Reality Debugging Workbench, or ARDW for short.

The ARDW is a setup consisting of a lab workbench with an antistatic mat, a selection of measurement instruments and a PC. You can simply place your board on the bench, open the schematic and layout in KiCAD and start measuring and debugging your design as you normally would, but the real magic happens when you select a new icon in KiCAD that exports the schematic and layout to the ARDW system. From that moment, you can select components in your schematic and have them highlighted not only on the layout, but on the physical board in front of you as well. This is perhaps best demonstrated visually, as the team members do in the video embedded below.

The real-life highlighting of components is achieved thanks to a set of cameras that track the motion of everything on the desk as well as a video projector that overlays information on top of the PCB. All of this enables a variety of useful debugging features: for example, there’s an option to highlight pin one on all components, enabling a simple visual check of each component’s orientation. You can select all Do Not Populate (DNP) instances and immediately see if all highlighted pads are empty. If you’re not sure which component you’re looking at, just point at it with your multimeter probe and it’s highlighted on the schematic and layout. You can even place your probes on a net and automatically log the voltage for future reference, thanks to a digital link between the multimeter and the ARDW software.
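The article doesn't spell out how the meter talks to the software, but logging a reading from a bench multimeter usually comes down to a SCPI query over USB or LAN. Here's a minimal sketch of that idea in Python, assuming a VISA-compatible meter and the pyvisa library; the resource address and net label are placeholders, not values from the project:

```python
# Minimal sketch: poll a SCPI-capable bench multimeter and log readings
# against a net name, roughly how a "measure and log" step could work.
# Assumes a VISA-compatible meter and `pip install pyvisa`; the resource
# string and net label below are placeholders.
import csv
import time

import pyvisa

rm = pyvisa.ResourceManager()
dmm = rm.open_resource("USB0::0x2A8D::0x0101::MY00000000::INSTR")  # placeholder address

with open("net_voltages.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for _ in range(10):
        voltage = float(dmm.query("MEAS:VOLT:DC?"))  # standard SCPI DC voltage query
        writer.writerow([time.time(), "NET_3V3", voltage])  # hypothetical net label
        time.sleep(0.5)
```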

In addition to designing and building the ARDW, the team also performed a usability study using a group of human test subjects. They especially liked the ability to quickly locate components on crowded boards, but found the on-line measurement system a bit cumbersome due to its limited positional accuracy. Future work will therefore focus on improving the resolution of the projected image and generally making the system more compact and robust. All software is freely available on the project’s GitHub page, and while the current system looks a little complex for hobbyist use, we can already imagine it being a useful tool in production environments.

It’s not even the first time augmented reality has been used for PCB debugging: we saw a somewhat similar system at the 2019 Hackaday Superconference. AR can also come in handy during the design and prototyping phase, as demonstrated by this AR breadboard.

Continue reading “Augmented Reality Workbench Helps You To Debug Your Boards”

ESP32-Cam Makes A Dandy Motion Detector

Halloween is right around the corner and just about every Halloween project needs some kind of motion sensor. Historically, we’ve used IR and ultrasonic sensors but [Makers Mashup] decided to use an ESP32-Cam as a motion sensor in his latest animatronic creation. You can see a video of the device and how it works below.

The project is a skull that follows you around with a few degrees of motion on a stepper motor. There’s a 3D-printed enclosure to make the hardware assembly easy. The base software was borrowed from [Eloquent Arduino].
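The borrowed firmware is Arduino C++, but the core trick behind camera-based motion sensing is plain frame differencing: compare each new frame to the previous one and call it motion when enough pixels change. A rough Python/OpenCV equivalent, just to illustrate the technique (the thresholds are arbitrary, not taken from the firmware):

```python
# Rough illustration of frame differencing for motion detection; the actual
# project runs Arduino C++ on the ESP32-Cam, and these thresholds are guesses.
import cv2

cap = cv2.VideoCapture(0)
_, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)                 # per-pixel change since last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    changed = cv2.countNonZero(mask) / mask.size   # fraction of pixels that moved
    if changed > 0.05:                             # "motion" if >5% of the image changed
        print("motion detected")
    prev = gray
```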

Continue reading “ESP32-Cam Makes A Dandy Motion Detector”

Motion Tracking Face Really Does Follow You Around The Room

Many of us have had the experience of viewing an artwork in a gallery, in which the eyes appear to follow one around the room. In our high-technology world, this no longer needs to be achieved with artistic skill. You can just build something that actually moves instead.

Chartreuse is the creation of [alynton], and has a personality all its own. A face was created out of laser cut wood and assembled layer by layer. It was then given glowing LED eyes and mounted on a rotating plate. Combined with an Arduino and an ultrasonic sensor, it’s capable of tracking targets moving within its field of view and rotating to follow them. Chartreuse’s expression changes as well, shifting from happy to forlorn depending on the situation.
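The write-up doesn’t detail the tracking logic, but a common way to follow someone with a single ultrasonic sensor is to sweep back and forth and turn toward the angle with the shortest reading. A hedged sketch of that approach, written for a Raspberry Pi with gpiozero rather than the Arduino actually used, with invented pins and timings:

```python
# Sketch of "sweep and face the nearest object" with one ultrasonic sensor.
# Written for gpiozero on a Raspberry Pi (the build uses an Arduino);
# pins, step count, and delays are invented for illustration.
from time import sleep

from gpiozero import DistanceSensor, Servo

sensor = DistanceSensor(echo=24, trigger=23, max_distance=2.0)  # placeholder pins
servo = Servo(18)                                               # placeholder pin

def sweep(steps=9):
    """Sweep across the servo's range and return the position with the closest reading."""
    readings = []
    for i in range(steps):
        pos = -1 + 2 * i / (steps - 1)   # spread positions from -1 to +1
        servo.value = pos
        sleep(0.2)                        # let the servo settle before measuring
        readings.append((sensor.distance, pos))
    return min(readings)[1]               # position of the shortest distance

while True:
    servo.value = sweep()                 # face whatever is closest
    sleep(1.0)
```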

It’s a great example of the artistic results that can be achieved by layering lasercut materials, as well as how art can be brought to life with simple maker staples like servos and microcontrollers. Motion tracking has plenty of useful applications, too – like aiming heat directly at cold humans. Video after the break.

Continue reading “Motion Tracking Face Really Does Follow You Around The Room”

Behold The Giant Eye’s Orrery-Like Iris And Pupil Mechanism

This is an older project, but the electromechanical solution used to create this giant, staring eyeball is worth a peek. [Richard] and [Anton] needed a big, unblinking eyeball that could look in any direction and their solution even provides an adjustable pupil and iris size. Making the pupil dilate or contract on demand is a really nice feature, as well.

The huge fabric sphere is lit from the inside with a light bulb at the center, and the iris and pupil mechanism orbit the bulb like parts of an orrery. By keeping the bulb in the center and orbiting the blue gel (for the iris) and the opaque disk (for the pupil) around the bulb, the eye can appear to gaze in different directions. By adjusting the distance of the disks from the bulb, the size of the iris and pupil can be changed.
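The size change falls out of simple shadow geometry: with a point source at the center, a disk of radius r held at distance d from the bulb casts a shadow of roughly r·R/d on a sphere of radius R (similar triangles). A quick sanity check in Python, with made-up dimensions rather than anything measured from the build:

```python
# Similar-triangles sanity check: moving the disk closer to the bulb makes
# its shadow (the "pupil") on the fabric sphere bigger. All dimensions here
# are hypothetical.
def shadow_radius(disk_radius, disk_distance, sphere_radius):
    """Approximate radius of the shadow a small disk casts on the sphere."""
    return disk_radius * sphere_radius / disk_distance

SPHERE_R = 1.0   # meters, hypothetical
for d in (0.2, 0.4, 0.8):
    print(f"disk at {d:.1f} m -> pupil radius ~{shadow_radius(0.05, d, SPHERE_R):.3f} m")
```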

A camera system picks out objects (like people) and directs the eye to gaze at them. The system is clever, but the implementation is not perfect. As you can see in the short video embedded below, detection of a person walking by lags badly. Also, there are oscillations present in the motion of the iris and pupil. Still, as a mechanism it’s a beauty.

Continue reading “Behold The Giant Eye’s Orrery-Like Iris And Pupil Mechanism”

Cheap And Easy Motion Tracking

[Koppany Horvarth] set out to create a dirt-cheap optical tracking rig for VR that uses only two cameras and a certain amount of math to do its thing. He knew it could be done, and that it wouldn’t cost a lot of money, but it still required a lot of work and a slightly absurd amount of math.

While playing around with a webcam that he’d set up to run an object-tracking Python script, he discovered that his setup tended to display a translucent object with an LED inside of it as pure, washed-out white. This gave [Koppany] the idea that he could use such a light as part of his object tracking project. He 3D-printed 50mm hollow spheres out of transparent PLA, illuminated by an LED and powered by a 5V supply hacked from an old USB cable. After dealing with some lens flares, he sanded down the PLA a little to diffuse the light, and it worked like a charm.
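Tracking a glowing sphere like this usually boils down to thresholding each frame for the brightest blob and taking its centroid, which OpenCV makes short work of. A hedged sketch of that step, which may well differ from his actual script (the threshold value is a guess):

```python
# Sketch of the bright-blob tracking step: threshold the frame for the
# glowing sphere and return its centroid. Illustrative only; the project's
# own script may differ, and the threshold here is a guess.
import cv2

def find_marker(frame, thresh=240):
    """Return the (x, y) centroid of the brightest blob, or None if nothing is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:          # nothing bright enough in view
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(find_marker(frame))
```

Do this in two calibrated cameras at once and the two centroids can be triangulated into a 3D position, which is where the “slightly absurd amount of math” comes in.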

To learn more, check out his GitHub code repository. You can also take inspiration from some of the other motion tracking posts we’ve published in the past, like motion tracking on the cheap with a PIC and this OpenCV Airsoft turret.

Ping Pong Ball Improves The Google Daydream Controller

[Matteo] has just released a new installment of his Google Daydream VR controller hack, which we first covered last year (when he got it working with iOS). This time around he’s managed to forge a half Daydream, half PlayStation Move controller hybrid.

The original controller managed a mere 3 DOF (degrees of freedom) using the internal accelerometer; although this conveyed rotational motion around the three axes, translational information was completely lacking. [Matteo] resolves this by forming a simple positional marker out of a white LED enclosed in a standard ping pong ball; he tracks this setup using an iSight camera.
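With a single camera and a marker of known size, the missing depth can be estimated from how big the ball appears in the frame: distance ≈ focal length × real diameter / pixel diameter (the same trick the PlayStation Move uses). A hedged sketch of that relation, with a placeholder focal length rather than anything calibrated from [Matteo]’s setup:

```python
# Pinhole-camera estimate of marker distance from its apparent size:
# Z ≈ f * D_real / d_pixels. The focal length below is hypothetical;
# a real setup would calibrate it first.
FOCAL_LENGTH_PX = 1000.0   # hypothetical focal length in pixels
BALL_DIAMETER_M = 0.040    # a standard ping pong ball is 40 mm across

def distance_to_ball(pixel_diameter):
    """Estimate camera-to-ball distance in meters from the ball's diameter in pixels."""
    return FOCAL_LENGTH_PX * BALL_DIAMETER_M / pixel_diameter

print(distance_to_ball(80))   # ball appearing 80 px wide -> roughly 0.5 m away
```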

To gel everything together, he adds motion tracking to his already extensively developed software stack, which enables him to unshackle the Daydream controller from Android. He deciphers the Bluetooth packets and streams the sensor information straight to a web browser over a WebSocket connection.


The results are quite impressive and the tracking is smooth. Not only does this bring him closer to his final goal of a platform-independent VR motion controller, he also takes some inspiration from Sony, extends Google’s hardware, and even manages to use Apple’s webcam along the way. How’s that for carving passages between the walled gardens of consumer electronics?

Continue reading “Ping Pong Ball Improves The Google Daydream Controller”

The Power Glove Ultra Is The Power Glove We Finally Deserve

How do you make the most awesome gaming peripheral ever made even more bad? Give it a 21st-century upgrade! [Alessio Cosenza] calls this mod the Power Glove Ultra, and it works exactly as we imagined it should have all those years ago.

The most noticeable change is the 3D-printed attachment that hosts the Bluetooth module, a combination USB charger and voltage booster, and a Metro Mini (ATmega328) board. On top of a 20-hour battery life, a 9-axis accelerometer, gyroscope, and compass give the Power Glove Ultra full 360-degree motion tracking, while a custom board and five flex sensor strips with 256 possible positions each upgrade the finger sensors for far more nuanced input. [Cosenza] has deliberately left the boards and wires exposed for that cyberpunk, retro-future look that is so, so bad.
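The orientation side of that upgrade is classic sensor fusion: integrate the gyro for fast response and nudge the result back toward the accelerometer’s gravity reference, with the compass handling yaw. A minimal complementary-filter sketch of the idea in Python, illustrative only and not the glove’s actual ATmega328 firmware:

```python
# Minimal complementary-filter sketch: the gyro tracks fast motion, the
# accelerometer's gravity vector corrects long-term drift, and a
# magnetometer would do the same job for yaw. Illustrative only.
import math

def fuse(pitch, roll, gyro_rates, accel, dt, alpha=0.98):
    """One filter update; angles in radians, gyro rates in rad/s, accel in g."""
    gx, gy, _ = gyro_rates
    ax, ay, az = accel
    # short-term estimate: integrate the gyro rates
    pitch_gyro = pitch + gy * dt
    roll_gyro = roll + gx * dt
    # long-term reference: angle of the measured gravity vector
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll_acc = math.atan2(ay, az)
    # blend: trust the gyro over short intervals, the accelerometer over long ones
    return (alpha * pitch_gyro + (1 - alpha) * pitch_acc,
            alpha * roll_gyro + (1 - alpha) * roll_acc)
```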

Continue reading “The Power Glove Ultra Is The Power Glove We Finally Deserve”