Enhance Your Enclosures With A Shadow Line

Some design techniques and concepts from the injection molding world apply very nicely to 3D printing, despite them being fundamentally different processes. [Teaching Tech] demonstrates designing shadow lines into 3D printed parts whose surfaces are intended to mate up to one another.

This is a feature mainly seen in enclosures, and you’ve definitely seen it in all kinds of off-the-shelf products. Essentially, one half of the part has a slight “underbite” of a rim, and the other half has a slight “overbite”, with a bit of a standoff between the two. When placed together, the combination helps parts self-locate to one another, as well as providing a consistent appearance around the mating surfaces.

Why is this necessary? When a plastic part is made — such as an enclosure in two halves — the resulting surfaces are never truly flat. Without post-processing, the two not-quite-flat surfaces result in an inconsistent line with a varying gap between them.

By designing in a shadow line, the two parts not only self-locate to each other for assembly, but the fit between them also appears much more consistent. There will be a clear line between the two parts, but no actual visible gap between them. Watch the whole thing explained in the video, embedded below.
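If you prefer to model enclosures in code, the rule of thumb is easy to parameterize. The sketch below is purely illustrative (the wall thickness, lip height, clearance, and standoff values are assumptions, not numbers from [Teaching Tech]’s video): each half keeps roughly half the wall thickness at the parting line, a small radial clearance lets the halves slide together, and the lip bottoms out just short of the outer skins touching, leaving the shallow, even groove that reads as the shadow line.

```python
def shadow_line(wall=2.4, lip_height=1.5, clearance=0.15, standoff=0.4):
    """Dimension an interlocking rim for a two-part enclosure (all mm).

    Default values are illustrative guesses. One half gets a raised lip
    (the "overbite"), the other a matching pocket (the "underbite"); the
    pocket is made deeper than the lip by `standoff`, so the outer faces
    stop short of touching and the rim behind hides any see-through gap.
    """
    lip_thickness = wall / 2 - clearance / 2   # split the wall, leave sliding room
    return {
        "lip":    {"thickness": lip_thickness, "height": lip_height},
        "pocket": {"thickness": lip_thickness, "depth": lip_height + standoff},
        "shadow_line_width": standoff,          # the deliberate, consistent groove
    }

print(shadow_line())
```

Feed those numbers into whatever CAD you like; the point is that the groove width becomes a single deliberate parameter rather than an accident of two not-quite-flat faces.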

This isn’t the only time design techniques from the world of injection molding have migrated to 3D printing. Crush ribs have been adapted to the world of 3D printed parts and are a tried-and-true solution to the problem of reliably obtaining a tight fit between plastic parts and hardware inserts.


Got Fireflies? Try Talking To Them With A Green LED

[ChrisMentrek] shares a design for a simple green LED signal light intended for experiments in “talking” to fireflies. The device uses simple components like PVC piping and connectors to make something that resembles a signal flashlight with a momentary switch — a device simple enough to make in time for a little weekend experimenting.

Observe and repeat flashing patterns, and see if any fireflies get curious enough to investigate.

Did you know that fireflies, a type of beetle whose lower abdomen can light up thanks to a chemical reaction, flash in patterns? Many creatures, fireflies included, are quite curious under the right circumstances. The idea is to observe some fireflies and attempt to flash the same patterns (or different ones!) with a green LED to see if any come and investigate.

[ChrisMentrek] recommends using a green LED that emits at 565 nm, because that wavelength is very close to the light given off by most fireflies in North America. There’s also a handy link about firefly flashing patterns from the Massachusetts Audubon Society’s Firefly Watch program, which is a great resource for budding scientists.
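[ChrisMentrek]’s signal light is deliberately simple, essentially a green LED behind a momentary switch, but if your thumb gets tired the same experiment is easy to automate on any MicroPython-capable board. The pin number and timing below are placeholders only; crib the real cadence for your local species from the Firefly Watch flash chart.

```python
from machine import Pin
import time

led = Pin(2, Pin.OUT)   # GPIO driving the green LED; pin 2 is an arbitrary choice

# (on_time, off_time) pairs in seconds. These numbers are placeholders,
# not a verified species pattern: substitute timings from a flash chart.
PATTERN = [(0.5, 5.5)]

while True:
    for on_t, off_t in PATTERN:
        led.on()
        time.sleep(on_t)
        led.off()
        time.sleep(off_t)
```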

If staying up and learning more about nocturnal wildlife is your thing, then in between trying to talk to fireflies we recommend listening for bats as another fun activity, although it requires a bit more than just a green LED. Intrigued? Good news, because we can tell you all about the different kinds of bat detectors and what you can expect from them.

Browser-Based Robot Dog Simulator In ~800 Lines Of Code

[Sergii] has been learning about robot simulation and wrote up a basic simulator for a robodog platform: the Unitree A1. It only took about 800 lines of code to do so, which probably makes it a good place to start if one is headed in a similar direction.

Right now, [Sergii]’s simulator is an interactive physics model that runs in the browser. Software-wise, once the model of the robot exists, the Rapier JavaScript physics engine takes care of the physics simulation. The robot’s physical layout comes from the manufacturer’s repository, so it doesn’t need to be created from scratch.

To make the tool useful, the application shows two models of the robot side by side. The one on the left is the control model, with interactive sliders for limb positions. Every movement made on the control model is transmitted to the model on the right, the simulation model, which takes on the commanded pose. The simulation model is the one that actually simulates the physics and gravity of all the desired motions and positions. [Sergii]’s next step is to use the simulator to design and implement a simple walking gait controller, and we look forward to seeing how that turns out.
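Rapier running in the browser is what makes [Sergii]’s version so easy to share, but the control-model-drives-simulation-model idea is engine-agnostic. As a rough desktop analogue only (PyBullet is swapped in here simply because it loads the same kind of URDF and hands you sliders almost for free; the URDF path refers to whatever copy of the A1 description you have locally):

```python
import time
import pybullet as p
import pybullet_data

p.connect(p.GUI)
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")
# Adjust this path if your pybullet_data copy doesn't bundle the A1 description.
robot = p.loadURDF("a1/a1.urdf", [0, 0, 0.45])

# One debug slider per revolute joint: the sliders play the role of the
# "control model", and the motor targets they set drive the simulated robot.
sliders = {}
for j in range(p.getNumJoints(robot)):
    info = p.getJointInfo(robot, j)
    if info[2] == p.JOINT_REVOLUTE:
        name, lo, hi = info[1].decode(), info[8], info[9]
        sliders[j] = p.addUserDebugParameter(name, lo, hi, 0.0)

while True:
    for j, sid in sliders.items():
        p.setJointMotorControl2(robot, j, p.POSITION_CONTROL,
                                targetPosition=p.readUserDebugParameter(sid))
    p.stepSimulation()
    time.sleep(1 / 240)
```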

If Unitree sounds familiar to you, it might be because we recently covered how an unofficial SDK was able to open up some otherwise-unavailable features on the robodogs, so check that out if you want to get a little more out of what you paid for.

Making A Kid-Scale Apollo 11 Lunar Lander

If you’d like to see what goes into making a 1/3-scale Apollo 11 Lunar Module, [Plasanator]’s photos and build details show how he constructed one for a kids’ event, where it was a big hit!

The photo gallery gives plenty of ideas about how one might approach a project like this, and readers will surely appreciate the use of an old frying pan as a concrete mold to create the lander’s “feet”. Later, a little paint turns the frying pan itself into a pseudo-antenna mounted on the lander’s exterior.

Inside, the lander has a control panel with a lot of arcade-style buttons and LED lighting. It’s pretty simple stuff, but livens things up a lot. Bright red lighting for the engine combined with a couple of slow strobe lights really makes it come alive in the dark. The gold foil? Emergency thermal blankets wrapped around the frame.

We happen to have the perfect chaser for this kid-scale lunar module: the Apollo 11 moon landing, recreated with animatronics and LEGO.


DIY Eye Tracking For VR Headsets, From A To Z

Eye tracking is a useful feature in social virtual reality (VR) spaces because it really enhances presence and communication when one’s avatar has a realistic gaze. Most headsets lack this feature, but EyeTrackVR has a completely open source solution ready for anyone willing to put it together.

Camera is visible in lower right corner.

EyeTrackVR is a combination of hardware, software, and 3D printable mounts for attaching a pair of microcontroller boards, cameras, and IR LEDs to just about any existing VR headset out there. An ESP32-based board with a tiny camera module watches each eyeball, and under IR illumination the pupil presents as an easily identified round black area. Software takes care of turning the camera’s view of the pupil into a gaze direction value that can be plugged into other software.
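EyeTrackVR’s own application handles all of this, so the following is only a minimal OpenCV sketch of the principle, not the project’s actual pipeline: threshold the IR image, take the largest dark blob as the pupil, and normalize its centre into a crude gaze value (a real setup needs per-user calibration).

```python
import cv2

def pupil_center(gray):
    """Return the (x, y) pixel centre of the darkest round blob, or None."""
    # Under IR illumination the pupil is the darkest thing in frame,
    # so an inverted threshold isolates it.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    mask = cv2.medianBlur(mask, 5)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    (x, y), _radius = cv2.minEnclosingCircle(max(contours, key=cv2.contourArea))
    return x, y

cap = cv2.VideoCapture(0)   # a webcam (or the ESP32 camera's stream URL) stands in here
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    center = pupil_center(gray)
    if center is not None:
        h, w = gray.shape
        # Map pixel coordinates to a rough gaze value in [-1, 1] on each axis;
        # real calibration replaces this naive linear mapping.
        print((center[0] / w) * 2 - 1, (center[1] / h) * 2 - 1)
```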

The project is still under active development, but in its current state it is perfectly suitable for creating a functional system that can integrate into a variety of existing headsets with printed mounting brackets. Interested? Check out the intro and, if it sounds up your alley, dive into the build guide, which spells out everything you need to know. Then watch the video below for a demo of EyeTrackVR working in VRChat, along with an overview of software support.

We’ve seen headsets built to custom specs that integrate eye tracking, but even repackaging an existing headset is a perfect opportunity to include this feature.


Making Your Own VR Headset? Consider This DIY Lens Design

Lenses are a necessary part of any head-mounted display, but unfortunately, they aren’t always easy to source. Taking them out of an existing headset is one option, but one may wish for a more customized approach, and that’s where [WalkerDev]’s homebrewed “pancake” lenses might come in handy.

Engineering is all about trade-offs, and that’s especially true in VR headset design. Pancake lenses are compact units that rely on polarization to bounce light around internally, resulting in a very compact assembly at the cost of relatively poor light efficiency. That compactness is what [WalkerDev] found attractive, and in the process discovered that stacking two different Fresnel lenses and putting them in a 3D printed housing yielded a very compact pancake-like unit that gave encouraging results.
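The reason stacking helps at all is plain thin-lens math: two lenses close together act like a single stronger lens, so the display can sit closer to the eye. A quick back-of-the-envelope check (the focal lengths and spacing below are made-up example values, not [WalkerDev]’s):

```python
def combined_focal_length(f1, f2, d=0.0):
    """Thin-lens combination (all lengths in mm): 1/f = 1/f1 + 1/f2 - d/(f1*f2)."""
    return 1.0 / (1.0 / f1 + 1.0 / f2 - d / (f1 * f2))

# Two hypothetical 100 mm Fresnel lenses spaced 5 mm apart act like a ~51 mm lens.
print(combined_focal_length(100, 100, 5))
```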

This project is still in development, and while the original lens assembly is detailed in this build log, there are some potential improvements to be made, so stay tuned if you’re interested in using this design. A DIY headset doesn’t mean you must also DIY the lenses entirely from scratch, and this option seems economical enough to warrant following up.

Want to experiment with mixing and matching optics on your own? Not only has [WalkerDev]’s project shown that off-the-shelf Fresnel lenses can be put to use, it is also, in a way, good news that phone-based VR is dead. Google shipped over 10 million Cardboard headsets and Gear VR sold over 5 million units, which means there are a whole lot of lenses in empty headsets lying around, waiting to be harvested and repurposed.

Pixel Pump, The Open Source Vacuum Pickup Tool Is Now Shipping

The Pixel Pump is an open source manual pick & place assist tool by [Robin Reiter], and after a long road to completion, it’s ready to ship. We first saw the Pixel Pump project as an entry to the 2021 Hackaday Prize and liked the clean design and the concept of a completely open architecture for a tool that is so valuable to desktop assembly. It’s not easy getting hardware off the ground, but it’s now over the finish line and nearly everything — from assembly to packaging — has been done in-house.

Pixel Pump with SMD-Magazines, here using a foot pedal to control an interactive bill of materials (BoM) plugin.

Because having parts organized and available is every bit as important as the tool itself, a useful-looking companion item for the Pixel Pump is the SMD-Magazine. This is a container for parts that come on SMD tape rolls. These hold components at an optimal angle for use with the pickup tool, and can be fixed together on a rail to create project-specific part groups.

A tool being open source means giving folks a way to modify or add features for better workflows, and an example of this is [Robin]’s suggestion of using a foot pedal for hands-free control of the interactive BoM plugin. With it, one can simply use a foot pedal to step through a highlighted list of every part for a design, an invaluable visual aid when doing hand assembly.
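The specifics of [Robin]’s setup are in the write-up, but as a general-purpose version of the same trick, any microcontroller that can enumerate as a USB keyboard can turn a pedal tap into whatever hotkey your BoM viewer expects. A rough CircuitPython sketch, with the pin and keycode purely as assumptions:

```python
import time
import board
import digitalio
import usb_hid
from adafruit_hid.keyboard import Keyboard
from adafruit_hid.keycode import Keycode

# Foot pedal wired between GND and a GPIO (D2 here is an arbitrary choice).
pedal = digitalio.DigitalInOut(board.D2)
pedal.switch_to_input(pull=digitalio.Pull.UP)

kbd = Keyboard(usb_hid.devices)
HOTKEY = Keycode.RIGHT_ARROW   # assumption: whatever key steps your BoM viewer forward

was_pressed = False
while True:
    pressed = not pedal.value          # active-low switch
    if pressed and not was_pressed:    # rising edge = one pedal tap
        kbd.send(HOTKEY)
    was_pressed = pressed
    time.sleep(0.02)                   # crude debounce
```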

The Pixel Pump looks great, but if you’d prefer to go the DIY route for vacuum pickup tools you would certainly be in good company. We’ve seen economical systems built for under $100, and systems built around leveraging bead-handling tools intended for hobbyists. On the extreme end there’s the minimalist approach of building a tool directly around a small electric vacuum pump.