How To Model A Twisted Part In FreeCAD

Quick references are handy, but sometimes it’s nice to have a process demonstrated from beginning to end. In that spirit, [Darren Stone] created a video demonstrating how to model a twisted part in FreeCAD, showing the entire workflow of creating the part as a blend of surfaces and curves that get turned into a solid.

FreeCAD is organized using the concept of multiple “workbenches” which are each optimized for different tools and operations, and [Darren] walks through doing the same jobs in a few different ways.

This twisted bracket is a simple part that is nevertheless nontrivial from a CAD perspective, and that makes it a good candidate for showing off the different workbenches and tools.

The video below is also a pretty good overall demonstration of what designing a part from a mechanical drawing looks like when done in FreeCAD. As for mechanical drawings themselves, we’ve seen FreeCAD can be used to make those, too.

Continue reading “How To Model A Twisted Part In FreeCAD”

CNC Feeds And Speeds, Explained As A First-Timer

If you’ve ever looked into CNC cutting tools, you’ve probably heard the term “feeds and speeds”. It refers to choosing the speed at which to spin the cutting tool, and how fast to plow it into the material being cut. They’re important to get right, and some of the reasons aren’t obvious. This led [Callan Bryant] to share his learned insights as a first-timer. It turns out there are excellent (and somewhat non-intuitive) reasons not to simply guess at the correct values!

A table of variables and how they relate to one another.

The image above shows a tool damaged by overheating. [Callan] points out that as a novice, one might be inclined to approach a first cutting job conservatively, with a low feed rate. But doing this can have an unexpected consequence: a tool that overheats because it spins too quickly while removing too little material.

CNC cutting creates a lot of heat from friction, and one way to remove that heat is by having the tool produce shavings, which help carry heat away. If a tool is making dust instead of shavings — for example if the feed rate is too conservative — the removed pieces will be too small to carry significant energy, and the tool can overheat.

[Callan] makes a table of the variables at work in a CNC system to better understand how they relate, before working out a formula for calculating reasonable feeds and speeds. Of course, such calculations are only a reasonable starting point, and it’s up to the operator to ensure things are happening as they should for any given situation. As our own Elliot Williams observed, CNC milling is a much more manual process than one might think.
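The arithmetic behind such a formula is the standard chip-load calculation. Here’s a minimal sketch, not [Callan]’s exact method; the surface speed and chip load for a 6 mm two-flute end mill in aluminium are illustrative values of the sort you’d pull from a generic tooling chart:

```python
import math

def spindle_rpm(surface_speed_m_min, tool_diameter_mm):
    """Spindle speed derived from the material's recommended surface (cutting) speed."""
    return (surface_speed_m_min * 1000) / (math.pi * tool_diameter_mm)

def feed_rate_mm_min(rpm, flutes, chip_load_mm):
    """Feed rate that keeps each flute cutting a real chip instead of dust."""
    return rpm * flutes * chip_load_mm

# Illustrative numbers only: a 6 mm two-flute end mill in aluminium,
# with surface speed and chip load borrowed from a generic tooling chart.
rpm = spindle_rpm(surface_speed_m_min=150, tool_diameter_mm=6)
feed = feed_rate_mm_min(rpm, flutes=2, chip_load_mm=0.02)
print(f"Spindle: {rpm:.0f} RPM, feed: {feed:.0f} mm/min")
```

In practice the spindle speed gets capped at whatever the machine can actually turn, and the feed is recomputed from there, which is exactly where operator judgement comes back into the picture.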

ChatGPT Makes A 3D Model: The Secret Ingredient? Much Patience

ChatGPT is an AI large language model (LLM) which specializes in conversation. While using it, [Gil Meiri] discovered that one way to create models in FreeCAD is with Python scripting, and ChatGPT could be encouraged to create a 3D model of a plane in FreeCAD by expressing the model as a script. The result is just a basic plane shape, and it certainly took a lot of guidance on [Gil]’s part to make it happen, but it’s not bad for a tool that can’t see what it is doing.

The first step was getting ChatGPT to create code for a 10 mm cube, and plug that into FreeCAD to see the results. After that basic workflow was shown to work, [Gil] asked it to create a simple airplane shape. The resulting code had objects for wing, fuselage, and tail, but that’s about all that could be said, because the result was almost, but not quite, completely unlike a plane. Not an encouraging start, but at least the basic building blocks were there. Continue reading “ChatGPT Makes A 3D Model: The Secret Ingredient? Much Patience”
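For context, the 10 mm cube step needs only a few lines of FreeCAD’s Python API. Here’s a minimal sketch of what such a script can look like when run from FreeCAD’s Python console; it is not the exact code ChatGPT produced:

```python
import FreeCAD as App

# Create a new document and add a parametric box primitive.
doc = App.newDocument("CubeTest")
cube = doc.addObject("Part::Box", "Cube")

# Dimensions in millimetres.
cube.Length = 10
cube.Width = 10
cube.Height = 10

doc.recompute()
```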

Thermal Camera Plus Machine Learning Reads Passwords Off Keyboard Keys

An age-old vulnerability of physical keypads is visibly worn keys. For example, a number pad with digits clearly worn from repeated use provides an attacker with a clear starting point. The same concept can be applied to keyboards using a thermal camera and some machine learning, though it turns out that some keycaps and typing styles are harder to read than others.

Researchers at the University of Glasgow show how machine learning can pull details from thermal images like these quickly and effectively.

Touching a key with a fingertip imparts a slight amount of body heat, and that small amount of heat can be spotted by a thermal sensor. We’ve seen this basic approach used since at least 2005, and two things have changed since then: thermal cameras have gotten much more common, and researchers discovered that combining thermal readings with machine learning makes it possible to eke out details too subtle for the human eye and judgement alone.
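The image-processing half of that pipeline is the easy part; the learned model on top does the heavy lifting. As a rough sketch, and not the Glasgow team’s code, assuming a grayscale thermal frame where brighter pixels are hotter, candidate key presses can be pulled out with a few OpenCV calls:

```python
import cv2

# Illustrative input: a grayscale thermal frame where brighter = hotter.
thermal = cv2.imread("keyboard_thermal.png", cv2.IMREAD_GRAYSCALE)

# Keep only pixels noticeably warmer than the background; the threshold
# value is a guess and would need tuning per camera and room temperature.
_, warm = cv2.threshold(thermal, 200, 255, cv2.THRESH_BINARY)

# Each connected warm blob is a candidate key press.
contours, _ = cv2.findContours(warm, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

def mean_heat(contour):
    """Mean intensity over the blob's bounding box, a crude 'how hot' score."""
    x, y, w, h = cv2.boundingRect(contour)
    return thermal[y:y + h, x:x + w].mean()

# Warmer residue is more recent, so sorting by heat gives a rough guess at
# press order; the machine-learning stage is what refines guesses like this.
for c in sorted(contours, key=mean_heat, reverse=True):
    x, y, w, h = cv2.boundingRect(c)
    print(f"warm region at ({x}, {y}), mean intensity {mean_heat(c):.0f}")
```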

Here’s a link to the research and findings from the University of Glasgow, which shows how even a 16-symbol password can be attacked with an average accuracy of 55%. Shorter passwords are much easier to decipher: the system attacks 6- and 8-symbol passwords with accuracies of 92% and 80%, respectively. In the study, thermal readings were taken up to a full minute after the password was entered, but sooner readings result in higher accuracy.

A few factors make the attack harder. Fast typists spend less time touching keys and therefore transfer less heat, which makes things a little more challenging. Interestingly, the material of the keycaps plays a large role: ABS keycaps retain heat far more effectively than PBT (a material we often see in custom keyboard builds like this one). It also turns out that the tiny amount of heat from the LEDs in backlit keyboards runs effective interference when it comes to thermal readings.

Amusingly, this kind of highly modern attack would be entirely useless against a scramblepad. Scramblepads are vintage devices that mix up which numbers go with which buttons each time the pad is used. Thermal imaging and machine learning would still reveal which buttons were pressed and in what order, but that wouldn’t help! A reminder that when it comes to security, tech does matter, but fundamentals can matter more.

Hackaday Prize 2023: Eye Tracking On A Budget

There is a lot to be learned from the experience of building something functional, and even better if doing so doesn’t break the bank. [Sergej Stoetzer]’s 20€ DIY-Eyetracker aims to be an educational process that covers everything from hardware to functional software in an accessible way.

The hardware is based on an economical USB endoscope, and can be used as-is or repackaged with IR illumination.

The eye tracker is based on an economical USB endoscope, which is a small camera optimized for up-close applications. With the camera attached to a pair of common safety glasses so that it looks at one’s eye, some OpenCV and Python code can handle simple tracking and interface with other projects.

Basic eye tracking — like determining whether a user is looking up, down, left, or right — can be all that’s needed depending on one’s application. That means that it’s possible to get something working with very little hardware and some easy-to-use OpenCV functions.
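A minimal sketch of that kind of coarse gaze estimation, assuming the endoscope shows up as an ordinary webcam and the pupil is the darkest sizeable blob in the frame; the threshold and camera index are guesses, and this isn’t [Sergej]’s actual code:

```python
import cv2

cap = cv2.VideoCapture(0)  # the USB endoscope enumerates as a normal webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)

    # The pupil is the darkest sizeable region; the threshold is a guess
    # and will need tuning for the actual camera and lighting.
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    if contours:
        pupil = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(pupil)
        cx, cy = x + w // 2, y + h // 2
        fh, fw = gray.shape

        # Divide the frame into thirds and report which third the pupil sits in
        # (directions are in image coordinates, which may be mirrored for the user).
        horiz = "left" if cx < fw // 3 else "right" if cx > 2 * fw // 3 else "center"
        vert = "up" if cy < fh // 3 else "down" if cy > 2 * fh // 3 else "center"
        print(horiz, vert)

    cv2.imshow("eye", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```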

Even better performance can be had by adding IR illumination and repackaging the camera into a 3D printed enclosure. The pupil of the eye is an aperture in the iris that appears as a black circle, and that’s even more true under IR illumination, which is invisible to the naked eye. If you’re curious about what’s inside those USB endoscope cameras and how to remove their IR filter, there are some good pictures of that process in this project.

The ability to get something prototyped quickly and working well enough to learn new things is a valuable skill, and that’s why Re-engineering Education is one of the challenges in the 2023 Hackaday Prize.

Embed Hardware Into 3D Prints, But Not In The Way You’re Thinking

[Christopher Helmke] is doing fantastic work in DIY systems for handling small hardware like fasteners, and that includes robotic placement of hardware into 3D prints. Usually this means dropping nuts into parts in mid-print so that the hardware is captive, but that’s not really the story here.

The really inventive part we want to highlight is the concept of reducing packaging and labor. Instead of including a zip-lock bag of a few bolts, how about embedding the bolts into a void in the 3D print, covered with a little snip-out retainer? Skip ahead to 1:54 in the video to see exactly what we mean. It’s a pretty compelling concept that we hope sparks a few ideas in others.

As clever as that concept is, the rest of the video is also worth a watch because [Christopher] shows off a DIY system that sits on top of his 3D printer and takes care of robotically placing the hardware in mid-print. He talks all about the challenges of such a system. It’s not perfect (yet), but seeing it in action is very cool.

We’ve recently seen a lot of fascinating stuff when it comes to [Christopher Helmke]’s automated handling of fasteners and similar hardware. His system makes rapid and accurate dispensing of bolts look easy, and his work on using compressed air to zip pieces around seems effective.

Continue reading “Embed Hardware Into 3D Prints, But Not In The Way You’re Thinking”

Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required

This clever research from the University of Chicago’s Human Computer Integration Lab demonstrates a fascinating way to let users “feel” objects in VR, without anything getting in the way of using one’s hands and fingers normally. Certainly, the picture here shows hands with a device attached to them, but look closely and you’ll see that it’s on the back of the hand only.

There’s hardware attached to the hands, yes, but only to the backs. Hands and fingers can be used entirely normally while receiving tactile feedback.

The unique device consists of a control box, wires, and some electrodes attached to different spots on the back of the hand and fingers. Carefully modulated electrical signals create tactile sensations on the front of the hand, despite originating from electrodes on the back. While this has clear applications for VR, the team thinks the concept could also be useful in rehabilitation or prosthetics.

Continue reading “Tactile Feedback In VR, No Cumbersome Gloves Or Motors Required”