Every time we check in on [Project326], he’s doing something different with X-rays. This week, he has a passive X-ray imager. On paper, it looks great. No special tube is required and no high voltage needed. Actually, no voltage is needed at all. Of course, there’s no free lunch. What it does take is a long time to produce an image.
While working on the “easy peasy X-ray machine,” he found that dental X-ray film worked well for imaging with a weak X-ray source. He also found that the film would detect exposure to americium-241. So technically it isn’t an X-ray in the strictest sense, but a radiographic image that uses gamma rays to expose the film. But to normal people, a picture of the inside of something is an X-ray even when it isn’t.
When it comes to FDM 3D prints and making them stronger, most of the focus is on the outer walls and factors like their layer adhesion. However, paying some attention to the often-ignored insides of a model can make a lot of difference in its mechanical properties. Inspired by a string of [Tom Stanton] videos, [3DJake] had a poke at making TPU more resilient against breaking when stretched and PLA resistant to snapping when experiencing a lateral force.
Simply twisting the TPU part massively increased the load at which it snapped. Similarly, replacing the infill of the PLA part with a hollow cylinder made the test part significantly more resilient. A very noticeable result of hollowing out the PLA part is the way that it breaks: a part with infill will basically shatter, but the hollowed-out version remained more intact rather than ripping apart at the seams. The reason? The hollow cylinder is printed as extra walls inside the part, and cylinders are naturally good at distributing loads.
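The intuition behind the hollow cylinder can be checked with a quick back-of-the-envelope beam calculation: for the same amount of material, moving it away from the neutral axis raises the second moment of area, and with it the bending stiffness. A minimal sketch (the dimensions are illustrative, not taken from the video):

```python
import math

def second_moment_solid(r):
    """Area moment of inertia of a solid circular section (mm^4)."""
    return math.pi * r**4 / 4

def second_moment_tube(r_outer, r_inner):
    """Area moment of inertia of a hollow circular section (mm^4)."""
    return math.pi * (r_outer**4 - r_inner**4) / 4

# Same amount of plastic in both sections: a 5 mm solid rod versus a
# 10 mm tube whose wall uses the identical cross-sectional area.
r_solid = 5.0
r_outer = 10.0
r_inner = math.sqrt(r_outer**2 - r_solid**2)  # keeps the area equal

ratio = second_moment_tube(r_outer, r_inner) / second_moment_solid(r_solid)
print(f"tube is {ratio:.1f}x stiffer in bending")  # 7.0x for these dimensions
```

Same plastic, seven times the bending stiffness, which is why a few well-placed internal walls can beat a uniform infill.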
All of this touches on load distribution and designing a component to cope with expected loads in the best way possible. It’s also the reason why finite element analysis is such a big part of the CAD world, and something we may see more of in consumer 3D printing in the future.
Typeface (such as Times New Roman) refers to the design that gives a set of letters, numbers, and symbols their signature “look”. Font, on the other hand, is a specific implementation of a typeface, for example, Times New Roman Italic 12 pt.
‘Q’ is a counterpoint to the idea that typography is just one fussy detail after another.
Right about this point, some of you are nodding along and perhaps thinking “oh, that’s interesting,” while the rest of you are already hovering over your browser’s Back button. If you’re one of the former, you may be interested in checking out the (sort of) interactive tour of typography design elements by the Ohno Type School, a small group that loves design.
On one hand, letters are simple and readily recognizable symbols. At the same time, their simplicity puts a lot of weight on seemingly minor elements: small changes can have a big visual impact. The tour answers questions such as: What is the optimal parting of the cheeks of a capital ‘B’? At what height should the crossbar on an ‘A’ sit, and why does it look so weird when done incorrectly? Why can the tail of a ‘Q’ be just about anything? How and why does an ‘H’ define the spacing of the entire typeface? All these (and more) are laid bare.
USB-C as the “One Cable To Rule Them All” has certainly been a success. While USB-A is still around for now, most of us have breathed a hefty sigh of relief with the passing of micro-USB and the several display and power standards it replaces. It’s not without its minor issues though. One of them is that it’s as susceptible as any other cable to a bit of strain. For that, we think [NordcaForm]’s 3D-printed USB-C cable strain relief is definitely a cut above the rest.
Waxing lyrical about a simple 3D-printed model might seem like overkill for Hackaday, and it’s true that it’s not something we do often. But as Hackaday writers travel around with plenty of USB-C connected peripherals, we like the design of this one. It’s flexible enough to be useful without resorting to exotic filaments, and since it’s available in a few different forms with curved or straight edges, we think it can find a place in many a cable setup. Certainly more of an everyday carry than a previously featured 3D print. If you want to learn more about USB-C, we have a whole series of posts for you to binge read.
Bose SoundTouch speakers were introduced in 2013, offering the ability to connect to online streaming services and play back audio on multiple speakers simultaneously using the accompanying mobile app. Now these features are about to be removed, including the mobile app, as Bose is set to discontinue support on February 18, 2026. From that point onwards, you can only use them via Bluetooth or physical connectors that may be present, like an audio jack or HDMI port. This includes fancy home theater system hardware like the above SoundTouch 520.
That is the official line, at least. We have seen the SoundTouch on Hackaday previously, when it was discovered how to gain root shell access to the Linux OS that powers the original SoundTouch system: connect to the listening service on port 17000 and pass it the remote_services on command, then connect with Telnet as usual and log in as root with no password. A quick glance at the comments on that post suggests that this is still a valid approach for at least certain SoundTouch devices.
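Condensed into a few lines, that procedure looks roughly like the sketch below. The host address is obviously specific to your network, and whether the unlock still works depends on your firmware revision; treat this as a sketch of the published trick, not a guarantee:

```python
import socket

SOUNDTOUCH_HOST = "192.168.1.50"  # example address; substitute your speaker's IP
UNLOCK_PORT = 17000               # diagnostic service described in the earlier article
UNLOCK_COMMAND = b"remote_services on\n"

def enable_telnet(host: str, port: int = UNLOCK_PORT) -> None:
    """Ask the diagnostic service to start Telnet; afterwards, connect
    with a normal Telnet client and log in as root (no password)."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(UNLOCK_COMMAND)

# enable_telnet(SOUNDTOUCH_HOST)  # then: telnet 192.168.1.50
```

Once the Telnet service is up, everything else is an ordinary interactive root shell on the speaker’s Linux system.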
The fallout from this announcement appears to be twofold. Most obviously, ‘smart’ features like WiFi-based streaming can be dropped at any time. But it also makes us realize that hardware hackers like us will never run out of new and suddenly obsolete hardware that needs our rescue.
CRT monitors: there’s nothing quite like ’em. But did you know that video projectors used to use CRTs? A trio of monochrome CRTs, in fact: one each for red, green, and blue. By their powers combined, these monsters were capable of fantastic resolution and image quality. Though nowhere near as bright as modern projectors, a properly set up unit still produces one of the best projected images [Technology Connections] says he has seen outside of a movie theatre.
After a twenty-minute startup to reach thermal equilibrium, one can settle down with a chunky service manual for a ponderous calibration process involving an enormous remote control. The reward is a fantastic (albeit brightness-limited) picture.
Still, these projectors had drawbacks. They were limited in brightness, of course. But they were also complex, labor-intensive beasts to set up and calibrate. On the other hand, at least they were heavy.
[Technology Connections] gives us a good look at the Sony VPH-D50HT Mark II CRT Projector in its tri-lobed, liquid-cooled glory. This model is a relic by today’s standards, but natively supports 1080i via component video input and even preserves image quality and resolution by reshaping the image in each CRT to perform things like keystone correction, thus compensating for projection angle right at the source. Being an analog device, there is no hint of screen door effect or any other digital artifact. The picture is just there, limited only by the specks of phosphor on the face of each tube.
Converging and calibrating three separate projectors really was a nontrivial undertaking. There are some similarities to the big screen rear-projection TVs of the 90s and early 2000s (which were then displaced by plasma and flat-panel LCD displays). Unlike enclosed rear-projection TVs, the screen for projectors was not fixed, which meant all that calibration needed to be done on-site. A walkthrough of what that process was like — done with the help of many test patterns and a remote control that is as monstrous as it is confusing — starts at 15:35 in the video below.
Like rear-projection TVs, these projectors were displaced by newer technologies that were lighter, brighter, and easier to use. Still, just like other CRT displays, there was nothing quite like them. And if you find esoteric projector technologies intriguing, we have a feeling you will love the Eidophor.
As Ethernet became the world-wide standard for wired networking, there was one nagging problem. You already have to plug in the network cable. But then you have to also plug in a power cable. That power cable needs to be long enough. And have the right plug on it for your country. And provide the right current and voltage. That’s how Power over Ethernet (PoE) was born, first in a veritable Wild West of proprietary standards and passive injectors, then through a formal standardization process. Recently [T. K. Hareendran] wrote a primer on PoE, with more of a DIY intro focus, including some favorite PoE PD (powered device) chips to use in your own design.
You can still totally use passive PoE if that’s your jam, and you have full control over the network and any connected devices. This would allow you to, for example, power your SBCs for a couple of bucks, although for adding PoE to your Mac Mini you may want to look at some more refined options, if only as a safety precaution.
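If you do go the passive route, it’s worth sanity-checking the voltage drop over the cable, since there’s no negotiation to save you from an undervolted board at the far end. A rough sketch using typical 24 AWG Cat5e figures (the resistance value is an assumed ballpark, not a spec):

```python
# Rough voltage-drop estimate for a passive PoE run.
AWG24_OHM_PER_M = 0.084      # typical Cat5e conductor resistance, ohms per meter
PAIRS_PER_DIRECTION = 2      # passive PoE parallels two pairs in each direction

def voltage_at_device(v_in, cable_m, current_a):
    """Supply voltage minus the round-trip (loop) drop across the cable."""
    # Two paralleled conductors halve the resistance each way; out-and-back
    # doubles it again, so the loop equals one conductor over the cable length.
    loop_ohms = cable_m * AWG24_OHM_PER_M * 2 / PAIRS_PER_DIRECTION
    return v_in - current_a * loop_ohms

print(f"{voltage_at_device(24.0, 30, 0.5):.2f} V")  # ~22.74 V after 30 m at 0.5 A
```

A volt or so of drop is fine for a 24 V feed into a buck converter, but the same math at 5 V would leave an SBC badly starved, which is why passive injectors almost always run at higher voltages.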