Product teardowns are great, but getting an unfiltered one from the people who actually designed and built the product is a rare treat. In the lengthy video after the break, former Formlabs engineer [Shane Wighton] tears down the Form 4 SLA printer while [Alec Rudd], the engineering lead for the project, answers all his prying questions.
[Shane] was part of the team that brought all Form 4’s predecessors to life, so he’s intimately familiar with the challenges of developing such a complex product. This means he can spot the small design details that most people would miss, and dive into the story behind each one. These include the hinges and poka-yoke (error-proofing) designed into the lid, the leveling features in the build-plate mount, the complex prototyping challenges behind the LCD panel and backlight, and the mounting features incorporated into every component.
A considerable portion of the engineering effort went into mitigating all the ways things could go wrong in production, shipping, and operation. The fact that most of the parts on the Form 4 are user-replaceable makes this even harder. It’s apparent that both engineers speak from a deep well of hard-earned experience, and it’s well worth the watch if you dream of bringing a physical product to market.
On November 6th, Northwestern University introduced a genuine leap in haptic technology, and it's still worth your attention two weeks later. Full details are in their original article. The device embeds a hexagonal array of 19 miniature actuators in a flexible silicone mesh, and it's the stuff of dreams for hackers and tinkerers looking for the next big thing in wearables.
What makes this patch cutting-edge? First, it offers multi-dimensional feedback: pressure, vibration, and twisting sensations. Imagine a wearable that can nudge or twist your skin instead of just buzzing. Unlike the simple, one-note buzzers of older devices, this setup adds depth and realism to interactions, and that's a game changer for the VR community and anyone keen on building sensory experiences.
But the real kicker is its energy management. Each actuator uses a bistable mechanism: it rests in either of two stable positions without drawing continuous power, and it recycles elastic energy stored in the stretched skin. Think of it like a rubber band that snaps back, releasing its stored energy during operation. The result is longer battery life and efficient power usage, perfect for tinkering with extended use cases.
And it's not all fun and games (though VR fans should rejoice). This patch turns sensory substitution into practical tech for the visually impaired, using LiDAR data delivered over Bluetooth to translate the surroundings into tactile feedback. It's like a white cane augmented with data-rich spatial awareness, a real boost for accessibility.
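To make the idea concrete, here's a hypothetical sketch of that LiDAR-to-touch mapping in Python. The sector layout, range limits, and linear scaling are all invented for illustration and aren't from the Northwestern paper, which describes its own mapping:

```python
# Hypothetical sketch of sensory substitution: map LiDAR ranges to drive levels
# for a 19-actuator hex patch.  The per-actuator sector layout, the range band,
# and the linear ramp are assumptions made for this example.

NUM_ACTUATORS = 19          # one per cell of the hexagonal array
MAX_RANGE_M = 4.0           # obstacles beyond this produce no feedback (assumed)
MIN_RANGE_M = 0.3           # anything closer saturates the actuator (assumed)

def ranges_to_intensities(ranges_m):
    """Map one LiDAR range per actuator sector to a 0..1 drive level.
    Nearer obstacles produce stronger feedback."""
    if len(ranges_m) != NUM_ACTUATORS:
        raise ValueError(f"expected {NUM_ACTUATORS} sector ranges")
    levels = []
    for r in ranges_m:
        r = min(max(r, MIN_RANGE_M), MAX_RANGE_M)   # clamp to the usable band
        # Linear ramp: MIN_RANGE -> 1.0 (strongest), MAX_RANGE -> 0.0 (off)
        levels.append((MAX_RANGE_M - r) / (MAX_RANGE_M - MIN_RANGE_M))
    return levels

# A wall 1 m away in one sector, open space everywhere else:
scene = [1.0] + [MAX_RANGE_M] * (NUM_ACTUATORS - 1)
print([round(v, 2) for v in ranges_to_intensities(scene)])
```

Only the actuator facing the wall gets a strong drive level; the rest stay off, which is the essence of trading a buzz for spatial information.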
Fancy more stories like this? Earlier this year, we wrote about a pair of lightweight haptic gloves that, for those paying attention, feature a similar hexagonal array of 19 sensors. A pattern for success? You can read the original article on TechXplore here.
High-speed photography with the camera on a fast-moving robot arm has become all the rage at red-carpet events, but this GlamBOT setup comes with a hefty price tag. To get similar visual effects on a much lower budget [Henry Kidman] built a large, very fast camera slider. As is usually the case with such projects, it’s harder than it seems.
The original GlamBOT has a full six degrees of freedom, but many of the shots it's famous for are just a slightly curved path between two points. That curve adds a few zeros to the required budget, so a straight slider was deemed good enough for [Henry]'s purposes. The first challenge was speed. V1 used linear rails made from shower curtain rails, with 3D-printed sliders driven by a large stepper motor via a belt. The stepper motor wasn't powerful enough to achieve the desired acceleration, so [Henry] upgraded to a more powerful 6 hp servo motor.
Unfortunately, the MDF and 3D-printed frame components were not rigid enough for the upgraded torque, which caused several crashes into the ends of the frame as the belt slipped and failed to stop the camera platform. The frame was rebuilt from steel, with square tubing for the rails and steel plates for the brackets. That provided the required rigidity, but welding had warped the rails, giving the camera a bumpy ride, so [Henry] had to rely on active stabilization in the gimbal and camera. The project was filled with setbacks and challenges, but in the end the results look very promising, with great slow-motion shots on a mock red carpet.
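A back-of-the-envelope calculation shows why moves like this outrun a stepper. The numbers below (move distance, time, carriage mass, pulley radius) are assumed for illustration and are not [Henry]'s actual figures:

```python
# Rough sizing check for a fast belt-driven slider.  All input values are
# assumptions for illustration, not measurements from the project.

def triangular_profile(distance_m, time_s, mass_kg, pulley_radius_m):
    """Accelerate for half the move, decelerate for the rest (triangular
    velocity profile) and report the peak demands on the motor."""
    accel = 4.0 * distance_m / time_s**2       # from d/2 = 0.5*a*(t/2)^2
    force = mass_kg * accel                    # belt tension needed, N
    peak_speed = accel * time_s / 2.0          # m/s at the midpoint
    return {
        "accel_m_s2": accel,
        "belt_force_N": force,
        "motor_torque_Nm": force * pulley_radius_m,
        "peak_power_W": force * peak_speed,    # worst case, at mid-move
    }

# 3 m of travel in half a second with a 3 kg camera carriage on a 20 mm pulley:
print(triangular_profile(distance_m=3.0, time_s=0.5, mass_kg=3.0,
                         pulley_radius_m=0.02))
```

Even with these modest assumptions the peak demand lands in the kilowatt range, comfortably beyond a hobby stepper but well within a 6 hp servo's envelope.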
How do you collect a lot of data about the ionosphere? Well, you could use sounding rockets or specialized gear. Or maybe you can just conscript a huge number of cell phones. That was the approach taken by Google researchers in a recent paper in Nature.
The idea is that GPS and similar navigation satellites measure transit time of the satellite signal, but the ionosphere alters the propagation of those signals. In fact, this effect is one of the major sources of error in GPS navigation. Most receivers correct for it with a broadcast 8-parameter ionospheric model (the Klobuchar model) that reduces that error by about 50%.
However, by measuring the difference in arrival time between signals on two different frequencies, the phone can estimate the total electron content (TEC) of the ionosphere along the path between the receiver and the satellite. This requires a dual-frequency receiver, of course.
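The underlying math is the standard geometry-free combination from GNSS textbooks. Here's a minimal sketch, using GPS L1/L5 frequencies and an assumed 3 m differential delay rather than real measurement data:

```python
# Sketch of the dual-frequency TEC estimate using the textbook geometry-free
# combination.  The 3 m differential pseudorange below is an assumed example
# value, not data from the Nature paper.

F_L1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F_L5 = 1176.45e6  # GPS L5 carrier frequency, Hz (dual-frequency phones use L1/L5)

def slant_tec(p1_m, p2_m, f1=F_L1, f2=F_L5):
    """Slant TEC in TECU (1 TECU = 1e16 electrons/m^2) from the pseudorange
    difference between two frequencies.  The ionosphere delays the lower
    frequency more, so p2 > p1."""
    geometry_free = p2_m - p1_m                         # delay difference, m
    factor = (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) # electrons/m^2 per m
    return factor * geometry_free / 1e16                # convert to TECU

# A 3 m L5-minus-L1 pseudorange difference on a ~20,000 km slant path:
print(f"{slant_tec(p1_m=20_000_000.0, p2_m=20_000_003.0):.1f} TECU")
```

A few meters of differential delay works out to a few tens of TECU, which is a typical daytime value, so even a phone's noisy pseudoranges carry usable ionospheric signal once you average over millions of devices.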
As impractical as most computer overclocking is these days, there is still a lot of fun to be had along the way. Case in point: [Pieter-Jan Plaisier]'s recent liquid nitrogen-aided overclocking of an unsuspecting Raspberry Pi 5 and its BCM2712 SoC. [Pieter]'s previous attempts with air cooling had topped out at a paltry 3 GHz from the default 2.4 GHz, with the power management IC (PMIC) circuitry on the SBC turning out to be the main limiting factor.
The main change here was thus to go for liquid nitrogen (LN2) cooling, with a small chipset LN2 pot to fit on the SBC. Another improvement was the application of a NUMA (non-uniform memory access) patch to force the BCM2712's memory controller to make better use of RAM chip parallelism.
With these changes, the OC could now hit 3.6 GHz, but at 3.7 GHz, the system would always crash. It was time to further investigate the PMIC issues.
The PMIC imposes voltage configuration limits and shuts the system off at high power consumption levels. The solution was to replace that circuitry with an ElmorLabs AMPLE-X1 power supply and definitively void the SBC's warranty, which involves removing inductors and scraping away solder mask to attach the external power wires. Yet even with these changes, the SoC frequency had trouble scaling, which is why an external clock board was used to replace the 54 MHz oscillator on the PCB. Unfortunately, this also failed to improve the final overclock.
We covered the ease of OCing to 3 GHz previously, and no doubt some of us are wondering whether the new SoC stepping may OC better. Regardless, if you want to get a faster small system without jumping through all those hoops, there are definitely better (and cheaper) options. But you do miss out on the fun of refilling the LN2 pot every couple of minutes.
According to the Sapir–Whorf hypothesis, our language influences how we think and experience the world. That’s easy to imagine. Certainly our symbolism of mathematics influences how we calculate. Can you imagine doing moderately complex math with Roman numerals or without zero or negative numbers? But recently I was reminded that technological media also influences our perception of reality, and I have a Hackaday post to thank for it.
The post in question was about color TV. When I was a kid, most people had black and white TVs, although there were color sets. Even if you had a color set, many shows and movies were in black and white. Back then, many people still shot black and white film in their cameras, too, for many reasons. To make matters worse, I grew up in a small town, reading books from the local library that were ten or twenty years behind the times.
At some point, I read a statistic saying that most people dream in black and white. You may find this surprising, as I'll bet you dream in color. It turns out that how people dream may have changed over the years, and still and motion photography may be the reason.
The Post
In the post, I posed a question I’ve thought about many times: Did people dream in black and white before the advent of photography? It was kind of an off-hand remark to open the post, but many people reacted to it in the comments. They seemed surprised that I would ask that because, of course, everyone dreams in color.
I asked a few people I knew who also seemed very surprised that I would assume anyone ever dreams in color. But I was sure I had been told that sometime in the past. Time to hit the Internet and find out if that was incorrect or a false memory or something else. Turns out, it was indeed something else.
The Science
A scientific paper from 2008 held the answer. It turns out that science started asking questions like this in the early 1900s. Up through the 1940s, people overwhelmingly reported dreaming in black and white, at least most of the time. Color dreams were in the minority, although not unheard of.
Then something changed. Studies conducted in the 1960s and later show exactly the opposite: people almost always dream in color and rarely in black and white. Of course, that correlates well with the rise of color photos, movies, and television. What's more, while there is no scientific data from earlier eras, there is a suspicious lack of, for example, a Shakespeare quote about "The gray world of slumber…" or anything else that would hint that the writer was dreaming in black and white.
Interpretation
Judging from the paper, it seems clear that most people agree color media played a role in this surprising finding. What they can't agree on is why. It seems unlikely that your dreams really change based on your media consumption, but it is possible that your recollection of them changes. This is particularly true since the way researchers gathered data also changed over that time period. So even if the data doesn't prove that people dreamed in black and white, it does show that they remembered dreaming in black and white.
For that matter, it isn’t clear that anyone understands how you experience dreams visually, anyway. It isn’t like the back of your eyelids are little movie screens. You don’t actually see anything in a dream, you only remember seeing it.
The Question
If something as simple as black-and-white movies and TV can change how we perceive dreams, you have to wonder how much tech is changing our reality experience in other ways. Do we live differently because we have cell phones? Or the Internet? Will virtual reality alter our dream lives? It would be interesting to fast-forward a century and see what historians say about our time and how strangely we perceive reality today.
Early computer kits aimed at learning took all sorts of forms, from full-fledged computer kits like the Altair 8800 to the ready-made MicroBee Computer-In-A-Book. For those just wanting to dip their toes in the computing world, many low-cost computer "trainers" were released, and Japan had some awesome ones. [Jason Jacques] shows off his Gakken Micro-Computer FX-System (or is it the FX-Computer? Or maybe the FX-Micom? It seems like they couldn't make up their minds). In any event, it was a combination microcomputer and I/O building-blocks system built around a custom version of the Texas Instruments TMS1100 microcontroller. Specifically designed to introduce users to the world of computing, the included guide is very detailed, with 100 example programs and lots of information on how all the opcodes work.
This 4-bit system is similar to the Kenbak-1 computer, with a very simple instruction set and limited address space. However, adding electronic components in plastic blocks brings this machine to a new level of interactivity. Connections can be made to and from the microcomputer block, as well as to the on-board speaker and simple input/output pins. The example circuit displayed on the front cover of the box connects the microcontroller to the speaker and lets a switch light up a small incandescent bulb. We can imagine many users wiring up all sorts of extra components to their FX-Computers, and with the advent of 3D printing, it wouldn't be difficult to create new blocks to insert into the grid.