A DR-DOS console showing the IDLE command

Missing DR-DOS Power Management Source Code Found In Patent

Modern processors come with all kinds of power management features, which you don’t typically notice as a user until you start a heavy program and hear the CPU fan spin up. Back in the early 1990s, however, power management was largely unheard of, meaning that a CPU with nothing to do would run through an idle loop that dissipated about as much power as a real computing task. [Michal Necasek] noticed this while experimenting with DR-DOS 6.0 in a virtual machine – his laptop fan would start running on full blast whenever he opened the VM. His search for a solution to this annoyance led him down a fascinating journey into the intricacies of DOS power management.

As it turned out, DR-DOS 6.0 does have functionality built in for putting the CPU in power saving mode when it’s idle. This feature is not complete, however: Digital Research required each computer manufacturer to develop an IDLE driver customized to their specific hardware platform in order to enable power management. Sadly, no manufacturer ever bothered to do so, leaving [Michal] with no option other than writing a driver himself. While there was some documentation available, it didn’t include any example code or sufficient detail to write a driver from scratch.

A snippet of x86 assembly code found in a patent

What it did include was a reference to U.S. Patent No. 5,355,501. Normally this sort of information is of interest only to those planning to sell a competing system, but this specific patent happens to include dozens of pages of well-documented but poorly-scanned x86 assembly code, including source code for a basic IDLE86.SYS driver. As [Michal] wasn’t looking forward to chasing bugs caused by OCR errors, he simply copied the source code by hand, then ran it through an assembler. The end result was a working IDLE driver, which is now available for download from his website.

[Michal]’s blog post also includes lots of details on early power saving implementations, including all the DOS interrupt calls involved in the process. Patents might seem boring in contrast, but they sometimes contain surprising amounts of usable information. You might find enough details to reverse-engineer a wireless protocol, or even to help track down an obscure instrument’s original designer.

Anodizing Titanium In Multiple Colors

[Titans of CNC Machining] wanted to anodize some titanium parts. They weren’t looking for a way to make the part harder or less prone to corrosion. They just wanted some color. As you can see in the video below, the resulting setup is much simpler than you might think.

The first attempt, however, didn’t work out very well. The distilled water and baking soda were fine, as was the power supply made of many 9V batteries. But a copper wire contaminated the results. The lesson was that you need electrodes of the same material as your workpiece.

Continue reading “Anodizing Titanium In Multiple Colors”


Hackaday Links: May 14, 2023

It’s been a while since we heard from Dmitry Rogozin, the always-entertaining former director of Roscosmos, the Russian space agency. Not content with sending mixed messages about the future of the ISS amid the ongoing war in Ukraine, or attempting to hack a mothballed German space telescope back into action, Rogozin is now spouting off that the Apollo moon landings never happened. His doubts about NASA’s seminal accomplishment apparently started while he was still head of Roscosmos, when he tasked a group with looking into the Apollo landings. Rogozin’s conclusion from the data his team came back with isn’t especially creative; whereas some Apollo deniers go to great lengths to find “scientific proof” that we were never there, Rogozin just concluded that because NASA hasn’t ever repeated the feat, it must never have happened.

Continue reading “Hackaday Links: May 14, 2023”

Hackaday Prize 2023: Eye-Tracking Wheelchair Interface Is A Big Help

For those with quadriplegia, electric wheelchairs with joystick controls aren’t much help. Typically, sip/puff controllers or eye-tracking solutions are used, but commercial versions can be expensive. [Dhruv Batra] has been experimenting with a DIY eye-tracking solution that can be readily integrated with conventional electric wheelchairs.

The system uses a regular webcam aimed at the user’s face. A Python script uses OpenCV and a homebrewed image segmentation algorithm to track the user’s eye position. The system is configured to stop the wheelchair when the user looks forward or up, while looking down commands the chair forward, and glancing left or right steers it in the corresponding direction.
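The original script isn’t reproduced in the write-up, but the general idea is easy to sketch. The snippet below is only illustrative – it uses OpenCV’s stock Haar eye cascade and a plain threshold segmentation instead of [Dhruv]’s homebrewed algorithm, and the threshold values are made up – yet it shows how a pupil position can be turned into chair commands:

```python
# Illustrative sketch only, not [Dhruv Batra]'s actual code.
# Uses OpenCV's bundled Haar cascade; all thresholds are guesses.
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def gaze_command(frame):
    """Return 'STOP', 'FORWARD', 'LEFT' or 'RIGHT' for one webcam frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, 1.3, 5)
    if len(eyes) == 0:
        return "STOP"                      # no eye found: fail safe
    x, y, w, h = eyes[0]
    roi = gray[y:y + h, x:x + w]
    # Dark pixels approximate the pupil; segment and take the centroid.
    _, mask = cv2.threshold(roi, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return "STOP"
    cx = m["m10"] / m["m00"] / w           # pupil position, 0..1 across the eye
    cy = m["m01"] / m["m00"] / h
    # Left/right sense may need swapping depending on camera mirroring.
    if cx < 0.35:
        return "LEFT"
    if cx > 0.65:
        return "RIGHT"
    if cy > 0.65:                          # looking down drives forward
        return "FORWARD"
    return "STOP"                          # looking ahead or up stops the chair

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    print(gaze_command(frame))
```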

The Python script then sends the requisite commands over a TCP connection to an ESP32, which drives a set of servos that move the wheelchair’s joystick accordingly. This means the device can be retrofitted to a conventional wheelchair without any invasive modifications.
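The wire protocol between the script and the ESP32 isn’t spelled out in the project summary, so the following is purely an assumption – a short ASCII command per TCP connection, with a placeholder IP address and port – but it shows the general shape of the Python side of that link:

```python
# Hypothetical command link to the ESP32 over TCP; the real project's
# message format, IP address, and port are assumptions.
import socket

ESP32_ADDR = ("192.168.1.50", 3333)   # placeholder address and port

def send_command(cmd: str) -> None:
    """Send a single newline-terminated command such as 'FORWARD' or 'STOP'."""
    with socket.create_connection(ESP32_ADDR, timeout=1.0) as sock:
        sock.sendall((cmd + "\n").encode("ascii"))

send_command("FORWARD")   # the ESP32 side would translate this into servo angles
send_command("STOP")
```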

It’s a neat idea, though it could likely benefit from some further development. A reverse feature would be particularly important, after all. However, it’s a great project that has likely taught [Dhruv] many important lessons about human-machine interfaces, particularly those beyond the ones we use every day. 

This project has a good lineage as well — a similar project, Eyedrivomatic, won the Hackaday Prize back in 2015.

Industrial Robot Gets Open-Source Upgrade

Industrial robots are shockingly expensive when new, typically only affordable for those running factories of some sort. Once they’ve gone through their life cycle building widgets, they can be purchased for little more than scrap value, which is essentially free compared to their original sticker price. [Excessive Overkill] explains all of this in a video where he tries to revive one he picked up at exactly that stage, and along the way shows how to get some more life out of these robots, provided you can spend some time hunting for spare parts, install some open-source firmware, and have the space for a machine that weighs well over a thousand kilograms.

This specific robot is a Fanuc R2000ia with six degrees of freedom and a reach of over two meters. Originally the plan was to patch together a system that could send modern G-code to the Fanuc controller, but this was eventually scrapped when [Excessive Overkill] realized the controller that shipped with this robot was for an entirely different machine and would never work. Attempts to find upgraded firmware were frustrated, and after a few other false starts, a solution was found to get the robot working again using LinuxCNC and Mesa FPGA cards, which have built-in support for Fanuc devices like this.
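One nice side effect of the LinuxCNC route is scripting. The snippet below isn’t from [Excessive Overkill]’s setup – it’s just the stock linuxcnc Python module that ships with LinuxCNC, shown with made-up coordinates – but it gives an idea of how a robot running this stack could be commanded programmatically:

```python
# Generic LinuxCNC scripting example (not this specific build's code);
# coordinates below are arbitrary placeholders.
import linuxcnc

c = linuxcnc.command()
s = linuxcnc.stat()

# Take the machine out of e-stop, turn it on, and switch to MDI mode.
c.state(linuxcnc.STATE_ESTOP_RESET)
c.state(linuxcnc.STATE_ON)
c.mode(linuxcnc.MODE_MDI)
c.wait_complete()

# Issue a move as plain G-code and wait for it to finish.
c.mdi("G0 X100 Y0 Z500")
c.wait_complete()

# Read back the current machine position.
s.poll()
print("current position:", s.actual_position)
```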

More after the break…

Continue reading “Industrial Robot Gets Open-Source Upgrade”

A Guard Bot For Your Home Assistant

While fixed sensors, relays, and cameras can be helpful for monitoring your home, there are still plenty of scenarios where you need to physically go and check on something. Unfortunately, that’s often exactly when you’re away from home. To address this challenge, [PriceLessToolkit] created a guardian bot that can be controlled through Home Assistant.

The robot’s body is made from 3D printed components designed to house the various modules neatly. The ESP32 camera module provides WiFi and video capabilities, while the Arduino Pro Mini serves as the bot’s controller. Other peripherals include a light and radar sensor, an LED ring for status display, and a speaker for issuing warnings to potential intruders. The motor controllers are salvaged from two 9-gram servos. The onboard LiPo battery can be charged wirelessly with an integrated charging coil and controller by driving the bot onto a 3D printed dock.

This build is impressive in its design and execution, especially considering how messy it can get when multiple discrete modules are wired together. The rotating caster wheels made from bearings add an elegant touch.

If you’re interested in building your own guard bot, you can find the software, CAD models, and schematics on GitHub. If you’re looking to add other gadgets to your Home Assistant setup, we’ve seen it connect to boilers, blinds, beds and 433 MHz sensors.

Continue reading “A Guard Bot For Your Home Assistant”


Artemis II Will Phone Home From The Moon Using Laser Beams

[NASA] Astronauts will be testing the Orion Artemis II Optical Communications System (O2O) to transmit live, 4K ultra-high-definition video back to Earth from the Moon. The system will also support communication of images, voice, control channels, and enhanced science data.

Aboard Orion, the space terminal includes an optical module, a modem, and a control system. The optical module features a four-inch telescope on a dual gimbal mount. The modem modulates digital information onto laser beams for transmission back to Earth, and demodulates data from laser beams received from Earth. The control system interfaces with avionics systems aboard Orion to control and point the communications telescope.

On Earth, facilities including the Jet Propulsion Laboratory and the White Sands Complex will maintain high-bandwidth optical communication links with Orion. Information received from Orion will be relayed to mission operations, scientists, and researchers.

NASA’s Laser Communications Relay Demonstration (LCRD) showcases the benefits of optical communications.  Traditionally, missions relied upon radio communication, but improved technology will better serve space missions that generate and collect ever-increasing quantities of data. Optical communication solutions can provide 10 to 100 times the bandwidth of radio frequency systems. Other improvements may include increased link distances, higher efficiency, reduced interference, improved security, and reductions in size and weight. Our Brief History of Optical Communication outlines many of these advantages.

Continue reading “Artemis II Will Phone Home From The Moon Using Laser Beams”