Making An Arduino Ventilator? Read This First

Thanks to the virus crisis, lots of people are designing makeshift ventilators in the hope of saving lives. Many of these are based around some sort of Arduino-powered controller. [Armstrong Subero] thinks that’s a great idea, but cautions that making an electronic pair of dice is a different proposition than creating a machine to breathe for someone. He isn’t just complaining, though. He talks through the considerations that go into building a real-time, safety-critical system.

[Armstrong] has a lot of good points, although we aren’t sure you need the complexity of a real-time operating system just to squeeze a bag. If anything, that seems like it might make the system more susceptible to unexpected behavior. However, we agree with his comments that you should have closed-loop control to make sure the device is actually working, alarms for when it isn’t, and watchdog timers to guard against lockup.
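To make that concrete, here is a minimal sketch (not from [Armstrong]’s write-up) of what a hardware watchdog wrapped around a closed-loop cycle might look like on an AVR-based Arduino. The control and sensing steps are left as comments, since they depend entirely on the hardware:

```c
#include <avr/wdt.h>   /* avr-libc watchdog API (AVR-based Arduinos) */

void setup() {
  wdt_disable();            /* make sure a stale watchdog can't bite during init */
  /* ... configure the motor driver, pressure sensor, and alarm output here ... */
  wdt_enable(WDTO_250MS);   /* hardware reset if the loop ever stalls for 250 ms */
}

void loop() {
  /* 1. advance the bag-squeezing mechanism one step                  */
  /* 2. closed-loop check: confirm the pressure/flow actually changed */
  /* 3. drive the alarm output if that check fails                    */
  wdt_reset();              /* "pat" the watchdog only after a healthy cycle */
}
```

The important detail is that the watchdog is only reset after the checks pass; if the firmware hangs anywhere in the cycle, the hardware forces a restart instead of silently doing nothing.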

Continue reading “Making An Arduino Ventilator? Read This First”

Tricking The Brain Into Seeing Boosted Contrast In Stereo Imagery

Last year a team of researchers published a paper detailing a method of boosting visual contrast and image quality in stereoscopic displays. The method is called Dichoptic Contrast Enhancement (DiCE) and works by showing each eye a slightly different version of an image, tricking the brain into fusing the two views together in a way that boosts perceived image quality. This only works on stereoscopic displays like VR headsets, but it’s computationally simple and easily implemented. The trick could be used to offset some of the limitations of the displays used in headsets, for example making them appear capable of deeper contrast than they can physically deliver. That’s good, because higher contrast is generally perceived as more realistic and three-dimensional, both important factors for VR headsets and other stereoscopic displays.

Stereoscopic vision works by the brain fusing together what each eye sees, a process called binocular fusion. The small differences between the two views mostly convey a sense of depth, but DiCE exploits some quirks of binocular fusion to trick the brain into perceiving enhanced contrast in the visuals. That perceived higher contrast in turn leads to a stronger sense of depth and better overall image quality.

Example of DiCE-processed images, showing each eye a different dynamic contrast range. The result is greater perceived contrast and image quality when the brain fuses the two together.

To pull off this trick, DiCE displays a different contrast level to each eye in a way designed to encourage the brain to fuse them together in a positive way. In short, using a separate and different dynamic contrast range for each eye yields an overall greater perceived contrast range in the fused image. That’s simple in theory, but in practice there were a number of problems to solve. Chief among them was the fact that if the difference between what each eye sees is too great, the result is discomfort due to binocular rivalry. The hard scientific work behind DiCE came from experimentally determining the sweet spots, and pre-computing filters independent of viewer and content so the technique can be applied in real time with consistent results.
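As a rough illustration of the idea (and emphatically not the paper’s tuned filters), you could imagine pushing each eye’s luminance through a different tone curve, one lifting shadows and one deepening them, so the fused percept spans a wider apparent range:

```c
#include <math.h>
#include <stdio.h>

/* Remap a normalized luminance value (0..1) with a simple gamma curve. */
static float tone_curve(float lum, float gamma)
{
    return powf(lum, gamma);
}

/* Produce a left/right luminance pair for one pixel. The gamma values are
 * made-up placeholders, not DiCE's experimentally determined parameters. */
static void dice_pixel(float lum, float *left, float *right)
{
    *left  = tone_curve(lum, 0.80f);  /* lifts shadows, compresses highlights */
    *right = tone_curve(lum, 1.25f);  /* deepens shadows, stretches highlights */
}

int main(void)
{
    float l, r;
    dice_pixel(0.35f, &l, &r);
    printf("left %.3f, right %.3f\n", l, r);
    return 0;
}
```

The real filters have to stay inside the comfort zone described above, which is exactly the part that required the experimental work.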

Things like this are reminders that we experience the world only through the filter of our senses, and our perception of reality has quirks that can be demonstrated by things like this project and other “sensory fusion” edge cases like the Thermal Grill Illusion, which we saw used as the basis for a replica of the Pain Box from Dune.

A short video overview of the method is embedded below, and a PDF of the publication can be downloaded for further reading. Want a more hands-on approach? The team has even made a DiCE plugin freely available in the Unity Asset Store.

Continue reading “Tricking The Brain Into Seeing Boosted Contrast In Stereo Imagery”

Zephyr Adds Features, Platforms, And Windows

Zephyr is an open source real-time operating system (RTOS) that appeared on the scene a few years ago with support for just a handful of boards. The new 1.11 release adds a lot of features, a lot of new boards, and a Windows development environment. But don’t worry: the environment is portable, so it still runs on Linux and Mac as well.

The OS has support for many ARM and x86 boards. It also supports the ESP32 and NIOS II, and can even target Linux, which is useful for debugging or studying execution with desktop tools.
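If you haven’t seen Zephyr code before, an application is not far removed from a bare-metal superloop. A minimal example along the lines of the project’s hello-world sample (API names as they stood in the 1.x releases, so treat this as a sketch) looks like this:

```c
#include <zephyr.h>
#include <misc/printk.h>

void main(void)
{
    while (1) {
        printk("Hello from %s\n", CONFIG_BOARD);
        k_sleep(1000);   /* in the 1.x API, k_sleep() takes milliseconds */
    }
}
```

The same source builds for any supported board, including the Linux target, which is what makes the desktop-debugging trick possible.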

Continue reading “Zephyr Adds Features, Platforms, And Windows”

Get Inside A TCXO Clock Chip

[Pete] wondered how real-time clock modules could be selling on eBay for $1.50 when the main component, the Maxim DS3231 RTC/TCXO chip, cost him more like $4 apiece. Could the cheap modules contain counterfeit chips?

Well, sure they could. But in this case, they didn’t, and [Pete] has the die shots to prove it. He started off by clipping the SOIC leads rather than desoldering — he’s not going to be reusing this chip after he’s cut it in half. Next was a stage of embrittling the case by heating it up with a lighter and dunking it in water. Then he went at it with sandpaper.

It’s cool. You can see the watch crystal inside, and all of the circuitry. The DS3231 includes a TCXO — a temperature-compensated crystal oscillator — and it seems to have a bank of capacitors that it connects and disconnects depending on the chip’s temperature to keep the oscillator running at the right speed. [Pete] used one in an offline situation, and it only lost sixteen seconds over a year (roughly 0.5 ppm), so we’d say that they work fine.
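Incidentally, the chip reports that internal temperature measurement over I2C, so you can peek at the compensation input yourself. A rough sketch for a Raspberry Pi (bus number assumed, error handling kept minimal) might look like this:

```c
#include <stdio.h>
#include <stdint.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/i2c-dev.h>

int main(void)
{
    int fd = open("/dev/i2c-1", O_RDWR);             /* assumes the Pi's default I2C bus */
    if (fd < 0 || ioctl(fd, I2C_SLAVE, 0x68) < 0) {  /* DS3231 answers at address 0x68  */
        perror("i2c setup");
        return 1;
    }

    uint8_t reg = 0x11;                              /* temperature registers: 0x11/0x12 */
    uint8_t buf[2];
    if (write(fd, &reg, 1) != 1 || read(fd, buf, 2) != 2) {
        perror("i2c transfer");
        return 1;
    }

    /* MSB is a signed whole-degree value; the top two bits of the LSB add 0.25 C steps. */
    float temp = (int8_t)buf[0] + (buf[1] >> 6) * 0.25f;
    printf("DS3231 die temperature: %.2f C\n", temp);
    close(fd);
    return 0;
}
```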

If you’d like to know more about how crystals are used to keep time, check out [Jenny]’s excellent article. And if sixteen seconds per year is way too much for you, tune up your rubidium standard and welcome to the world of the time nuts.

Real-Time Planet Tracker With Laser-Point Accuracy

Space. The final frontier. Unfortunately, the vast majority of us are planet-locked until further notice. If you are a dedicated hobbyist astronomer, you probably already have the rough positions of the planets memorized. But what if you want to know them exactly, from the comfort of your room, and educate yourself at the same time? [Shubham Paul] has gone the extra parsec to build a Real-Time Planet Tracker that calculates their locations using Kepler’s Laws with exacting precision.
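The heart of that math is solving Kepler’s equation for the eccentric anomaly, then converting it into a heliocentric distance and true anomaly. A bare-bones sketch of that step (with placeholder orbital elements, not the project’s actual code) looks something like this:

```c
#include <math.h>
#include <stdio.h>

/* Solve Kepler's equation M = E - e*sin(E) by Newton-Raphson iteration. */
static double eccentric_anomaly(double mean_anomaly, double e)
{
    double E = mean_anomaly;                  /* decent starting guess for small e */
    for (int i = 0; i < 20; i++) {
        double dE = (E - e * sin(E) - mean_anomaly) / (1.0 - e * cos(E));
        E -= dE;
        if (fabs(dE) < 1e-10)
            break;
    }
    return E;
}

int main(void)
{
    double a = 1.524;      /* semi-major axis in AU (placeholder, roughly Mars) */
    double e = 0.0934;     /* orbital eccentricity (placeholder, roughly Mars)  */
    double M = 1.0;        /* mean anomaly in radians at the chosen epoch       */

    double E  = eccentric_anomaly(M, e);
    double nu = 2.0 * atan2(sqrt(1.0 + e) * sin(E / 2.0),
                            sqrt(1.0 - e) * cos(E / 2.0));   /* true anomaly     */
    double r  = a * (1.0 - e * cos(E));                      /* Sun distance, AU */

    printf("true anomaly %.4f rad, distance %.4f AU\n", nu, r);
    return 0;
}
```

Do the same for the Earth, subtract the two position vectors, and after converting to local horizon coordinates using the GPS fix you have the pan and tilt angles.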

An Arduino Mega provides the brains, while 3.5-turn pan and 180-degree tilt servos are the brawn. A potentiometer and switch allow for planet and mode selection, while a GPS module and an optional MPU9250 gyroscope/magnetometer let it know where you are. Finally, a laser pointer shows the planet’s location in a closed room. And then there’s code: a lot of code.

The hardware side of things — as [Shubham Paul] clarifies — looks a little unfinished, because the focus of the project is the software and its intent to instruct. They have included all the code written for the RTPT, with a breakdown of each section for those looking to build their own.

Continue reading “Real-Time Planet Tracker With Laser-Point Accuracy”

Retrotechtacular: The Early Days Of CGI

We all know what Computer-Generated Imagery (CGI) is nowadays. It’s almost impossible to get away from it in any television show or movie. It’s gotten so good that sometimes it can be difficult to tell the difference between the real world and the computer-generated world when they are mixed together on screen. Of course, it wasn’t always like this. This 1982 clip from BBC’s Tomorrow’s World shows what the wonders of CGI were capable of in a simpler time.

In the earliest days of CGI, digital computers weren’t even really a thing. [John Whitney] was an American animator who is widely considered to be the father of computer animation. In the 1940s, he and his brother [James] started to experiment with what they called “abstract animation”. They pieced together old analog computers and servos to make their own devices capable of controlling the motion of lights and lit objects. While this process may be a far cry from the CGI of today, it is still animation performed by a computer. One of [Whitney’s] best-known works is the opening title sequence to [Alfred Hitchcock’s] 1958 film, Vertigo.

Later, in 1973, Westworld became the very first feature film to use CGI. The film was a science fiction western-thriller about amusement park robots that go haywire. The studio wanted footage of the robot’s “computer vision”, but they would need an expert to get the job done right. They ultimately hired [John Whitney’s] son, [John Whitney Jr], to lead the project. The process first required color-separating each frame of the 70mm film, because [John Jr] did not have a color scanner. He then used a computer to digitally modify each image to create what we would now recognize as a “pixelated” effect. The computer processing took approximately eight hours for every ten seconds of footage.

Continue reading “Retrotechtacular: The Early Days Of CGI”
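The effect itself is conceptually simple, even if it chewed up hours of 1973-era computer time: average the image over large square blocks and paint each block with its average. A modern sketch of the idea (assuming an 8-bit grayscale buffer) might be:

```c
#include <stdint.h>
#include <stddef.h>

/* Pixelate an 8-bit grayscale image in place by block-averaging. */
void pixelate(uint8_t *img, int width, int height, int block)
{
    for (int by = 0; by < height; by += block) {
        for (int bx = 0; bx < width; bx += block) {
            unsigned sum = 0, count = 0;
            /* first pass: average every pixel inside the block */
            for (int y = by; y < by + block && y < height; y++)
                for (int x = bx; x < bx + block && x < width; x++) {
                    sum += img[(size_t)y * width + x];
                    count++;
                }
            uint8_t avg = (uint8_t)(sum / count);
            /* second pass: paint the whole block with that average */
            for (int y = by; y < by + block && y < height; y++)
                for (int x = bx; x < bx + block && x < width; x++)
                    img[(size_t)y * width + x] = avg;
        }
    }
}
```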

A Tutorial On Using Linux For Real-Time Tasks

[Andreas] has created this tutorial on real-time (RT) tasks in Linux. At first blush that sounds like a rather dry topic, but [Andreas] makes things interesting by giving us some real-world demos using a Raspberry Pi and a stepper motor. Driving a stepper motor requires relatively accurate timing, and attempting a task like that on a desktop operating system is generally ill-advised; accurate timing is best left to a separate microcontroller. This is why we often see the Raspi paired with an Arduino here on Hackaday, though the rationale behind it is not often explained.

[Andreas] connects a common low-cost 28BYJ-48 geared stepper motor with a ULN2003 driver board to a Raspberry Pi’s GPIO pins. These motors originally saw use moving the louvers of air conditioners. In general, they get the job done, but aren’t exactly high quality. [Andreas] uses a simple program to pulse the pins in the correct order to spin the motor. Using an oscilloscope, a split screen display, and a camera on the stepper motor, [Andreas] walks us through several common timing hazards, and how to avoid them.
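The stepping itself boils down to walking the four coil outputs through a fixed sequence with a delay between steps. A rough sketch of that inner loop (with a placeholder gpio_write() standing in for whatever GPIO library you prefer, and made-up pin numbers) looks like this:

```c
#include <time.h>

static const int coil_pins[4] = { 17, 18, 27, 22 };  /* placeholder BCM pin numbers */

/* 28BYJ-48 half-step sequence: energize the coils in this 8-phase pattern. */
static const int half_step[8][4] = {
    {1,0,0,0}, {1,1,0,0}, {0,1,0,0}, {0,1,1,0},
    {0,0,1,0}, {0,0,1,1}, {0,0,0,1}, {1,0,0,1},
};

void gpio_write(int pin, int value);   /* placeholder: provided by your GPIO library */

void spin(int steps, long delay_us)    /* delay_us must stay under 1,000,000 here */
{
    struct timespec ts = { 0, delay_us * 1000L };
    for (int s = 0; s < steps; s++) {
        const int *phase = half_step[s % 8];
        for (int c = 0; c < 4; c++)
            gpio_write(coil_pins[c], phase[c]);
        nanosleep(&ts, NULL);          /* the delay whose jitter the demos expose */
    }
}
```

Any jitter in that sleep shows up directly as jitter in the step timing, which is exactly what the oscilloscope traces in the video make visible.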

The most telling hazard is shown last. While running his stepper program, [Andreas] runs a second program which allocates lots of memory. Eventually, Linux swaps out the stepper program’s memory, causing the stepper motor to stop spinning for a couple of seconds. All is not lost though, as the swapping can be prevented with an mlockall() call.
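For reference, here’s a minimal sketch of that fix (the SCHED_FIFO part is a common companion trick for soft real-time Linux code, not something taken from the video):

```c
#include <stdio.h>
#include <sched.h>
#include <sys/mman.h>

int main(void)
{
    /* Lock everything mapped now and everything mapped in the future,
     * so the kernel can never swap this process out mid-step. */
    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) {
        perror("mlockall");
        return 1;
    }

    /* Optional: ask for a FIFO real-time priority so ordinary processes
     * can't preempt the stepping loop. Needs root or an rtprio limit. */
    struct sched_param sp = { .sched_priority = 50 };
    if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
        perror("sched_setscheduler");

    /* ... the stepper loop would run here ... */
    return 0;
}
```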

The takeaway from this is that Linux is not a hard real-time operating system. With a few tricks and extensions, it can handle some soft real-time tasks. The best solution is to either use an operating system designed for real-time operation, or offload the real-time work to a separate controller.

Continue reading “A Tutorial On Using Linux For Real-Time Tasks”