Here’s some fascinating work presented at SIGGRAPH 2023: a method for radiance field rendering using a novel technique called Gaussian Splatting. What’s that mean? It means synthesizing a 3D scene from 2D images, in high quality and in real time, as the short animation above shows.
Neural Radiance Fields (NeRFs) are a method of leveraging machine learning to do, in a way, what photogrammetry does: synthesize complex scenes and views based on input images. But NeRFs work in a fraction of the time, and require only a fraction of the source material. There are different ways to go about this, and unsurprisingly there tends to be a clear speed vs. quality tradeoff. But as the video accompanying this new work seems to show, clever techniques can deliver the best of both worlds.
A short video summary is embedded just below the page break. Interested in deeper details? The research PDF is here. The pace of development this field has seen is nothing short of staggering, and the results are certainly higher in quality than what was state-of-the-art for NeRFs only a year ago.
When debugging something as involved as kernel scheduler timings, you would typically reach for one of the software-based debugging mechanisms available. However, when software runs close to bare metal, you don’t always need to. Instead, you can output a signal to a GPIO, then use a logic analyzer or a scope to measure signal change timing – which is what [Albert David] did when evaluating the Linux kernel’s PREEMPT_RT realtime patches.
When you reach for a realtime kernel, latency is what you care about – realtime means that for everything you do, you need to get a response within a certain (hopefully very short) interval. [Albert] wrote a program that reads a changing GPIO input and immediately writes the new state back out, then scoped both signals to figure out the latency of the real-time patched kernel as it processes the writes. Overlaying all the incoming and outgoing signals on the same scope screen, you can quickly determine just how suitable a scheduler is when it comes to getting acceptable response times. [Albert] also provides a ready-to-go BeagleBone image you can use for your own experiments, or, say, in an educational environment.
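To get a feel for what such a loopback test looks like in code, here’s a minimal sketch of the idea – an illustration, not [Albert]’s actual program – written against the libgpiod v1 userspace API, with made-up line offsets and a real-time priority picked purely for demonstration:

```c
/*
 * Minimal GPIO "echo" latency test sketch (an illustration, not [Albert]'s
 * actual program). Assumes the libgpiod v1 userspace API; the chip path,
 * line offsets, and priority below are placeholders. Build with -lgpiod.
 */
#include <gpiod.h>
#include <sched.h>
#include <stdio.h>
#include <sys/mman.h>

#define IN_LINE   17   /* input line toggled by the signal generator (assumption) */
#define OUT_LINE  27   /* output line probed by the scope (assumption) */

int main(void)
{
    /* Run at a high real-time priority and lock memory so scheduling and
     * page faults don't add avoidable latency on a PREEMPT_RT kernel. */
    struct sched_param sp = { .sched_priority = 80 };
    sched_setscheduler(0, SCHED_FIFO, &sp);
    mlockall(MCL_CURRENT | MCL_FUTURE);

    struct gpiod_chip *chip = gpiod_chip_open("/dev/gpiochip0");
    if (!chip) { perror("gpiod_chip_open"); return 1; }

    struct gpiod_line *in  = gpiod_chip_get_line(chip, IN_LINE);
    struct gpiod_line *out = gpiod_chip_get_line(chip, OUT_LINE);
    gpiod_line_request_both_edges_events(in, "latency-test");
    gpiod_line_request_output(out, "latency-test", 0);

    struct timespec timeout = { 10, 0 };
    for (;;) {
        struct gpiod_line_event ev;
        /* Block until the input toggles, then immediately mirror the edge on
         * the output; the scope measures the delay between the two edges. */
        if (gpiod_line_event_wait(in, &timeout) > 0 &&
            gpiod_line_event_read(in, &ev) == 0)
            gpiod_line_set_value(out,
                ev.event_type == GPIOD_LINE_EVENT_RISING_EDGE ? 1 : 0);
    }
}
```

The interesting part isn’t the GPIO plumbing but what happens around it: the gap between the input edge and the echoed output edge is everything the kernel does before your userspace thread gets to run again.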
What could you use this for? A lot of hobbyists use realtime kernels on Linux when building CNC machine controllers and robots, where things like motor control put tight constraints on how quickly a decision in your software is translated into real-world consequences. If this sounds up your alley, check out this Linux real-time task tutorial from [Andreas]. If things get way too intense for a multi-tasking system like Linux, you might want to use an RTOS to begin with, and we have a guide on that for you, too.
[Armstrong] has a lot of good points, although we aren’t sure you need the complexity of a real-time operating system just to squeeze a bag. If anything, that seems like it might make it more susceptible to unexpected behavior. However, we agree with his comments that you should have closed-loop control to make sure the device is working, an alarm for when it isn’t, and watchdog timers to guard against lockup.
Last year a team of researchers published a paper detailing a method of boosting visual contrast and image quality in stereoscopic displays. The method is called Dichoptic Contrast Enhancement (DiCE) and works by showing each eye a slightly different version of an image, tricking the brain into fusing the two views together in a way that boosts perceived image quality. This only works on stereoscopic displays like VR headsets, but it’s computationally simple and easily implemented. This trick could be used to offset some of the limitations of displays used in headsets, for example making them appear capable of deeper contrast levels than they can physically deliver. This is good, because higher contrast is generally perceived as more realistic and three-dimensional, both important factors in VR headsets and other stereoscopic displays.
Stereoscopic vision works by having the brain fuse together what both eyes see, a process called binocular fusion. The small differences between what each eye sees mostly convey a sense of depth to us, but DiCE uses some of the quirks of binocular fusion to trick the brain into perceiving enhanced contrast in the visuals. That perceived higher contrast in turn leads to a stronger sense of depth and better overall image quality.
To pull off this trick, DiCE displays a different contrast level to each eye in a way designed to encourage the brain to fuse them together positively. In short, using a separate and different dynamic contrast range for each eye yields an overall greater perceived contrast range in the fused image. That’s simple in theory, but in practice there were a number of problems to solve. Chief among them was the fact that if the difference between what each eye sees is too great, the result is discomfort due to binocular rivalry. The hard scientific work behind DiCE came from experimentally determining the sweet spots, and pre-computing filters independent of viewer and content so that they could be applied in real time for a consistent result.
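For a rough feel of the core idea, here’s a minimal sketch of a per-eye contrast split – arbitrary gain values chosen for illustration, not the viewer- and content-independent filters the researchers actually pre-computed:

```c
/*
 * Minimal sketch of a DiCE-style per-eye contrast split.
 * Works on normalized [0,1] luminance buffers; the gain values are
 * arbitrary illustrations, not the filters from the paper.
 */
#include <stddef.h>

/* Remap a luminance value around mid-grey with a contrast gain:
 * gain > 1 stretches contrast, gain < 1 compresses it. */
static float remap(float y, float gain)
{
    float v = 0.5f + (y - 0.5f) * gain;
    return v < 0.0f ? 0.0f : (v > 1.0f ? 1.0f : v);
}

/* Render the same frame twice: a stretched-contrast view for one eye and a
 * compressed one for the other. The gains are kept close enough together
 * that binocular fusion wins out over binocular rivalry. */
void dice_split(const float *src, float *left, float *right, size_t n)
{
    const float hi = 1.3f, lo = 0.7f;   /* illustrative, not the paper's tuned values */
    for (size_t i = 0; i < n; i++) {
        left[i]  = remap(src[i], hi);
        right[i] = remap(src[i], lo);
    }
}
```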
Things like this are reminders that we experience the world only through the filter of our senses, and our perception of reality has quirks that can be demonstrated by things like this project and other “sensory fusion” edge cases like the Thermal Grill Illusion, which we saw used as the basis for a replica of the Pain Box from Dune.
Zephyr is an open source real-time operating system (RTOS) that appeared on the scene a few years ago with support for just a handful of boards. The new 1.11 release adds a lot of features, a lot of new boards, and a Windows development environment. But don’t worry: the environment is portable, so it still runs on Linux and Mac as well.
The OS has support for many ARM and x86 boards. It also supports the ESP32 and NIOS II, and can even target Linux, which is useful for debugging or studying execution using desktop tools.
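If you haven’t touched Zephyr before, an application is about as small as embedded code gets. Here’s a minimal sketch along the lines of the project’s own hello-world sample (1.x-era headers shown); build it for any supported board, or for the Linux target if you just want to poke at the OS from your desktop:

```c
/* A minimal Zephyr application in the style of the project's hello_world
 * sample (1.x-era headers). CONFIG_BOARD is filled in by the build system. */
#include <zephyr.h>
#include <misc/printk.h>

void main(void)
{
    printk("Hello World from %s\n", CONFIG_BOARD);
}
```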
[Pete] wondered how real-time clock modules could be selling on eBay for $1.50 when the main component, the Maxim DS3231 RTC/TCXO chip, cost him more like $4 apiece. Could the cheap modules contain counterfeit chips?
Well, sure they could. But in this case, they didn’t, and [Pete] has the die shots to prove it. He started off by clipping the SOIC leads rather than desoldering — he’s not going to be reusing this chip after he’s cut it in half. Next was a stage of embrittling the case by heating it up with a lighter and dunking it in water. Then he went at it with sandpaper.
It’s cool. You can see the watch crystal inside, and all of the circuitry. The DS3231 includes a TCXO — a temperature-compensated crystal oscillator — and it seems to have a bank of capacitors that it connects and disconnects depending on the chip’s temperature to keep the oscillator running at the right speed. [Pete] used one in an offline situation, and it only lost sixteen seconds over a year, so we’d say that they work fine.
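For scale: a year is roughly 31.5 million seconds, so sixteen seconds of drift works out to about 0.5 ppm – comfortably inside the ±2 ppm the DS3231 datasheet promises over 0 °C to 40 °C.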
Space. The final frontier. Unfortunately, the vast majority of us are planet-locked until further notice. If you are a dedicated hobbyist astronomer, you probably already have the rough positions of the planets memorized. But what if you want to know them exactly, from the comfort of your room, and educate yourself at the same time? [Shubham Paul] has gone the extra parsec to build a Real-Time Planet Tracker that calculates their locations using Kepler’s Laws with exacting precision.
An Arduino Mega provides the brains, while 3.5-turn pan and 180-degree tilt servos are the brawn. A potentiometer and switch allow for planet and mode selection, while a GPS module and an optional MPU9250 gyroscope/magnetometer let it know where you are. Finally, a laser pointer shows the planet’s location in a closed room. And then there’s code: a lot of code.
The hardware side of things — as [Shubham Paul] clarifies — looks a little unfinished because the focus of the project is the software, which is intended to instruct. They have included all the code they wrote for the RTPT, with a breakdown of each section for those looking to build their own.
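If you want a taste of the math before diving into the repository, the heart of any Keplerian tracker is solving Kepler’s equation for the eccentric anomaly. Here’s a standalone sketch of that step – the orbital elements are rough placeholders, not values lifted from [Shubham Paul]’s code:

```c
/*
 * Standalone sketch of the Kepler step: solve Kepler's equation
 * M = E - e*sin(E) for the eccentric anomaly E by Newton iteration,
 * then convert E to the true anomaly. Orbital elements below are
 * rough placeholders, not values taken from the RTPT source.
 * Build with -lm.
 */
#include <math.h>
#include <stdio.h>

static double solve_kepler(double M, double e)
{
    double E = M;                              /* decent first guess for small e */
    for (int i = 0; i < 20; i++) {
        double d = (E - e * sin(E) - M) / (1.0 - e * cos(E));
        E -= d;                                /* Newton-Raphson update */
        if (fabs(d) < 1e-10)
            break;
    }
    return E;
}

int main(void)
{
    double M = 1.0;        /* mean anomaly in radians (placeholder) */
    double e = 0.0934;     /* eccentricity, roughly Mars */
    double E = solve_kepler(M, e);
    /* True anomaly: the planet's actual angular position along its orbit. */
    double nu = 2.0 * atan2(sqrt(1.0 + e) * sin(E / 2.0),
                            sqrt(1.0 - e) * cos(E / 2.0));
    printf("E = %.6f rad, true anomaly = %.6f rad\n", E, nu);
    return 0;
}
```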