Maurice Brings Immersive Audio Recording To The Masses

Immersive audio is the new hotness in the recording world. Once upon a time, mono was good enough. Then someone realized humans have two ears, and everyone wanted stereo. For most of us, that’s where it stopped, but audio connoisseurs kept going into increasingly baroque surround-sound setups, ending in Immersive Audio: sound that is meant to fully reproduce the three-dimensional soundscape of the world around us. [DJJules] is one of those audio connoisseurs, and to share the joy of immersive audio recording with the rest of us, he’s developed Maurice, a compact, low-cost immersive microphone.

Maurice is, technically speaking, a symmetrical ORTF3D microphone array. ORTF is not a descriptive acronym; it stands for Office de Radiodiffusion Télévision Française, the fine people who developed this type of microphone setup for stereo use. The typical stereo ORTF setup places two cardioid microphones 17 cm apart, angled 110 degrees from one another. Maurice arrays four such pairs, all oriented vertically and facing 90 degrees from one another, for fully immersive, eight-channel sound. Together, those microphones capture sound omnidirectionally while giving good separation between the channels for later reproduction. The mountings are all 3D printed, and [DJJules] kindly provides STLs.
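For the geometrically inclined, here is a rough sketch of the capsule layout that description implies. It’s our reading of the article, not [DJJules]’s design files, with C++ serving as convenient pseudocode for the arithmetic:

```cpp
// A rough geometric sketch of the Maurice capsule layout as described
// above: four ORTF pairs (110 degrees between capsules, 17 cm apart),
// one pair facing each compass direction, with the pairs oriented
// vertically. Our reading of the article, not [DJJules]'s files.
#include <cstdio>

int main() {
    const double kPairAngleDeg = 110.0; // ORTF angle between the two capsules
    const double kSpacingCm    = 17.0;  // ORTF capsule spacing
    for (int pair = 0; pair < 4; ++pair) {
        double azimuthDeg = 90.0 * pair; // pairs face 0, 90, 180, 270 degrees
        // With the pair axis vertical, each capsule tilts half the ORTF
        // angle above or below the horizontal plane.
        std::printf("pair %d: azimuth %5.1f deg, capsules at %+5.1f/%+5.1f deg elevation, %.0f cm apart\n",
                    pair, azimuthDeg, kPairAngleDeg / 2, -kPairAngleDeg / 2, kSpacingCm);
    }
    return 0;
}
```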

This is the speaker setup you need to get full use of Maurice’s recordings. Now let’s see Paul Allen’s speakers.

Recording eight audio channels simultaneously is not trivial for the uninitiated, but fortunately, [DJJules] includes a how-to in his post. We particularly like his tip to use resistor color coding to identify the DIN cables for different microphone channels. Playback, too, requires special setup and processing. [DJJules] talks about listening on his 7.1.4 surround setup, which you can find in a companion post. That’s a lot of speakers, as you might imagine.
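The cable trick, by the way, is just the standard resistor color code applied to channel numbers, so channels one through eight get the digit colors brown through gray. A trivial sketch of the mapping, for anyone who doesn’t have the code memorized:

```cpp
// Channel-to-color mapping per the standard resistor color code
// (digit 1 = brown ... digit 8 = gray), handy for telling the eight
// DIN cables apart. The color code is standard; applying it to the
// channels is [DJJules]'s tip.
#include <array>
#include <cstdio>

int main() {
    const std::array<const char*, 8> kChannelColor = {
        "brown", "red", "orange", "yellow", // channels 1-4
        "green", "blue", "violet", "gray",  // channels 5-8
    };
    for (std::size_t i = 0; i < kChannelColor.size(); ++i)
        std::printf("channel %zu -> %s\n", i + 1, kChannelColor[i]);
    return 0;
}
```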

There are high-end headphones that claim to reproduce an immersive sound field as well, but we can’t help but wonder if you’d miss the “true” experience without head tracking. Even with regular department-store headphones, the demo recordings linked via the Instructable sound great, though that probably just reflects the quality of the individual microphones.

Audio can be a make-or-break addition to VR experiences, so that would seem to be an ideal use case for this sort of technology. Maurice isn’t the only way to get there; we previously focused on [DJJules]’s ambisonic microphone, which is another way to reproduce a soundscape. What do you think, is this “immersive audio” the new frontier of Hi-Fi, or do we call it a stereo for a reason? Discuss in the comments!

Theremin-Style MIDI Controller Does It With Lasers

Strictly speaking, a Theremin uses a pair of antennas that act as the plates of variable capacitors in its oscillator circuits. Looking at [aritrakdebnath2003]’s MIDI THEREMIN, we see it works differently, but it does play in the manner of the exotic radio instrument, so we suppose it can use the name.

The MIDI THEREMIN is purely a MIDI controller. It sends note data to a computer or synthesizer, and from there, you can get whatever sound at whatever volume you desire. The device’s brain is an Arduino Uno, and MIDI-out for the Arduino has been a solved problem for a long while now.
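For the curious, classic 5-pin MIDI out really is that simple: a serial port at 31250 baud and three-byte messages. Here’s a minimal sketch of the idea; the note values and timing are ours for illustration, not anything from this particular build:

```cpp
// Minimal Arduino MIDI-out sketch: MIDI is serial at 31250 baud, and a
// note-on message is three bytes (status, note number, velocity).
// The notes played here are illustrative, not from the MIDI THEREMIN.
void noteOn(byte channel, byte note, byte velocity) {
    Serial.write(0x90 | (channel & 0x0F)); // status byte: note-on, channel 0-15
    Serial.write(note & 0x7F);             // note number, e.g. 60 = middle C
    Serial.write(velocity & 0x7F);         // velocity 1-127
}

void setup() {
    Serial.begin(31250); // standard MIDI baud rate
}

void loop() {
    noteOn(0, 60, 100); // middle C on, channel 1
    delay(500);
    noteOn(0, 60, 0);   // velocity 0 doubles as note-off
    delay(500);
}
```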


No Plans For The Weekend? Learn Raytracing!

Weekends can be busy for a lot of us, but sometimes you have one gloriously free and full of possibilities. If that’s you, you might consider taking a gander at [Peter Shirley]’s e-book “Ray Tracing in One Weekend”.

This gradient is the first image that the book talks you through producing. It ends with the spheres.

This is very much a zero-to-hero kind of class: it starts out defining the PPM image format, which is easy to create and manipulate using nearly any language. The book uses C++, but as [Peter] points out in the introduction, you don’t have to follow along in that language; there won’t be anything unique to C++ you couldn’t implement in your language of choice.
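If you want a taste before committing your weekend, here’s a minimal gradient writer in the spirit of the book’s opening program. It’s a sketch, not [Peter]’s exact listing:

```cpp
// A minimal plain-text PPM ("P3") writer: a short header, then one RGB
// triple per pixel. Redirect stdout to a .ppm file and open it in any
// image viewer that understands the format.
#include <iostream>

int main() {
    const int width = 256, height = 256;
    std::cout << "P3\n" << width << ' ' << height << "\n255\n";
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            int r = static_cast<int>(255.999 * x / (width - 1));  // red ramps left to right
            int g = static_cast<int>(255.999 * y / (height - 1)); // green ramps top to bottom
            std::cout << r << ' ' << g << " 0\n";                 // blue held at zero
        }
    }
    return 0;
}
```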

There are many types of ray tracers; technically, what you should end up with by the end of the weekend is a path tracer. You won’t be replacing Blender’s Cycles renderer with your weekend’s work, but you’ll get some nice images and a place to build from. [Peter] manages to cram a lot of topics into a weekend, including diffuse materials, metals, dielectrics with refraction, and camera classes with simple lens effects.

If you find yourself with slightly more time, [Peter] has you covered: he’s also released “Ray Tracing: The Next Week.” If you have a lot more time, then check out his third book, “Ray Tracing: The Rest of Your Life.”

This weekend e-book shows that ray tracing doesn’t have to be the darkest of occult sciences, and it doesn’t need oodles of hardware, either. Even an Arduino can do it.

The Most Personalized Font Is Your Own Handwriting

When making a personal website, one will naturally include a personal touch. What could be more personal than creating a font from your own handwriting? That’s what [Chris Smith] has done, and it looks great on his blog, which also has a post summarizing the process.

Like most of us, [Chris] tried to use open-source toolkits first, but the workflow (and thus the result) was a bit wanting. Still, he details what it takes to create a font in Inkscape or FontForge if anyone else wants to give it a try. Instead, he ended up using a web app called Calligraphr, designed for this exact use case.

Above is handwritten; below is the font. Aside from the lighting, the difference isn’t obvious.

Fair warning: the tool is closed-source, and he needed to pay to get all the features he wanted — specifically ligatures, glyphs made from two joined letters. By adding ligatures, his personalized font gets a little bit of variation, as the ‘l’ in an ‘lf’ ligature (for example) need not be identical to the stand-alone ‘l’. In a case of “you get what you pay for,” the process worked great, and to the credit of the folks at Calligraphr: while it is software-as-a-service, they offer a one-time payment for one month’s use of the “pro” features. While nobody likes SaaS, that’s a much more user-friendly way to do it — or perhaps “least-user-hostile.”

All [Chris] had to do was write out and scan a few sheets, which you can see above, while the software handled most of the hard work automagically. He only had to apply a few tweaks to get the result you see here. Aside from websites, we could see a personalized font like this being a nice touch on laser-cut, CNC, or even 3D-printed projects. If you don’t want a personalized touch, the “Gorton” lettering of retro machinery might be more to your liking.

The Decisioninator Decides Dinner, Saves Marriage

For something non-explosive, this might be the most American project we’ve featured in a while. [Makerinator]’s domestic bliss was apparently threatened by the question “what shall we have for dinner” — that’s probably pretty universal. Deciding that the solution was automation is probably universal to software devs and associated personalities the world over. That the project, aptly called “The Decisioninator,” apes a popular game-show mechanic to randomly select a fast-food restaurant? Only people with 100-octane freedom running through their veins can truly appreciate its genius.

In form factor, it’s a tiny slot machine which [Makerinator] fabbed up on his laser cutter. The lovely “paintjob” was actually a printout with dye-sublimation ink that was transferred to plywood before laser cutting. Mounted to this are illuminated arcade buttons and a small SPI display. The interface is simplicity itself: the big button spins a virtual “wheel” on the display (with sound effects inspired by The Price is Right) to tell the family what deliciously unhealthy slop they’ll be consuming, while the other button changes decision modes. Of course, you can pick more than just dinner with The Decisioninator; you need only decide what spinners to program. Which, uh, might be a problem.

Luckily, [Makerinator] was able to come up with a few modes without recursively creating a Decisioninator-inator. He’s got the whole thing running on a Pi4, which, with its 1980s-supercomputer performance, is hilariously overpowered for the role it plays (in true American fashion). He’s coded it all in the Flame Engine, a game engine built on the Flutter UI toolkit by American technology giant Google.
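If you’re tempted to build your own, the heart of a spinner mode needn’t be anything more exotic than a named list of options and a uniform random pick. A minimal sketch follows; the real build uses Flame on Flutter, and the menu entries here are made up:

```cpp
// A minimal sketch of a Decisioninator-style "mode": a named list of
// options, with a spin as a uniform random pick. Not [Makerinator]'s
// actual code, and the entries below are hypothetical.
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct SpinnerMode {
    std::string name;                 // e.g. "Dinner"
    std::vector<std::string> options; // the wheel's wedges
};

int main() {
    std::vector<SpinnerMode> modes = {
        {"Dinner", {"Tacos", "Burgers", "Pizza", "Fried chicken"}},
        {"Movie night", {"Action", "Comedy", "Documentary"}},
    };
    std::mt19937 rng{std::random_device{}()};
    const SpinnerMode& mode = modes[0]; // the big button spins the current mode
    std::uniform_int_distribution<std::size_t> pick(0, mode.options.size() - 1);
    std::printf("%s: %s\n", mode.name.c_str(), mode.options[pick(rng)].c_str());
    return 0;
}
```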

What’s more American than tech giants and fast food? A propane powered plasma cannon, for one thing; or maybe mental gymnastics to translate into freedom units, for another.

Thanks to [Makerinator] for the tip.

Example thrust sheet rotation using tether control, from Figure 7-8 of the NIAC study. Credit: NASA/James Bickford.

TFINER Is An Atompunk Solar Sail Lookalike

It’s not every day we hear of a new space propulsion method, and rarer still to hear of one that actually seems halfway practical. Yet that’s what we have in the case of TFINER, a proposal by [James A. Bickford] that we found summarized on Centauri Dreams by [Paul Gilster].

TFINER stands for Thin-Film Isotope Nuclear Engine Rocket, and it’s a hoot. The word “rocket” is in the name, so you know there’s got to be some reaction mass, but this thing looks more like a solar sail. The secret is that the “sail” is the rocket: as the name implies, it hosts a thin film of nuclear material whose decay products provide the reaction mass. (In the Phase I study for NASA’s Innovative Advanced Concepts office (NIAC), it’s alpha particles from the decay chain of Thorium-228, or of Radium-228, which feeds it.) Alpha particles go pretty quick (about 5% of c for these isotopes), so the specific impulse on this thing is amazing: 1.81 million seconds!
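That figure passes the sniff test, too. Here’s a quick back-of-the-envelope check, using the roughly 5% of c quoted above (our arithmetic, not numbers from the study):

```cpp
// Back-of-the-envelope specific impulse check, assuming decay products
// leave at roughly 5% of the speed of light, per the article. The study's
// quoted 1.81 million seconds is in the same ballpark.
#include <cstdio>

int main() {
    const double c  = 2.998e8;               // speed of light, m/s
    const double g0 = 9.80665;               // standard gravity, m/s^2
    const double vExhaust   = 0.05 * c;      // ~5% of c
    const double ispSeconds = vExhaust / g0; // Isp = v_e / g0
    std::printf("Isp ~ %.2e s\n", ispSeconds); // ~1.5e6 s
    return 0;
}
```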

One Camera Mule To Rule Them All

A mule isn’t just a four-legged hybrid born of a union betwixt Donkey and Horse; in our circles, it’s much more likely to mean a testbed device you hang various bits of hardware off of in order to evaluate them. [Jenny List]’s 7″ touchscreen camera enclosure is just such a mule.

In this case, the hardware to be evaluated is camera modules — she’s starting out with the official RPi HQ camera, but the modular nature of the construction means it’s easy to swap in other modules for evaluation. The camera modules live on 3D-printed front plates held to the similarly-printed body with self-tapping screws.

Any Pi will do, though depending on the camera module, you may need one of the newer versions. [Jenny] has a Pi4 inside, which ought to handle anything. For control and preview, she’s using an old first-gen 7″ touchscreen from the Raspberry Pi Foundation. Those were nice little screens back in the day, and they still serve well now.

There’s no provision for a battery because [Jenny] doesn’t need one — this isn’t a working camera, after all, it’s just a test mule for the sensors. Having it tethered to a wall wart or power bank is no problem in this application. All files are on GitHub under a CC 4.0 license — not just STLs, either, but proper CAD files that you can actually make your own. (SCAD files in this case, but who doesn’t love OpenSCAD?) That means if you love the look of this thing and want to squeeze in a battery or add a tripod mount, you can! It’s no shock that our own [Jenny List] would follow best practice for open-source hardware, but so few people do that it’s worth calling out when we see it.

Thanks to [Jenny] for the tip, and don’t forget that the tip line is open to everyone, and everyone is equally welcome to toot their own horn.