Apple Vision Pro’s Secret To Smooth Visuals? Subtly Substandard Optics

The displays inside the Apple Vision Pro have 3660 × 3200 pixels per eye, but veteran engineer [Karl Guttag]’s analysis of its subtly blurred optics reminds us that a pixel count on a spec sheet doesn’t always translate to perceived resolution, and that this is especially true for things like near-eye displays.

The Apple Vision Pro lacks the usual visual artifacts (like the screen door effect) that result from viewing magnified pixelated screens through optics. But [Karl] shows how this effect is in fact hiding in plain sight: Apple seems to have simply made everything just a wee bit blurry thanks to subtly out-of-focus lenses.

The thing is, this approach of intentional defocus actually works very well for consuming visual content like movies or pictures, where detail and pixel-to-pixel contrast are limited anyway.
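To get a feel for why a touch of blur hides the screen-door effect, here’s a toy pure-Python sketch. A box blur stands in for the defocused lens (real optics are continuous, of course, but the effect on pixel-to-pixel contrast is the same in spirit): a grid of lit pixel apertures separated by dark gaps loses most of its harsh local contrast after one blur pass.

```python
def box_blur(img, radius=1):
    """Naive box blur with edge clamping: each output pixel is the
    mean of its (2*radius + 1)^2 neighborhood."""
    h, w = len(img), len(img[0])
    clamp = lambda v, hi: max(0, min(v, hi - 1))
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            acc = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    acc += img[clamp(y + dy, h)][clamp(x + dx, w)]
            row.append(acc / (2 * radius + 1) ** 2)
        out.append(row)
    return out

# Toy "screen door": lit pixel apertures separated by dark gaps.
display = [[1.0 if y % 2 == 0 and x % 2 == 0 else 0.0 for x in range(9)]
           for y in range(9)]
blurred = box_blur(display)

# Peak-to-peak contrast collapses after the blur pass.
contrast = lambda img: max(map(max, img)) - min(map(min, img))
print(contrast(display))            # 1.0
print(round(contrast(blurred), 3))  # 0.333
```

The sharp 1.0 swing between aperture and gap drops to a third of that, which is the screen door fading from view; the price, as [Karl] notes, is that every fine detail in the content pays the same tax.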

Clever loophole, or specification shenanigans? You be the judge of that, but this really is evidence that, especially when it comes to things like VR headsets, everything is a trade-off. Improving one thing typically worsens others. In fact, that’s one of the reasons why VR monitor replacements are such a nontrivial challenge.

An Optical Computer Architecture

We always hear that future computers will use optical technology, but what will that look like for a general-purpose computer? German researchers explore the question in a recent scientific paper. Earlier optical machines like the DOC-II did their processing optically but still relied on some conventional electronics; the question is how to construct a general computer that uses only optical technology.

The paper outlines “Miller’s criteria” for practical optical logic gates. In particular, any optical scheme must provide outputs suitable for driving another gate’s inputs and must support fan-out of one output to multiple inputs. It is also desirable that each stage does not propagate signal degradation and isolates its outputs from its inputs. The final two criteria note that practical systems can’t depend on loss for information representation, since loss isn’t consistent across paths, and, similarly, that the gates should not require high-precision adjustment to work correctly.

The paper also tackles many misconceptions about new computing devices. For example, the authors point out that while today’s general-purpose desktop-class CPUs contain billions of devices, use data paths of at least 32 bits, and include RAM, none of that is necessarily true for CPUs built on a different technology. If that seems hard to believe, they make their case throughout the paper. We can’t remember the last scientific paper we read that literally posed the question, “Will it run Doom?” But this one actually proposes it as a canonical test.

Continue reading “An Optical Computer Architecture”

This Unique Flip-Flop Uses Chemistry And Lasers

One of the first logic circuits most of us learn about is the humble flip-flop. They’re easy enough to build with just a couple of NOR or NAND gates, and even building one up from discrete components isn’t too much of a chore. But building a flip-flop from chemicals and lasers is another thing entirely.
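For reference, the classic two-NOR-gate version is just a feedback loop: each NOR’s output drives the other’s input. A quick Python model (our own sketch, purely to show the logic) iterates that loop until it settles:

```python
def sr_latch(s, r, q, q_bar):
    """One settle pass of a cross-coupled NOR latch.
    Each NOR output feeds the other gate's second input."""
    nor = lambda a, b: int(not (a or b))
    for _ in range(4):  # iterate until the feedback loop stabilizes
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

q, q_bar = 0, 1
q, q_bar = sr_latch(1, 0, q, q_bar)  # set
print(q, q_bar)                      # 1 0
q, q_bar = sr_latch(0, 0, q, q_bar)  # hold: the state is remembered
print(q, q_bar)                      # 1 0
q, q_bar = sr_latch(0, 1, q, q_bar)  # reset
print(q, q_bar)                      # 0 1
```

The “hold” case is the whole point: with both inputs low, the feedback keeps the last state, which is exactly the memory behavior the chemical version below reproduces with light.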

That’s the path [Markus Bindhammer] took for his photochromic molecular switch. We suspect this is less of an attempt at a practical optical logic component and more of a demonstration project, but either way, it’s pretty cool. Photochromism is the property by which molecules reversibly rearrange themselves and change color upon exposure to light, the most common example being glass that darkens automatically in the sun. This principle can be used to create an optical flip-flop, which [Markus] refers to as an “RS” type but we’re pretty sure he means “SR.”

The electronics for this are pretty simple, with two laser modules and their drivers, a power supply, and an Arduino to run everything. The optics are straightforward as well — a beam splitter that directs the beams from each laser onto the target, which is a glass cuvette filled with a clear epoxy resin mixed with a photochromic chemical. [Markus] chose spiropyran as the pigment, which when bathed in UV light undergoes an intramolecular carbon-oxygen bond breakage that turns it into the dark blue pigment merocyanine. Hitting the spot with a red laser or heating the cuvette causes the C-O bond to reform, fading the blue spot.
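The chemistry maps neatly onto a set/reset element. Here’s a toy Python model (our own abstraction, not anything from [Markus]’s build) with UV as set, and red light or heat as reset:

```python
class PhotochromicCell:
    """Toy model of the spiropyran/merocyanine switch as an SR element:
    UV (set) breaks the C-O bond to form blue merocyanine; red light or
    heat (reset) reforms the bond and clears the spot."""

    def __init__(self):
        self.state = "spiropyran"  # colorless = logic 0

    def expose(self, uv=False, red=False, heat=False):
        if uv and not (red or heat):
            self.state = "merocyanine"  # blue = logic 1
        elif red or heat:
            self.state = "spiropyran"
        return self.state == "merocyanine"  # the Q output

cell = PhotochromicCell()
print(cell.expose(uv=True))   # True  (set: UV writes the blue spot)
print(cell.expose())          # True  (hold, assuming the cuvette stays cool)
print(cell.expose(red=True))  # False (reset: red laser clears it)
```

The hold case is the flip-flop behavior, and it’s also the weak spot: as the video shows, ambient heat acts as a slow, unrequested reset.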

The video below shows the intensely blue spot developing under UV light and rapidly fading at nothing more than ambient temperature. To make the effect last longer, [Markus] cools the target with a spritz from a CO2 cartridge. We imagine other photochromic chemicals could also be employed here, as could some kind of photometric sensor to read the current state of the flip-flop. Even as it is, though, this is an interesting way to put chemistry and optics to work.

Continue reading “This Unique Flip-Flop Uses Chemistry And Lasers”

Seeing Fireworks In A Different Light

If you’re worried that [Roman Dvořák]’s spectroscopic analysis of fireworks is going to ruin New Year’s Eve or the Fourth of July, relax — the science of this build only adds to the fun.

Not that there’s nothing to worry about with fireworks, of course; there are plenty of nasty chemicals in there, and we can say from first-hand experience that getting hit in the face and chest with shrapnel from a shell is an unpleasant experience. [Roman]’s goal with this experiment is pretty simple: to see if it’s possible to cobble together a spectrograph to identify the elements that light up the sky during a pyrotechnic display. The camera rig was mainly assembled from readily available gear, including a Chronos monochrome high-speed camera and a 500-mm telescopic lens. A 100 line/mm diffraction grating went between the lens and the camera, a finder scope was added for aiming, and the whole thing sat on a sturdy tripod.

From a perch above Prague on New Year’s Eve, [Roman] collected a ton of images in RAW12 format. A Python script converted the files to TIFFs, and FFmpeg assembled them into video. Frames with good spectra were selected for analysis in a Jupyter Notebook, where slider controls move a cursor across the image and convert pixel positions into wavelengths.
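That pixel-to-wavelength step boils down to a linear calibration: with a simple transmission grating, dispersion is close to linear over a modest wavelength span, so two known spectral lines are enough to pin down the mapping. Here’s a minimal sketch; the reference lines and pixel numbers are made-up values for illustration, not [Roman]’s calibration data:

```python
def wavelength_map(px1, wl1, px2, wl2):
    """Linear pixel-to-wavelength calibration from two known spectral lines.
    Returns a function mapping any pixel column to a wavelength in nm."""
    slope = (wl2 - wl1) / (px2 - px1)  # nm per pixel
    return lambda px: wl1 + slope * (px - px1)

# Hypothetical calibration: say a 532 nm reference line lands on pixel
# column 410 and the 589 nm sodium doublet on column 524.
to_nm = wavelength_map(410, 532.0, 524, 589.0)
print(to_nm(410))  # 532.0
print(to_nm(467))  # 560.5 (halfway between the two reference pixels)
```

With the mapping in hand, identifying an element is a matter of matching measured peak wavelengths against known emission lines, which is exactly the reference-library step still on [Roman]’s list.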

There are some optical improvements [Roman] would like to make, especially in aiming and focusing the camera; as he says, the dynamic and unpredictable nature of fireworks makes them difficult to photograph. As for identifying elements in the spectra, that’s on hold until he can find a reference library of spectra to compare against. Or, there’s always DIY Raman spectroscopy.

Continue reading “Seeing Fireworks In A Different Light”

Tattoo-Removal Laser Brought Out Of Retirement For A Megawatt Of Fun

We’ve got to say that [Les Wright] has the most fun on the internet, at least in terms of megawatts per dollar. Just look at his new video where he turns a $30 eBay tattoo-removal laser into a benchtop beast.

The junk laser in question is a neodymium:YAG pulse laser that has clearly seen better days, both externally and internally. The original pistol-grip enclosure was essentially falling apart, but it was superfluous to [Les]’ plans for the laser anyway. Things were better inside the business end of the gun, at least in terms of having all the pieces in place, but the teardown still revealed issues. Chief among these was the gunk and grunge that had accumulated on the laser rod and the flash tube — [Les] blames this on the previous owner’s use of tap water for cooling rather than deionized water. It was nothing a little elbow grease couldn’t take care of, though, especially since the rest of the laser bits seemed in good shape, including the chromium:YAG Q-switch, which holds off lasing so the medium can store up a huge amount of energy before releasing it in one gigantic pulse.

Cleaned up and with a few special modifications of his own, including a custom high-voltage power supply, [Les]’ laser was ready for tests. The results are impressive; peak optical power is just over a megawatt, which is enough power to have some real fun. We’ll be keen to see what he does with this laser — maybe blasting apart a CCD camera?

Continue reading “Tattoo-Removal Laser Brought Out Of Retirement For A Megawatt Of Fun”

Making Your Own VR Headset? Consider This DIY Lens Design

Lenses are a necessary part of any head-mounted display, but unfortunately, they aren’t always easy to source. Taking them out of an existing headset is one option, but one may wish for a more customized approach, and that’s where [WalkerDev]’s homebrewed “pancake” lenses might come in handy.

Engineering is all about trade-offs, and that’s especially true in VR headset design. Pancake lenses are compact units that rely on polarization to bounce light around internally, resulting in a very compact assembly at the cost of relatively poor light efficiency. That compactness is what [WalkerDev] found attractive, and in the process discovered that stacking two different Fresnel lenses and putting them in a 3D printed housing yielded a very compact pancake-like unit that gave encouraging results.

This project is still in development, and while the original lens assembly is detailed in this build log, there are some potential improvements to be made, so stay tuned if you’re interested in using this design. A DIY headset doesn’t mean you must also DIY the lenses entirely from scratch, and this option seems economical enough to be worth following up.

Want to experiment with mixing and matching optics on your own? Not only has [WalkerDev]’s project shown that off-the-shelf Fresnel lenses can be put to use, but the death of phone-based VR is, in a way, good news. Google shipped over 10 million Cardboard headsets and Gear VR sold over 5 million units, which means there are a whole lot of lenses in unused headsets lying around, waiting to be harvested and repurposed.

Souped-Up Reflective Sensor Uses Itself For Wireless Programming

Proximity sensors are common enough in automation projects that we hardly give them a second thought — pick something with specs that match the job and move on. But they can be fussy to get adjusted just right, a job made more difficult if they’re located in some out-of-the-way corner.

But where there’s a challenge, there’s also an opportunity, as [Ido Gendel] shows us with this remote-controlled proximity sensor. The story behind this clever little hack starts with an off-the-shelf sensor, the kind with an IR LED and a phototransistor pointed in the same direction, which gives a digital output when the light bounced back into the phototransistor exceeds a certain threshold. Setting that threshold was giving [Ido]’s client trouble, so [Ido] decided to build a programmable drop-in replacement to make the job easier.

The first try used an OPB732 reflective sensor and an ATtiny202 microcontroller, with three pads on the PCB for programming. That still required physical contact, though, so [Ido] had the idea of using the sensor itself for wireless IR programming. Version two switched the microcontroller to an ATtiny212, and a couple of components were added to control the power of the LED so the sensor could do double duty. A programmer built around the same sensor and a USB-to-UART adapter completes the system, allowing the sensor threshold to be set just by shining the programmer in its general direction from up to 25 cm away.
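Conceptually, the sensor wears two hats: in normal operation the phototransistor reading is compared against a stored threshold, and in programming mode the same phototransistor receives a new threshold value blinked at it by the programmer. This toy Python model (our own framing, not [Ido]’s firmware or protocol) shows the two roles, with the programmed value arriving as a stream of bits:

```python
class ReflectiveSensor:
    """Toy model of a reflective sensor with a remotely programmable
    threshold. Real hardware would receive the bits as IR pulses on
    the same phototransistor used for sensing."""

    def __init__(self, threshold=128):
        self.threshold = threshold

    def output(self, reflected_level):
        """Normal operation: digital out goes high above the threshold."""
        return reflected_level > self.threshold

    def program(self, bits):
        """Programming mode: assemble a new 8-bit threshold, MSB first."""
        value = 0
        for bit in bits:
            value = (value << 1) | bit
        self.threshold = value

sensor = ReflectiveSensor()
print(sensor.output(200))                 # True: 200 > default 128
sensor.program([1, 1, 0, 0, 1, 0, 0, 0])  # 0b11001000 = 200
print(sensor.output(200))                 # False: 200 no longer exceeds 200
```

The elegant part of the real design is that no extra receiver is needed; the sensing element and the programming receiver are the very same part.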

We think that getting multiple uses from a single sensor is pretty clever, so hats off for this one. It’s not the first time we’ve featured one of [Ido]’s projects, but it’s been quite a while — this one-clock-cycle-a-day Shabbat clock was the most recent, but you can clearly see the roots of the sensor project in this mouse pointer data encoder that goes all the way back to 2015.