Apple Vision Pro’s Secret To Smooth Visuals? Subtly Substandard Optics

The displays inside the Apple Vision Pro have 3660 × 3200 pixels per eye, but veteran engineer [Karl Guttag]’s analysis of its subtly blurred optics reminds us that pixel count doesn’t always translate to resolution, and that this is especially true for things like near-eye displays.

The Apple Vision Pro lacks the usual visual artifacts (like the screen door effect) which result from viewing magnified pixelated screens through optics. But [Karl] shows how this effect is in fact hiding in plain sight: Apple seems to have simply made everything just a wee bit blurry thanks to subtly out-of-focus lenses.

The thing is, this approach of intentional de-focusing actually works very well for consuming visual content like movies or looking at pictures, where detail and pixel-to-pixel contrast are limited anyway.

Clever loophole, or specification shenanigans? You be the judge of that, but it really is evidence that, especially when it comes to things like VR headsets, everything is a trade-off: improving one thing typically worsens others. In fact, it’s one of the reasons why VR monitor replacements are a nontrivial challenge.
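The screen-door effect and its defocus “fix” are easy to simulate. Below is a toy numpy sketch — the grid dimensions and blur kernel width are made up for illustration, not Apple’s actual optics — that builds a magnified pixel grid and washes it out with a crude box blur standing in for an out-of-focus lens:

```python
import numpy as np

def make_pixel_grid(n_px=8, scale=10, gap=2):
    """Simulate a magnified display: bright sub-squares separated by
    dark gaps (the 'screen door'). All dimensions are arbitrary."""
    img = np.zeros((n_px * scale, n_px * scale))
    for i in range(n_px):
        for j in range(n_px):
            img[i * scale:(i + 1) * scale - gap,
                j * scale:(j + 1) * scale - gap] = 1.0
    return img

def box_blur(img, k=7):
    """Crude defocus: separable box blur of width k, a stand-in for
    the lens point-spread function."""
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    out = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)
    return out

sharp = make_pixel_grid()
soft = box_blur(sharp)

# Grid contrast: brightest minus darkest value along a horizontal
# line through the middle of the image.
row = sharp.shape[0] // 2
c_sharp = sharp[row].max() - sharp[row].min()
c_soft = soft[row].max() - soft[row].min()
print(f"grid contrast, in focus:  {c_sharp:.2f}")
print(f"grid contrast, defocused: {c_soft:.2f}")
```

Because the blur kernel is wider than the inter-pixel gap, the dark grid lines can no longer reach full black: the screen door vanishes, at the cost of a little edge sharpness everywhere.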

32 thoughts on “Apple Vision Pro’s Secret To Smooth Visuals? Subtly Substandard Optics”

  1. I wonder if that blur is why some users are reporting the Apple Vision Pro is more tiring on the eyes than their usual VR setup. It would make some sense to me, with the eye trying to focus on things that should have a hard edge but don’t. But I’m not enough of an expert on the squashy biology bits to know.

    1. I remember as a kid I was told not to sit too close to the TV because it would hurt my eyes… So Apple sticks a screen to your face and it hurts your eyes… Shocker… Think different

      1. Assuming you’re not yet another troll…
        The difference here is that the Apple is reportedly nicer-looking but much harder on the eyes than other HMDs, at least for some folks – and these are folks who can and have worn other HMDs for many hours without really noticing them beyond their physical weight, sweat collecting, etc.

      1. Shhhh. They’ll come up with a specific anecdotal use case of a pre-civil-war PC running a mail server via steam-fusion to prove it’s possible for hardware to be used for longer than Apple’s typical hardware lifespan.

  2. >detail and pixel-to-pixel contrast

    A single pixel by itself is not “detail” and has no contrast. The actual “resolution” in terms of drawing distinct points is always less than the density of pixels. The square edges of pixels are just artifacts that create an illusion of “sharpness”, much like the hiss of a vinyl record creates the illusion of clarity or whatever audiophile adjective you use.

    When you blur out the edges of pixels, that IS the actual information content of the picture. If it looks fuzzy, that’s because there isn’t any more information to be seen.
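That point can be made concrete: reconstructing the same low-resolution samples with hard “pixel” edges (nearest-neighbour upscaling) or with smoothing (linear interpolation) preserves exactly the same information, namely the samples themselves. A small numpy sketch — the sample values here are arbitrary:

```python
import numpy as np

# A low-resolution "display" row: 8 brightness samples (arbitrary values).
samples = np.array([0.1, 0.9, 0.3, 0.3, 0.8, 0.2, 0.6, 0.4])
scale = 10

# Nearest-neighbour upscaling: hard, square 'pixel' edges.
nearest = np.repeat(samples, scale)

# Linear interpolation: the same samples with the edges smoothed away.
x_hi = np.arange(len(samples) * scale) / scale
linear = np.interp(x_hi, np.arange(len(samples)), samples)

# Both reconstructions pass through identical sample values: the hard
# edges carry no information beyond the 8 samples themselves.
recovered_n = nearest[::scale]
recovered_l = linear[::scale]
```

Downsampling either version recovers the original eight values exactly; the square edges only added spurious high-frequency content, not detail.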

      1. While the pixels are small enough to actually display the intended information, having the grid visible if you look for it won’t be a real problem, I’d think. It certainly isn’t much of an issue on the original Vive (I own one) with its 9 pixels per degree (probably generous there) and really, really easy-to-spot pixel grid.

        Folks are good at interpreting incomplete pictures, and with ever-moving eyes and head, and lots of frames a second from those different perspectives, the details that are not exactly crisply rendered or visible in a single still shot are ‘there’ enough – good enough for flight simulators etc., though obviously not for replacing a normal monitor. So even at that low quality the effect is good enough that the artifacts of such a low-quality HMD don’t exactly get lost, but cease to really matter as much – at least while the content you are trying to display is actually in a reasonable range for that HMD.

        So if, as I suggest above, the slight blur is why some folks are finding the AVP more fatiguing on their eyes, I’d think the compromise of being more able to see the pixel grid would be worth it for some folks at least.

        1. There is no advantage in seeing the grid, because any content, be it video or computer graphics, needs to be anti-aliased to avoid other, worse artifacts, and that reduces the actual information density to such a degree that seeing a single pixel is no longer relevant.

          The “sharpness” that comes from seeing pixel edges is an illusion – it’s visual noise. If you want that, you could add a diffusing layer à la old Super-IPS monitors that creates these tiny “sparkles” that look like there’s barely visible detail there. The advantage there is that the pattern is random, so you avoid the screen-door effect.

          1. This has nothing to do with the information density, and very little to do with the content displayed, really. It is pure speculation on my part, beyond my experience, that having the pixel grid visible doesn’t actually matter much to VR in general: rendering so many frames from different perspectives as you move means your brain stitches the details together rather well, painting a clearer picture than any single still frame of the render can. Which is how that original Vive, with its rather paltry pixel count spread across a pretty large FOV, can actually be used for anything.

            But as many VR users have reportedly found the Apple causes more eyestrain than other VR sets, I am theorizing it may be caused by this ‘blur’ upsetting the eyes as they forever try to refocus but can’t find the right position. In which case, having the pixels more clearly visible by removing this deliberate blur may allow the eye to settle at the right focal length and just stay there, and so be less fatiguing to wear. A tradeoff of the ‘prettiness’ of the display for less eyestrain, so you can actually use the headset for more than an hour or two, would for most folks be worth it, I’d suggest.

          2. Take a mosquito net and wrap it in a circle around your head, then look at stuff. You can ignore it for as long as you keep moving your head, but when you stop it becomes painfully apparent that you’re looking at the world through a grid.

          3. You don’t have to really ‘move’ move for it to be fine for most VR content, though – the natural turning of your head even the slightest amount to view anything more than a few degrees off dead center, plus the active intent to interact with the content rather than just looking for the limits of the HMD, go a long way to making the visible pixels irrelevant.

            Obviously that only goes so far; it’s not going to turn a really low resolution into something fantastic for reading the sharpest details. But with the pretty good resolution of this AVP and many of the better current HMDs, being able to see the pixels isn’t going to be a big problem at all, I’d suggest. If you can look past it quite easily on the earliest HMDs, as long as the content isn’t so small it doesn’t really render, you can do it on the new ones with a much denser pixel grid. And if seeing the pixels properly fixes the apparent eyestrain problem some big VR users are finding in the AVP, that is worth it IMO – there is no point to a display that objectively looks nicer if you can’t actually use it.

    1. And optical low-pass filters are the anti-aliasing put into most cameras to prevent this effect, but you can get some cameras which don’t have one, so real-world comparison images should be out there.

  3. I don’t believe it’s the lenses that are out of “focus”, because the eye can vary its “focus” too, and the effect would disappear at some offset level and feel like strain when they naturally try to compensate. I use “focus” in quotes because the eyes and the headset lenses are a continuous optical path, and therefore they don’t have separate foci.

    Also, this effect isn’t new for Apple…

    Years ago, before Apple had its “retina” display, I wondered why their displays tended to look smooth and relatively detailed compared to other similarly detailed displays. I looked closely at the screen and noticed that the pixel edges were blurred out… There must have been a diffuser layer applied in such a way as to provide just enough, but not too much, blurring, and it was pleasing to look at. The pixels in a jagged line could just barely be made out, but otherwise it had just the right amount of blurring that all the detail was still displayed.
    I’m pretty sure Apple is still doing that, and that’s what’s visible in this headset, because it makes more sense than doing it with lenses.
    The only way defocusing via lenses makes sense, is if the image is projected onto a screen, and that screen is what the eyes are focused on, no matter what is done in the optical path prior. As above, I think a few of us have done that with a projector to good effect.

    1. Just to add, because I lost the chance to edit the above: I noticed that many color CRT displays were pleasing to look at when their focus spot was just larger than a pixel, so the scan lines and pixels were difficult to make out, but not so large that pixels blended into each other two pixels away.

    2. > the pixel edges were blurred out

      That was common for Super-IPS monitors back around 2004. Cheaper monitors didn’t have it. I had one HP monitor, and still use an old Dell with that pattern, while my newer HP doesn’t have it – mainly because it makes the monitor dimmer and loses contrast.

  4. I was aware of this effect when contemplating watching video with close head optics, which I tried back in ’77: I used 8×40 extra-wide-field binoculars on a mic boom stand, in a recliner, with a 19″ color TV across the room. No wonder Apple wants your eye prescription when you get one.
