How’s eyestrain with AR?
There isn't any mainstream AR product to judge against because it's a much more challenging technology. Proper AR keeps the real world unobstructed and overlays virtual objects on it; HoloLens and Magic Leap are the closest currently available examples. I don't consider camera passthrough, like what the Quest Pro will have, to be the same thing. Eyestrain will likely be lower in better AR for two reasons. One, for most experiences you would simply be seeing the real world with regular vision, so no adjustment is required. Secondly, unlike VR, which is effectively two close-up screens to focus on, current AR innovation involves clear, layered reflective lenses that orient the individual light rays to match the paths they would take to your eye if the object actually existed in that 3D space. So instead of a close image that your brain can be convinced is distant, the light itself hits the retina at the proper angle to register as actually being at that distance. Presumably this would be less strenuous on the eyes and on visual processing, but it's still experimental.
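To put a rough number on that focal-distance point: much of the strain in VR comes from a mismatch between where the eyes converge (set by stereo disparity) and where they focus (fixed at the headset's optical focal plane), while the ray-matching approach described above would keep the two cues in agreement. The sketch below is a simplified illustration, not data from any real headset; the 63 mm IPD and 2 m focal plane are assumptions picked for the example.

```python
import math

IPD_M = 0.063           # assumed interpupillary distance (~63 mm is a common average)
VR_FOCAL_PLANE_M = 2.0  # assumed fixed focal distance of a VR lens/panel stack

def vergence_deg(distance_m: float) -> float:
    """Angle the two eyes converge by to fixate an object at distance_m."""
    return math.degrees(2 * math.atan(IPD_M / (2 * distance_m)))

def accommodation_diopters(distance_m: float) -> float:
    """Focus demand on the eye's lens for an object at distance_m."""
    return 1.0 / distance_m

for virtual_distance in (0.5, 1.0, 4.0, 10.0):
    vergence = vergence_deg(virtual_distance)            # driven by stereo disparity
    vr_focus = accommodation_diopters(VR_FOCAL_PLANE_M)  # stuck at the panel's focal plane
    ar_focus = accommodation_diopters(virtual_distance)  # ideal ray-matched AR: tracks the object
    print(f"object at {virtual_distance:4.1f} m | vergence {vergence:5.2f} deg | "
          f"VR focus {vr_focus:.2f} D vs ideal AR focus {ar_focus:.2f} D "
          f"(mismatch {abs(vr_focus - ar_focus):.2f} D)")
```

In this simple model the mismatch grows as virtual objects get closer than the fixed focal plane, which lines up with the "two close-up screens to focus on" complaint above.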
Depends on the tech. A lot of AR involves putting a camera on VR goggles, and piping the digital image onto VR screens. So while you may be looking at the real world, you’re looking at a low-res, pixelated, fixed-focal-distance, no-peripheral-vision, sweaty, god-rays version of it.
There are versions of AR that function more like a heads-up display. I can't speak from personal experience, but my understanding is that they still have issues:
https://arstechnica.com/gadgets/2022/10/microsoft-mixed-reality-headsets-nauseate-soldiers-in-us-army-testing/