Before I read this phrase, I was about to comment something along the lines of “Facebook, and other tech companies, are no strangers to generating galaxy-brained arguments to persuade investors/analyst firms that their company’s future is more valuable than it actually is”. After I read that phrase I completely flipped.
Anyone else with sufficient real-life experience analyzing this industry would also flip when they see a phrase like this one. And Facebook knows this.
Can you say more? Are you saying you're flipped out (presumably an expression of irritation) because it gives perfect eye-fixation data for tracking emotions? Or maybe it's the opposite: I'm personally excited for eye-tracked mouse input, and maybe that's the kind of thing you meant too? I'm not sure which parse to use, heh.
I strongly doubt eye-tracked mouse input is going to be good enough for anyone to want it (unless they have a disability that makes them unable to use their hands). The three reasons for eye tracking are augmenting the inverse optics (chromatic aberration correction in particular works better if you have millimeter-precise information about where on the user's face the headset is sitting), foveated rendering (uncertain value, but in the best case might effectively quadruple your GPU speed), and letting other people in a conversation see your eye movements (very important for getting past the uncanny valley).
Lots of people look at Oculus and tell stories about how this is supposed to support Facebook’s advertising business in some creepy way or another. I think you can refute this just by looking at revenue numbers: the Oculus platform gets a 30% cut of game revenue (similar to Steam), and this is quite a lot of money per user.
chromatic aberration correction in particular works better if you have millimeter-precise information about where on the user’s face the headset is sitting
You have officially blown my mind. I seriously cannot believe that AI can subtly mess with video color in real time based on its known effects on eye movement; that is absolutely nuts, and the applications are limitless in the short and long term. No wonder Apple stock keeps going up; there are probably all sorts of things like that which I'm not aware of.
Before I read this, I thought I was cool for knowing that oscillating adaptive refresh rates could yield known, measurable effects (e.g., while shopping online). That's nothing compared to what chromatic aberration can do. Thank you very much for sharing; my career has benefited profoundly from learning this.
foveated rendering (uncertain value, but in the best case might effectively quadruple your GPU speed)
I’m definitely not an expert in this area, but I can’t imagine this being possible unless the headset was hardwired to a data center or something. Have we really gotten to the point where that much ML can fit on a gaming PC?
Lots of people look at Oculus and tell stories about how this is supposed to support Facebook’s advertising business in some creepy way or another.
Do tell! Although I definitely agree with you that it’s extremely cheap to generate large numbers of totally-false rumors on this specific topic, and extremely expensive to verify most of them.
You have officially blown my mind. I seriously cannot believe that AI can subtly mess with video color in real time based on its known effects on eye movement; that is absolutely nuts, and the applications are limitless in the short and long term. No wonder Apple stock keeps going up; there are probably all sorts of things like that which I'm not aware of.
This isn’t an AI thing at all, it’s an optics thing.
One of the core problems of VR optics is that the panels emit three different wavelengths of light (R, G and B), and these bend differently when they pass through the lenses. If you naively display an image without correcting this, you wind up with red, green, and blue partial images that have separated from each other. In order to fix this problem, you predict the lens effect, apply the opposite effect to the drawn image, and have the distortions cancel. The problem is that the headset’s position on your face is imprecise, and if you shift the headset a millimeter in any direction, the R, G and B images (as perceived by the eye) move in different directions. If you’re trying to display black-on-white or white-on-black text, moving the color channels a pixel apart has a major effect on readability.
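To make the mechanism concrete, here is a minimal sketch of per-channel pre-distortion in Python/NumPy. It is purely illustrative: the radial distortion model, the coefficient values, and the function names are my assumptions, not Oculus's actual pipeline. The key point is that each color channel gets its own inverse warp, and all three warps are centered on the lens position that eye tracking reports each frame.

```python
# Sketch only: a toy radial model where each color channel has its
# own distortion coefficient, and the lens center in screen
# coordinates comes from eye tracking. Not a real headset pipeline.
import numpy as np

# Hypothetical per-channel coefficients: blue bends more than red.
K = {"r": 0.220, "g": 0.230, "b": 0.245}

def predistort(image, lens_center, k):
    """Resample one color channel so the lens's radial distortion
    cancels out. image: (H, W) array; lens_center: (cx, cy) in
    pixels, updated per frame from eye tracking; k: channel coeff."""
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    cx, cy = lens_center
    # Normalize coordinates relative to the (tracked) lens center.
    nx, ny = (xs - cx) / w, (ys - cy) / h
    r2 = nx * nx + ny * ny
    # Inverse mapping: sample farther from the center, so the lens's
    # distortion pulls each wavelength back to the intended spot.
    scale = 1.0 + k * r2
    sx = np.clip(cx + nx * scale * w, 0, w - 1).astype(int)
    sy = np.clip(cy + ny * scale * h, 0, h - 1).astype(int)
    return image[sy, sx]

def correct_frame(rgb, lens_center):
    # Each channel gets its own warp. If lens_center is off by even
    # a millimeter, the three warps disagree and text fringes.
    return np.stack(
        [predistort(rgb[..., i], lens_center, K[c])
         for i, c in enumerate("rgb")], axis=-1)
```

With a tracked lens center the three warps stay aligned; with a fixed, assumed center, a millimeter of headset slippage shifts the perceived R, G, and B images by different amounts, which is exactly the text-fringing failure described above.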
Before I read this, I thought I was cool for knowing that oscillating adaptive refresh rates could yield known, measurable effects (e.g., while shopping online). That's nothing compared to what chromatic aberration can do. Thank you very much for sharing; my career has benefited profoundly from learning this.
This paragraph is profoundly confused in a way that I can’t fathom.
foveated rendering (uncertain value, but in the best case might effectively quadruple your GPU speed)
I’m definitely not an expert in this area, but I can’t imagine this being possible unless the headset was hardwired to a data center or something. Have we really gotten to the point where that much ML can fit on a gaming PC?
Again, nothing to do with ML. Foveated rendering is a fancy way of saying “don’t spend GPU cycles drawing parts of the screen that the user isn’t looking at”. It only works if you have an eye-tracking camera that tells you which part of the screen the user is looking at.
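As a back-of-the-envelope illustration, here is a toy sketch in Python of how a gaze point turns into per-tile shading rates, and where a "quadruple your GPU speed" best case could come from. The tile grid, radii, and rate tiers are invented numbers for illustration, not any shipping foveation scheme.

```python
# Toy sketch of the core idea, not a real foveation API: pick a
# per-tile shading rate from the eye tracker's gaze point, so
# peripheral tiles get a fraction of the shading work.
import math

def shading_rate(tile_center, gaze, fovea_radius=0.1, mid_radius=0.3):
    """Return the fraction of full resolution to shade a tile at.
    tile_center and gaze are normalized (0..1) screen coordinates;
    the radii are illustrative, not tuned values."""
    d = math.dist(tile_center, gaze)
    if d < fovea_radius:
        return 1.0    # full resolution where the user is looking
    if d < mid_radius:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution everywhere else

def frame_cost(tiles, gaze):
    """Relative shading cost vs. rendering everything at full
    resolution; the rate applies per axis, so work scales with
    rate squared."""
    spent = sum(shading_rate(t, gaze) ** 2 for t in tiles)
    return spent / len(tiles)

# Usage: an 8x8 tile grid with the gaze at the center of the screen
# shades at roughly 0.16 of the full-resolution cost.
tiles = [((x + 0.5) / 8, (y + 0.5) / 8)
         for x in range(8) for y in range(8)]
print(frame_cost(tiles, gaze=(0.5, 0.5)))
```

With these made-up tiers the frame costs roughly a sixth of brute-force full resolution; real systems lose some of that to blending, reprojection, and conservative radii, which is why the "effectively quadruple your GPU speed" figure above is a best case rather than a guarantee.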