Whenever I wear polarized lenses I can see patterns in safety glass, and more bands on rainbows than would otherwise be there; most other people I’ve met report the same.
One day, on a long car trip, I was talking to the guy sitting next to me, and he said he could see these things with his eyes uncovered. I haven’t the faintest clue whether this is a hardware or a software difference; either seems feasible.
Related: ever seen Haidinger’s brush?
It’s very cool, but because it’s on the threshold of perception it also takes a good deal of discipline not to fall into an N-ray-style state of mind when trying to view it.
Have you read about Haidinger’s brush?
Maybe the people who can see those things with their eyes uncovered lack stereo vision?
Since I was a child I’ve found that when I close one eye, light sources (against sufficiently dark surroundings) change their appearance, similar to a lens-flare effect. It works with each eye individually, but with both eyes open the artifacts disappear. I always figured these are optical artifacts that the brain identifies as such by comparing the two eyes’ images, and therefore eliminates.
So if someone lacks stereo vision, or has a significant impairment of the stereo vision system, that might explain this polarization phenomenon. Then again, maybe I’m in error and the two phenomena are apples and oranges.
Hm, I don’t think it’s likely a function of basic differences in visual perception. I have normal vision as far as I can tell, but very vivid mental imagery. I also have very vivid dreamscapes, and every dream I have is a new scape; I’ve never had the same one twice. (Unrelatedly or relatedly, I dream A LOT, even when I doze off for 5–10 minutes.) In any case, I can be physically looking at something in the real world while “looking” at something completely different in my mind’s eye, but there is a definite shift in attention that governs how much information I get from either the current sensory input or the mental image.
This is more likely to be caused by a hardware difference than a software difference, but both of these explanations seem really unlikely compared to the theory that this person’s self-report was confused. If, in a controlled experiment, he can reliably differentiate between patterns of light polarization, then I will worry about explaining this.
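For what it’s worth, such a controlled experiment is easy to score: have the subject make a series of forced-choice guesses about polarization orientation and ask how likely their hit count would be under pure guessing. A minimal sketch (the trial counts here are hypothetical, just for illustration):

```python
from math import comb

def p_value(successes: int, trials: int, chance: float = 0.5) -> float:
    """One-sided binomial test: probability of getting at least
    `successes` correct answers in `trials` if the subject is
    guessing at the chance rate."""
    return sum(
        comb(trials, k) * chance**k * (1 - chance) ** (trials - k)
        for k in range(successes, trials + 1)
    )

# Hypothetical example: 18 of 20 correct two-alternative guesses
# about which way a polarizing filter is oriented.
print(round(p_value(18, 20), 6))  # prints 0.000201
```

A result that small would be hard to explain away as a confused self-report; anything near 0.5 would settle the question the other way.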
I would think hardware. Polarization isn’t something you can reconstruct from color alone, but polarization-sensitive vision does occur in nature (in many insects, for example) and so could plausibly have been produced by a mutation.
You’re thinking about this all wrong. It’s biological so the hardware IS the software.
A better question would be: is the difference in the eye or the brain? You could test this by taking blue-sensitive cones from the retinas of people who can and cannot detect Haidinger’s brush and seeing whether they respond differently to changes in polarization.
My understanding is that all humans have the ‘hardware’ to see polarized light, but that most of us filter it out—that is, it is a software issue. However, you could also phrase this as ‘the eyes register the light, but the brain discards the information’.
Anyone can see these things when looking at the blue sky, since its light is partially polarized; the same goes for light reflected off non-metallic surfaces. But most people wouldn’t notice, because the effect is fairly faint. Someone who could genuinely detect polarized light would notice that LCD displays emit polarized light, and could tell you that some are polarized in a different direction than others.
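The LCD test is easy to reason about quantitatively. By Malus’s law, a linear polarizer transmits a fraction cos²θ of linearly polarized light, where θ is the angle between the light’s polarization and the filter’s axis; this is why rotating polarized sunglasses in front of an LCD makes the screen dim and brighten. A rough sketch (the angles are just illustrative):

```python
from math import cos, radians

def transmitted_fraction(theta_deg: float) -> float:
    """Malus's law: fraction of linearly polarized light that passes a
    polarizer whose axis is theta_deg away from the light's polarization."""
    return cos(radians(theta_deg)) ** 2

# Rotating a polarized lens in front of an LCD:
for angle in (0, 45, 90):
    print(angle, round(transmitted_fraction(angle), 3))
# prints:
# 0 1.0
# 45 0.5
# 90 0.0
```

So two LCDs polarized 90° apart would look maximally different through the same lens orientation, which is exactly the discrimination the comment describes.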