Eye vs. camera – Michael Mauser


Watch the center of this disk.
You are getting sleepy.
No, just kidding.
I’m not going to hypnotize you.
But are you starting to see colors in the rings?
If so, your eyes are playing tricks on you.
The disk was only ever black and white.
You see, your eyes don’t always capture the world as a video camera would.
In fact, there are quite a few differences,
owing to the anatomy of your eye
and the processing that takes place in your brain
and its outgrowth, the retina.
Let’s start with some similarities.
Both have lenses to focus light and sensors to capture it,
but even those things behave differently.
The lens in a camera moves to stay focused on an object hurtling towards it,
while the one in your eye responds by changing shape.
Most camera lenses are also achromatic,
meaning they focus both red and blue light to the same point.
Your eye is different.
When red light from an object is in focus, the blue light is out of focus.
So why don’t things look partially out of focus all the time?
To answer that question,
we first need to look at how your eye and the camera capture light:
photoreceptors.
The light-sensitive surface in a camera only has one kind of photoreceptor
that is evenly distributed throughout the focusing surface.
An array of red, green and blue filters on top of these photoreceptors
causes them to respond selectively to long, medium and short wavelength light.
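To make that filter-array idea concrete, here is a minimal Python sketch (a hypothetical illustration using NumPy; the RGGB layout and the bayer_mosaic name are my assumptions, not part of the lesson) of how such a mosaic samples a scene, so each sensor records only the wavelength band its filter passes:

    import numpy as np

    def bayer_mosaic(rgb):
        """Simulate a camera sensor: each pixel keeps only the channel
        its color filter passes, in a repeating 2x2 RGGB pattern."""
        h, w, _ = rgb.shape
        mosaic = np.zeros((h, w))
        mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red filters
        mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green filters
        mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green filters
        mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue filters
        return mosaic  # full color is interpolated back later ("demosaicing")

    # Example: sample a random 4x4 "scene"
    scene = np.random.rand(4, 4, 3)
    print(bayer_mosaic(scene))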
Your eyes’ retinas, on the other hand, have several types of photoreceptors,
usually three for normal light conditions, and only one type for low light,
which is why we’re color-blind in the dark.
In normal light, unlike the camera, we have no need for a color filter
because our photoreceptors already respond selectively
to different wavelengths of light.
Also in contrast to a camera,
your photoreceptors are unevenly distributed,
with no receptors for dim light in the very center.
This is why faint stars seem to disappear when you look directly at them.
The center also has very few receptors that can detect blue light,
which is why you don’t notice the blurred blue image from earlier.
However, you still perceive blue there
because your brain fills it in from context.
Also, the edges of our retinas have relatively few receptors
for any wavelength of light.
So our visual acuity and ability to see color
fall off rapidly from the center of our vision.
There is also an area in our eyes called the blind spot
where there are no photoreceptors of any kind.
We don’t notice a lack of vision there
because once again, our brain fills in the gaps.
In a very real sense, we see with our brains, not our eyes.
And because our brains, including the retinas,
are so involved in the process,
we are susceptible to visual illusions.
Here’s another illusion caused by the eye itself.
Does the center of this image look like it’s jittering around?
That’s because your eye actually jiggles most of the time.
If it didn’t, your vision would eventually shut down
because the nerves on the retina stop responding to a stationary image
of constant intensity.
And unlike a camera,
you briefly stop seeing whenever you make a larger movement with your eyes.
That’s why you can’t see your own eyes shift
as you look from one to the other in a mirror.
Video cameras can capture details our eyes miss,
magnify distant objects
and accurately record what they see.
But our eyes are remarkably efficient adaptations,
the result of hundreds of millions of years
of coevolution with our brains.
And so what if we don’t always see the world exactly as it is?
There’s a certain joy to be found watching stationary leaves
waving on an illusory breeze,
and maybe even an evolutionary advantage.
But that’s a lesson for another day.