Slide 30 of 65

Question: During class Keenan asked why objects look smaller when viewed at a distance. I liked one of the arguments made because it appealed to the angle subtended by an object. Could someone elaborate on that here?


1.) From later slides, $v = 1/z$: as $z \to \infty$, $v \to 0$. Here we can view the eye as a pinhole camera.

2.) Coming to the angles: think of the camera center as the center of a circle and the object as a chord of that circle. Many circles can share this chord, but the farther the chord lies from the chosen center, the smaller the angle it subtends. So for centers very far from the object, the angle subtended by the object is small. Meanwhile our pinhole camera (the eye) stays the same size, i.e. the width of the cardboard is unchanged. Constructing a simple right triangle inside the cardboard through the camera center, the length of the image formed is directly proportional to the tangent of this angle, so the image of the object becomes negligible at large distances.
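The argument above can be checked numerically. A minimal sketch, assuming a pinhole camera at the origin with focal length $f$ and an object of height $h$ at distance $z$, so the image height is $fh/z$ (the function name and example numbers are mine):

```python
def image_height(h, z, f=1.0):
    """Image height of an object of height h at distance z from a
    pinhole camera with focal length f: shrinks like 1/z."""
    return f * h / z

# A 10 m tree viewed from 5 m, 50 m, and 500 m: the image height
# drops by a factor of 10 each time the distance grows tenfold.
for z in (5.0, 50.0, 500.0):
    print(f"z = {z:6.1f} m -> image height {image_height(10.0, z):.4f}")
```

As $z \to \infty$ the image height goes to 0, which is exactly the $v = 1/z$ behavior quoted from the later slides.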


Let's assume z-axis is the optical axis and the pinhole is the origin $O$.
For any point $p(x_i,y_i,z_i)$ on the tree, $$\angle zOp = \arctan \frac{\sqrt{x_i^2+y_i^2}}{z_i} $$ If we move it a bit further from the camera to $p_1(x_i,y_i,z_i+d)$, $$\angle zOp_1 = \arctan \frac{\sqrt{x_i^2+y_i^2}}{z_i+d} $$ Since $\arctan$ is increasing and the argument has shrunk, $$ \angle zOp > \angle zOp_1 $$ So, as the tree moves away from the camera, every point on the tree forms a smaller angle with the optical axis.

Assume the distance between the image plane and the pinhole is $f$, the intersection of the optical axis and the image plane is $O'$, and point $p$ projects to $q$.
Because $\triangle qOO'$ is a right-angled triangle and $\angle qOO' = \angle zOp$, $$ O'q = f\tan\angle zOp $$ So as $\angle zOp$ gets smaller, the distance between the projected point $q$ and the optical axis becomes smaller too.
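The two steps of this derivation can be verified directly. A small sketch (function names are mine) computing the axis angle $\arctan(\sqrt{x^2+y^2}/z)$ and the image offset $f\tan(\angle zOp)$ for a point before and after moving it a distance $d$ further away:

```python
import math

def axis_angle(x, y, z):
    """Angle between the optical (z) axis and the ray to point (x, y, z)."""
    return math.atan(math.hypot(x, y) / z)

def image_offset(x, y, z, f=1.0):
    """Distance O'q from the image center to the projection of (x, y, z)."""
    return f * math.tan(axis_angle(x, y, z))

p  = (1.0, 2.0, 10.0)   # a point on the tree
p1 = (1.0, 2.0, 15.0)   # the same point moved d = 5 further away

assert axis_angle(*p) > axis_angle(*p1)      # angle zOp > angle zOp1
assert image_offset(*p) > image_offset(*p1)  # so O'q shrinks too
```

Note that $f\tan(\arctan(r/z)) = fr/z$, so this agrees with the simple pinhole projection formula.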


Another perspective on the question: as an object moves away from a lens, less of the object's light reaches the lens (it falls off as $1/r^2$). However, objects do not appear any less bright as they move away. So if an object is to appear equally bright while providing less light, that light must be concentrated into a smaller area of your visual field.

Of course, if an object is too small to resolve (like a star), it actually does get dimmer as it moves away.
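This brightness argument can be put in a rough numeric sketch (a toy model of my own, with constants dropped): both the light gathered and the solid angle the object subtends fall off as $1/r^2$, so their ratio, the brightness per unit of visual area, stays constant.

```python
def flux(r):
    """Light gathered from the object at distance r (up to a constant)."""
    return 1.0 / r**2

def apparent_area(r):
    """Solid angle the object subtends at distance r (up to a constant)."""
    return 1.0 / r**2

for r in (2.0, 10.0, 100.0):
    surface_brightness = flux(r) / apparent_area(r)
    print(f"r = {r:6.1f}: surface brightness = {surface_brightness:.3f}")
# The ratio is identical at every distance: a receding object sends
# less light, but packs it into a proportionally smaller patch of the
# visual field, so it does not look dimmer -- until it is too small to
# resolve, as with a star.
```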