Being Diffraction Limited

Luckily this subject makes for no dramatic screenplay, even though some writings about it on the internet make it seem so.

First of all: what is diffraction? Wikipedia explains it quite thoroughly; please read the article there and look at the animated examples. To explain how light works, it must be described partly as waves and partly as tiny particles, photons. Light is counted as photons by the camera's image sensor, but it behaves like waves when it interacts with the diaphragm inside the lens. Common sense would say that tiny particles either go straight through the slit or hit it and bounce off. Common sense is wrong: the diaphragm is a slit which makes light waves bend, and that bending is diffraction. Actually the whole lens, and every single lens element, causes diffraction, but the smallest opening is the most important one. The smaller the opening, the more diffraction we get. This makes the diaphragm the major culprit in most cases.

Wikipedia shows how diffraction (see Airy disk) depends directly on the wavelength of light and the f-number used. The bigger the f-number, the smaller the aperture and the more diffraction. The longer the wavelength (i.e. redder light), the more diffraction. At the opposite, short-wavelength end of the spectrum sits blue, which causes the least diffraction; in the middle we have yellow-green.

Wikipedia shows lots of equations, but in practical photography you need only remember those two factors: aperture and the color of light. Red spreads more than blue. And the fact that diffraction is a property of every lens.
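If you do want one equation, the central spot of the Airy disk has a diameter of roughly 2.44 × wavelength × f-number. Here is a minimal sketch in Python; the wavelengths are just illustrative picks for blue, green and red:

```python
# Airy disk first-minimum diameter: d = 2.44 * wavelength * f-number.
def airy_disk_diameter_um(wavelength_nm, f_number):
    return 2.44 * (wavelength_nm / 1000.0) * f_number

for color, wl in [("blue", 450), ("green", 550), ("red", 650)]:
    for n in (4.0, 8.0, 16.0):
        d = airy_disk_diameter_um(wl, n)
        print(f"{color} {wl} nm at f/{n}: Airy disk {d:.2f} um wide")
```

You can see both rules at once in the output: the disk grows with the f-number, and red always spreads more than blue.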

Diffraction limited bodies?

On the internet there are furious debates on camera bodies being diffraction dependent or diffraction limited. How come, when diffraction is a property of the lens? This is an issue which has arisen as megapixel counts have grown. The more megapixels in a sensor, the smaller the sensor elements are. Well, actually it is the opposite: we get more megapixels with smaller sensor elements, sensels or pixels, if you want. We have long ago passed the point where the Airy disk can cover several sensels. Let's say we have two very small light sources side by side. They can't be seen as two on the sensor if their Airy patterns overlap enough. The two first become a rod with two blobs at the ends, and then, with even smaller apertures, those blobs grow together. No resolution is left, and more pixels inside that blob do not help.
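To put rough numbers on that overlap: by the Rayleigh criterion, two point sources merge once their separation in the focal plane falls below about 1.22 × wavelength × f-number. A quick sketch, assuming green light and the D800's 4.88 µm sensel pitch (discussed further below):

```python
# Rayleigh criterion: two point sources merge when their separation
# in the focal plane drops below ~1.22 * wavelength * f-number.
def rayleigh_separation_um(wavelength_nm, f_number):
    return 1.22 * (wavelength_nm / 1000.0) * f_number

pitch_um = 4.88  # sensel pitch of the D800's 36 MP sensor
for n in (5.6, 11, 22):
    s = rayleigh_separation_um(550, n)
    print(f"f/{n}: separations under {s:.1f} um merge "
          f"(~{s / pitch_um:.1f} sensel widths)")
```

At f/22 that blob spans roughly three sensel widths, which is exactly the situation where extra pixels inside the blob buy you nothing.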

The theory of the diffraction alarmists is that more megapixels are bad, because diffraction is seen at ever larger apertures, and this will make such cameras useless for most photography. Long ago I mentioned to the bosses at Canon that sensors would eventually become diffraction limited. My point was that in commercial photography, like tabletop or product shoots, we need to use small apertures to get enough depth of field. They must make sure that the gain in sensor resolution and increasing bit depth does not cause problems here. The answer was that there is no such thing as a sensor being diffraction limited; never heard of it. Oh well...

An example of diffraction

There are hundreds of blogs and thousands of discussions on the Nikon D800 and how it is diffraction limited. "You should not use a smaller aperture than..." Yes, the Nikon D800 is a good example to bring something tangible into this discussion, which now becomes one of those hundreds and thousands. Actually I wanted to test the Nikon D800E and its 36 MP sensor just to see how good it really is. It was a sort of reality test to see where things are going. As a side benefit, we can check these Nikon D800E test images as an example of how diffraction works:

D800 F1.4 crop.jpg

f/1.4

D800 F2.0 crop.jpg

f/2.0

D800 F2.8 crop.jpg

f/2.8

D800 F4.0 crop.jpg

f/4.0

D800 F5.6 crop.jpg

f/5.6

D800 F8.0 crop.jpg

f/8.0

D800 F11 crop.jpg

f/11

D800 F16 crop.jpg

f/16

Looking at the series above you can see how the images get sharper up to f/4.0 and then start to get softer after f/5.6. I really can't pick the better one of f/4.0 and f/5.6. Now, diffraction is there all the time, but it is not the only working parameter in optics. Other gains make the image better as the diaphragm closes down, until diffraction becomes too strong. I have prepared these examples so that the lightest areas are the same in all images. This way diffraction eats into the dark areas, and you can see how the black lines get weaker. I could correct a lot of it in post-processing, but I wanted to show how diffraction works.

These crops are at 100% and from the center of the image. The lens here is the relatively new Nikon 35mm f/1.4 G AF-S. The pattern seen in the people silhouettes is moiré. The Nikon D800E is the second camera I have tested which is able to show it; the first was my PhaseOne P45+ digital back with the better Mamiya 645 lenses. That combination is so good that you can see this moiré pattern even wide open.

Why am I speaking about moiré as a good sign? It is because those areas are printed with a very tight raster. Only the sharpest lenses and sensors can see that there is a pattern there at all. The D800E can't resolve the pattern itself yet; no one-shot consumer camera can from my shooting distance. By f/16 the lens cannot convey even a hint of this pattern to the sensor, because the Airy patterns overlap so strongly.

Being Diffraction Limited

A better way to get an idea of how deeply (or not) the D800 is diffraction limited is to compare it to the D700, which has 12 megapixels. They both have a full 35mm size sensor, which means that the D700 has bigger sensor elements, so it should not be as diffraction limited as the D800. Right?
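How much bigger? The pitch follows from the sensor width and the horizontal pixel count; a small sketch, where the resolutions are my assumed figures for these bodies:

```python
# Approximate pixel pitch: sensor width divided by horizontal resolution.
def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    return sensor_width_mm * 1000.0 / horizontal_pixels

for camera, px in [("Nikon D800E, 36 MP", 7360), ("Nikon D700, 12 MP", 4256)]:
    print(f"{camera}: ~{pixel_pitch_um(36.0, px):.2f} um pitch")
```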

D800 vs D700 F5.6.jpg

This comparison image shows the D800E above and the D700 below. The aperture is at f/5.6 to be on the safe side for the D700. Again the D800E is at native 100%, and I have enlarged the D700 image to the same size using Photoshop's Bicubic Smoother interpolation. This is pretty much what happens if you want to print a D700 image as large as a D800E image. Well, this tells us only that the D800E has more pixels and this lens can show it. However, the difference does not come from megapixels alone, as the D700 has a low-pass filter in front of its sensor which softens images, while the D800E doesn't. (Actually the D800E also has a certain layer structure because of which this could be argued, but it really is of no concern here.)
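If you want to repeat this kind of comparison without Photoshop, a bicubic upscale in Python with Pillow gets you close; the file names here are hypothetical, and plain bicubic is only an approximation of Bicubic Smoother:

```python
from PIL import Image

# Enlarge a 12 MP D700 frame to the D800E's pixel dimensions
# with bicubic interpolation (hypothetical file names).
img = Image.open("d700_crop.jpg")
scale = 7360 / 4256  # D800E vs D700 horizontal resolution
target = (round(img.width * scale), round(img.height * scale))
img.resize(target, resample=Image.BICUBIC).save("d700_upscaled.jpg")
```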

D800 vs D700 F16.jpg

Same situation, but now the aperture was at f/16. To give the D700 a slight advantage I have corrected its contrast, but even then there's no question about which camera shows more detail. It is easy to see that the D700 has not suffered as much from diffraction; it is not as diffraction limited. But this is only because the D700 is more limited to start with. Diffraction is a relative thing, and a gradual one. There is no Diffraction Limit™ which hits your images and makes them suddenly unusable. But there is also no superhero called Diffraction Buster. You have to learn to live with diffraction and, preferably, learn to correct what it smudges in post-processing when you need to close down for depth of field.

Rules of thumb

When you think about image quality you should always think about the combination of lens and body. As a simplification: (image quality) = (lens quality) × (body quality), where body quality means the sensor plus the conversion to the final image. Here the lens quality must drop dramatically before the difference between these bodies becomes obscured. As a rule of thumb we could say that diffraction starts to show in images when you have closed the aperture down a full stop from the f-number matching the pixel pitch in micrometres. The Nikon D800 sensor's pixel pitch is 4.88 µm. This tells us that we can go beyond f/5.6 to almost f/8, just as seen above. And then again we must remember that diffraction is caused by the lens. With slower lenses, like a zoom having f/5.6 as its largest aperture, image quality usually only gets better when stopped down by one stop. But there are also lenses, especially on P&S cameras, which should always be used wide open, if you think only about image quality as seen here. Know your equipment!
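The rule of thumb reduces to a one-liner; a minimal sketch, where the one-stop allowance is the heuristic above rather than an exact optical limit, and the D700 pitch comes from the earlier calculation:

```python
import math

# Rule of thumb: take the f-number that matches the pixel pitch in
# micrometres, then allow one more full stop (f-number * sqrt(2)).
def diffraction_shows_beyond(pixel_pitch_um):
    return pixel_pitch_um * math.sqrt(2)

for camera, pitch_um in [("Nikon D800", 4.88), ("Nikon D700", 8.46)]:
    print(f"{camera}: {pitch_um} um pitch -> diffraction shows beyond "
          f"~f/{diffraction_shows_beyond(pitch_um):.1f}")
```

For the D800 this lands at about f/6.9, which agrees with the "beyond f/5.6 to almost f/8" seen in the test series.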

I will come back to the D800E and discuss the OM-D alongside it in a later blog post.

-p-