Image Defects

by Gisle Hannemyr

Below are samples of some of the more common image defects found in digital images.

Amp Noise

Amp noise is a noise pattern along the edge of the image frame produced by interference (typically heat) from components close to the image sensor. The image to the left shows the characteristic amp noise pattern of the Nikon D80 on a 3 minute exposure with the lens cap on at ISO 1600, boosted with auto levels to make the noise pattern stand out. The noise pattern doesn't show up at “normal” exposure times (if it does, the camera is defective), but will appear at long exposures (along with a number of hot photosites). This means that the Nikon D80 is not best suited for astronomical or long night exposure work. See also the discussion on amp noise in Thom Hogan's D80 review (scroll down the page about 60 % to the heading “Noise”).

Banding

Banding (also known as posterization and quantization noise) is a defect that is characterized by abrupt changes in tone or colour, rather than smooth transitions.

In the image shown to the left, the obvious banding visible in the sky has been introduced deliberately, to demonstrate the effect. When banding occurs as an unwanted image defect, the bands of colour may be less obvious.

When looking at the histogram of an image, banding will appear as gaps within the histogram.

Banding may have several different causes, such as insufficient bit depth (e.g. heavy editing of 8-bit files), aggressive tonal adjustments in post-processing, or excessive lossy compression.

The banding effect may also be introduced deliberately, for artistic effect, e.g. to imitate the look of pop-art posters.
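
To make the banding mechanism concrete, here is a minimal Python sketch (using NumPy; the choice of eight levels is purely illustrative) that quantizes a smooth gradient to a handful of tones. The result shows both the abrupt bands and the reduced number of distinct tones that create the gaps in the histogram:

```python
import numpy as np

def posterize(channel, levels):
    """Quantize an 8-bit channel to a small number of tonal levels.

    Reducing the number of levels produces the abrupt tonal steps
    (banding/posterization) described above, and leaves the
    characteristic comb-like gaps in the histogram.
    """
    channel = np.asarray(channel, dtype=np.float64)
    step = 255.0 / (levels - 1)
    return (np.round(channel / step) * step).astype(np.uint8)

# A smooth horizontal gradient, like a clear sky.
gradient = np.tile(np.linspace(0, 255, 512), (100, 1))

smooth = gradient.astype(np.uint8)      # 256 levels: smooth transition
banded = posterize(gradient, levels=8)  # 8 levels: visible bands

print("distinct tones before:", len(np.unique(smooth)))
print("distinct tones after: ", len(np.unique(banded)))
```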

Bayer Artefacts

Bayer filter reconstruction is prone to rainbow-like colour artefacts along the edge between any two contrasting colours. The image crop on the left shows a typical example of this type of image defect.

For an illustrated explanation of colour image reconstruction, and a demonstration of how Bayer artefacts are created, please see the Wikipedia article on Bayer filters.

There exist a number of different programs that do Bayer reconstruction. These are often referred to as RAW-converters. Each RAW-converter uses its own, often proprietary, Bayer reconstruction algorithm. The choice of algorithm influences sharpness and the presence of Bayer artefacts.
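
As a rough illustration of what a RAW-converter does, the sketch below implements the simplest possible approach: plain bilinear interpolation of an RGGB mosaic (the layout is assumed; real converters use far more sophisticated, edge-aware algorithms). It is exactly this kind of naive interpolation that tends to produce colour artefacts along contrasty edges:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Minimal bilinear demosaic of an RGGB Bayer mosaic (float array).

    Each colour plane keeps the photosites that actually carry that
    colour and fills in the rest by averaging the nearest same-colour
    neighbours.  Simple interpolation like this is what produces the
    rainbow-like artefacts along high-contrast edges.
    """
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red photosites
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue photosites
    g_mask = 1 - r_mask - b_mask                        # green photosites

    # Kernels that average the nearest same-colour neighbours.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g,  mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```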

This example, taken from Martin Brown's article on CCD Bayer Masks, shows the effect of using different reconstruction algorithms. In the crop on the left, prominent Bayer artefacts are visible around the edge, and along the weather board, chimney and antenna. These are eliminated in the crop on the right, which is from the same RAW-file converted to TIFF using a different reconstruction algorithm.

Here is an even more dramatic example (source: Usenet post by frederick). The enlarged image of black letters on a white background on the left is converted from NEF (Nikon's RAW format) with an early version of David Coffin's DCRaw. This version of DCRaw used a rather simplistic one-size-fits-all Bayer reconstruction algorithm, and renders the letters in all the colours of the rainbow. The image on the right is a crop from the same NEF converted using Nikon Capture. Because Nikon Capture uses an algorithm adapted for the NEF format, it does a much better job of the conversion.

Black Pitting

A common problem with digital imaging is that long exposure times will produce so-called “hot photosites”. As a countermeasure, many cameras use a built-in noise reduction technique called dark-frame subtraction (the camera makes a second exposure with the shutter closed, of the same duration as the first, and then subtracts the latter from the former). In the subtracted image, hot (coloured) pixels become black, and this artefact is known as “black pitting”.

More sophisticated noise reduction, where a pixel originating from one or more hot photosites is replaced by a pixel computed from an interpolated average of the surrounding pixels, will remedy this defect.
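
Both approaches can be sketched in a few lines of Python (NumPy, single-channel 8-bit data assumed; the hot-pixel threshold of 32 is purely illustrative). Naive dark-frame subtraction leaves black pits where the dark frame was hot, while replacing the flagged photosites with a local median does not:

```python
import numpy as np

def dark_frame_subtract(light, dark):
    """Naive dark-frame subtraction, as described above.

    Where the dark frame contains a hot photosite, the subtraction
    drives the corresponding pixel towards zero, leaving the black
    "pits" that give this artefact its name.
    """
    return np.clip(light.astype(np.int32) - dark.astype(np.int32),
                   0, 255).astype(np.uint8)

def repair_hot_pixels(light, dark, threshold=32):
    """Replace pixels flagged as hot in the dark frame with the median
    of their 3x3 neighbourhood (the hot value itself has little
    influence on the median), instead of subtracting them."""
    out = light.copy()
    hot_rows, hot_cols = np.where(dark > threshold)
    padded = np.pad(light, 1, mode='edge')
    for r, c in zip(hot_rows, hot_cols):
        window = padded[r:r + 3, c:c + 3]
        out[r, c] = np.median(window)
    return out
```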

Blooming

Blooming is a defect that is caused by oversaturation of the individual sensor sites (photosites or sensels) that make up a digital CCD image sensor.

When a sensor site is oversaturated, the excess charge spills into the neighbouring sensor sites, creating overexposure in them as well. CCD sensors are typically designed to allow easy vertical shifting of the charge, while potential barriers reduce the flow into neighbouring horizontal sensor sites. Hence the excess charge will preferentially flow into the nearest vertical neighbours, and blooming will therefore be more prominent in the vertical direction (in a landscape-oriented photo).

In the example above, taken with a digital camera introduced in 2004, the vertical bar above and below the bright sun is caused by blooming.

Blooming only affects CCD sensors; CMOS sensors do not suffer from this defect. CCD sensor technology has also improved since 2004, and as a result, newer CCD-cameras are much less susceptible to blooming than older models.

To avoid blooming during the long exposures associated with night photography and astrophotography, you may instead take several shorter exposures, cutting the exposure time just before the brightest object begins to bloom, and then combine (or “stack”) them into a single long exposure (e.g. eight 30-second exposures can be stacked to create one 4-minute exposure).
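
A minimal sketch of the stacking idea, assuming the sub-exposures are available as equal-sized arrays of linear (unprocessed) sensor data; the frame-loading function in the usage lines is hypothetical:

```python
import numpy as np

def stack_exposures(frames):
    """Combine several short exposures into one longer one.

    `frames` is a list of equal-sized arrays holding linear sensor data
    (e.g. eight 30-second sub-exposures).  Summing them approximates a
    single 4-minute exposure, while keeping each sub-exposure short
    enough that bright objects never reach the blooming point.
    """
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for frame in frames:
        acc += frame
    return acc

# Usage sketch: eight 30-second frames -> one simulated 4-minute frame.
# frames = [load_linear_frame(f"sub_{i}.tif") for i in range(8)]  # hypothetical loader
# long_exposure = stack_exposures(frames)
```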

Chromatic Aberration

Chromatic Aberration (CA) is an optical distortion caused by the camera's lens. Unlike most other defects listed on this page, CA will appear on images captured on film as well as on a digital sensor. However, because many digital sensors are sensitive to infrared light, CA can be more of a problem with digital than with film.

CA is a distortion where a pronounced colour fringe appears near edge transitions in a photograph. Some people use the term purple fringing for CA, but the colour distortion can be any colour, not only purple.

CA occurs when a point source containing components from more than one of the basic colours (red, green and blue) does not focus at the same point on the surface of the sensor. Also, because many digital cameras are sensitive to infrared light (which has a different focus point than visible light), CA may be caused by an infrared component exposing the sensor.

CA

One of the things that characterises CA is that the distortion exhibits a radial pattern. As shown in the example above, the off-colour fringe will be to one side of the transition at the left edge of the image, to the other side of the transition at the right edge of the image, and centered around the transition at the center of the image.

The regular and radial pattern that characterises CA sometimes makes it possible to correct for CA in post-processing, by carefully realigning the three colour planes relative to each other. Tools that let you do this include Helmut Dersch's excellent free software program Panorama Tools, and DxO Optics Pro. Some cameras even feature built-in chromatic aberration correction.
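
The realignment itself can be sketched as resampling the red and blue planes with a slightly different magnification than the green plane, scaled about the image centre (Python with NumPy/SciPy; the scale factors below are purely illustrative, and real tools determine them per lens and focal length):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def scale_channel(channel, scale):
    """Resample one colour plane, scaled radially about the image centre.

    Lateral CA shows up as the red and blue planes being rendered at
    very slightly different magnifications than the green plane;
    resampling them with a compensating scale factor re-aligns them.
    """
    h, w = channel.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    rows, cols = np.mgrid[0:h, 0:w].astype(np.float64)
    # Sample the source at coordinates scaled towards/away from the centre.
    src_rows = cy + (rows - cy) / scale
    src_cols = cx + (cols - cx) / scale
    return map_coordinates(channel, [src_rows, src_cols],
                           order=1, mode='nearest')

def correct_lateral_ca(rgb, red_scale=1.001, blue_scale=0.999):
    """Re-align the red and blue planes against the green reference plane.

    The scale factors are illustrative; in practice they are found per
    lens (and often per focal length) by minimising colour fringing
    along high-contrast edges.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.dstack([scale_channel(r, red_scale), g,
                      scale_channel(b, blue_scale)])
```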

CA is conspicuous in many of the cheap “kit” lenses that come with entry-level camera bodies at very little extra cost. More expensive lens designs, often built around the use of aspherical elements and low-dispersion glass elements, are much less prone to CA.

Dust

Getting dust inside the mirror box of a DSLR with interchangeable lenses is unavoidable, even if you take great care when changing lenses in the field. Dust appears as darkish spots in areas with uniform colour (e.g. grey or blue sky). It becomes more pronounced at shorter focal lengths and smaller apertures. The 100 % crop on the left shows a typical example of what dust on the sensor looks like.

To get rid of dust spots, you need to clean the sensor. A number of different cleaning methods, both wet and dry, are available.

Hot Photosites

A hot photosite (also referred to as a hot pixel or defect pixel) appears in the photograph as a coloured cross. Usually it is in the colour of one of the colour filters that make up the sensor's Bayer matrix (red, green or blue), but if two or more neighbouring photosites become “hot”, other colours may appear. The image to the left, contributed by photographer Ryan Sinn, shows what hot photosites look like when viewed at 1600 %.

Hot photosites typically appear with long exposures of one second or more.

Nikon users may be interested in the discussion about Nikon's policy on defect pixels that took place in the Nikon Forum on Photo.net in May 2006.

A hot photosite is similar in appearance to a stuck photosite. However, a stuck photosite is permanent, and unaffected by exposure times, while a hot photosite only appears as a result of long exposures. Hot photosites can (to some extent) be eliminated by dark-frame subtraction (the camera makes a second exposure with the shutter closed, of the same duration as the first, and then subtracts the latter from the former).

The source of hot photosites is individual sensor sites with a higher than normal rate of charge leakage. This leakage is sometimes referred to as dark current, and is also a major source of noise in digital images. Every sensor site has some dark current; if you expose long enough, any photosite could become a hot photosite.

The defects caused by dark current mean that digital cameras are unable to take very long exposures. To overcome this limitation, one can take several shorter exposures and combine (or “stack”) them into a single long exposure (e.g. eight 30-second exposures can be stacked to create one 4-minute exposure).

High temperature and high ISO are two factors that will increase the number of hot photosites. Astronomers use liquid nitrogen (-196 degrees Celsius) to cool their sensors to minimize the number of hot photosites during long exposures, but even a drop from, say, 20 degrees Celsius to 10 will make a noticeable improvement. At ISO 800 you'll notice more hot photosites than at ISO 200, simply because the signal, including the dark current, is amplified more.

While some hot photosites must be expected at long exposure times, hot photosites that appear at shutter speeds faster than 1/4 second should be rectified. For possible remedies, see the entry for stuck photosites.

Hot Spot

When doing digital infrared photography, you may find a circular blob, similar to the one shown on the left, in the center of the frame. This is known as a hot spot, and is a result of internal reflections of infrared light produced by the lens' coatings. Some types of coating are not transparent to infrared wavelengths.

The hot spot may be very prominent with some lenses, while other lenses do not display this defect. If you shoot infrared, you may want to look at my page on IR and Lenses to see which lenses are best suited for infrared photography.

JPEG Artefacts

The JPEG file format is known as a lossy file format because it saves storage space by compressing digital images, discarding or losing some of the original image data. The algorithm selecting the data to throw away is very clever, making use of scientific knowledge about the way the human visual system works. As a result, the data lost has very little impact on our visual perception of the image. Nevertheless, some changes are introduced, and these changes are known collectively as JPEG artefacts.

The visibility of these artefacts depends on the degree of compression we apply when storing a JPEG image. A low degree of compression saves some space, while keeping distortions to a minimum. A high degree of compression saves a lot more space, at the cost of more prominent distortions.

The image below demonstrates what damage too much JPEG compression can do to an image. Note that this is a demonstration using extreme compression settings to illustrate what JPEG artefacts look like; don't expect to see such visible defects in ordinary JPEG compressed images.

makeover

The most visible artefact is the so-called mosquito noise. These are blotches of colour noise surrounding high contrast edges, slightly resembling a swarm of mosquitos. In the sample image, it is particularly visible in the blue sky where it meets the building.

Another, often quite subtle, defect is loss of detail in areas with a lot of high-frequency data, such as hair, fur and foliage. In the sample image this defect is most visible in the grass and leaves.

A third artefact, which normally only becomes visible at the rather extreme compression rate used in the image sample that accompanies this text, is banding. You can see this in the sky in the sample image above.

Finally, because the JPEG algorithm works on fixed blocks of pixels (usually 8 x 8 pixel squares), a “mosaic” of pixel squares sometimes appears.

The image to the left is a crop blown up to 400 %. The 8 x 8 pixel blocks are clearly visible in the reconstructed image.

JPEG compresses images by transforming them into the YCbCr colour space and then breaking the image up into blocks of (normally) 8 x 8 pixels. Each block is analyzed with something known as the discrete cosine transform, which turns each 8 x 8 pixel block into an 8 x 8 array of the signal frequencies present in the block. We can now analyze these frequencies and throw away signals below a certain threshold. The higher we set this threshold, the more aggressive the compression.
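
The sketch below (Python with SciPy) applies this idea to a single synthetic 8 x 8 block containing a hard edge. It uses a single magnitude threshold in place of JPEG's quantisation tables and skips the entropy coding entirely, but it shows where the information is lost and why an aggressive threshold produces ringing (mosquito noise) around the edge:

```python
import numpy as np
from scipy.fft import dctn, idctn

def compress_block(block, threshold):
    """Illustrate the heart of JPEG compression on one 8x8 block.

    The block is transformed to the frequency domain with a 2-D
    discrete cosine transform, coefficients with small magnitudes are
    thrown away, and the block is transformed back.  Real JPEG uses a
    quantisation table rather than a single threshold, followed by
    entropy coding, but the loss of information happens in this step.
    """
    coeffs = dctn(block.astype(np.float64), norm='ortho')
    kept = np.where(np.abs(coeffs) >= threshold, coeffs, 0.0)
    return idctn(kept, norm='ortho'), np.count_nonzero(kept)

# A synthetic 8x8 block with a hard vertical edge (dark left, bright right).
block = np.zeros((8, 8))
block[:, 4:] = 200.0

mild, n_mild = compress_block(block, threshold=5)      # keeps most detail
harsh, n_harsh = compress_block(block, threshold=100)  # ringing around the edge
print(f"coefficients kept: {n_mild} (mild) vs {n_harsh} (harsh) of 64")
```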

One thing you should know about JPEG compression is that the damage it causes is cumulative. Every time a file is re-saved in JPEG format, new artefacts are added to the old. While the artefacts caused by JPEG compression to first-generation JPEG files are (nearly) invisible at normal magnifications, the cumulative damage that can be seen in later generations can be much more prominent. This means that you should not use JPEG if you intend to post-process your images. Instead, use a non-lossy file format such as RAW, PNG, TIFF, or PSD. Even if the original photo is JPEG, you should convert it to a non-lossy format for post-processing, and then, if necessary, convert back into JPEG (e.g. for use on the web) after you've finished processing the image.

Line Noise

Sometimes, artefacts in the shape of lines or stripes appear in a digital photo. This is known as line noise, the corduroy effect, or striping.

A special case of line noise was the horizontal line noise that sometimes would appear in images taken with early-production models of the Canon EOS 20D. The example shown on the left is a 100 % crop from a frame taken at ISO 1600 with an EOS 20D with firmware version 1.0.5. I've used curves to accentuate the line noise.

In the EOS 20D, the problem mostly affected underexposed areas of the images taken at high ISO-settings with use of the built-in flash. Canon resolved the issue in December 2004 with firmware update version 1.1.0 (the current latest firmware for the EOS 20D is 2.0.3).

Another special case was the vertical line noise that sometimes would appear in images taken with early-production models of the Nikon D200 at ISO settings higher than ISO 100. The example shown on the left is a 100 % crop of a high contrast transition between a bright lightbulb and a darker background. I've used curves to accentuate the line noise.

Nikon responded to the problem in February 2006 with an advisory that said:

“If you experienced this pattern occurring in your images, please contact your nearest Nikon service representative. Nikon will adjust the image output level so that the pattern of lines will become virtually undetectable.”

(The full text of the advisory is at the Nikon European Support Centre website.)

Maze Artefacts

Maze artefacts, such as those dominating the 300 % crop shown below left (provided by Paul Furman, used with permission), are distortions in a digital image created by (presumably) faulty software demosaicing RAW data from a digital sensor.

Maze artefacts

The defect is caused by software and digital signal processing, not by the camera itself. Different RAW-converters produce this maze pattern to varying degrees. Maze artefacts may be very visible in the output from one RAW-converter, and totally absent in the output from another.

Moiré

In digital images, a moiré pattern is an interference pattern that appears when a regular pattern, such as fine fabric or a row of fenceposts, is undersampled.

Moiré can be avoided by the use of a so-called anti-alias filter on the digital sensor. This is an optical filter that blurs away detail finer than the sensor can resolve, keeping the spatial frequencies reaching the imager below the Nyquist limit (half the sampling frequency determined by the sensel pitch). But because an anti-alias filter also makes digital images appear softer and less sharp, some manufacturers choose to make cameras without an anti-alias filter, or with a weak one. Cameras without an anti-alias filter include the Kodak DCS Pro 14n and Sigma SD10, while the Nikon D1h and the Canon EOS 5D are cameras with a weak anti-alias filter.
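
The underlying sampling problem can be illustrated in one dimension: a regular pattern finer than the sampling grid can represent turns into a false, coarser pattern when sampled directly, while pre-blurring it (the job of the anti-alias filter) removes the fine detail without creating a false pattern. The numbers below (pattern frequency, sampling step, blur radius) are arbitrary illustrations:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# A fine, regular pattern (like fabric or a picket fence): a sine grating
# whose spatial frequency is well above what the "sensor" below can resolve.
x = np.arange(0, 256)
fine_pattern = np.sin(2 * np.pi * x * 0.45)   # 0.45 cycles per original pixel

# "Sensor" that samples only every third pixel.
aliased = fine_pattern[::3]                            # no anti-alias filter
filtered = gaussian_filter(fine_pattern, sigma=2)[::3]  # pre-blurred first

# The aliased samples form a new, much coarser (false) pattern -- moire.
# The pre-filtered version loses the fine detail but shows no false pattern.
print("peak-to-peak, aliased:  ", aliased.max() - aliased.min())
print("peak-to-peak, filtered: ", filtered.max() - filtered.min())
```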

The image below, taken with a Canon EOS 5D, shows the effect of false colour moiré. It is typical of a Bayer sensor with too weak an anti-alias filter.

false colour moire

Different RAW-converters treat moiré differently. Even if moiré shows up in an image converted with a specific RAW-converter, all is not lost: switching to a different program to de-mosaic the Bayer channels will yield a different, and sometimes better, result.

Because of the special Foveon sensor used in the Sigma SD10, the camera is immune to false colour moiré. There is no anti-alias filter in this camera, however, and as a result, luma moiré may appear when the spatial frequency of a regular pattern is close to, or above, the limit set by the camera's sensel pitch. In the photograph below (provided by Frits Thomsen, used with permission), the moiré effect is visible in the stretch of railing between the two last lampposts.

monochrome moire

Noise

Noise in digital images is most visible in uniform surfaces (such as blue skies) as monochromatic grain (luminance noise) and/or as coloured speckles (colour noise).

The image to the left shows a typical sample. It is a 100 % crop of the noise pattern produced by a Canon Powershot G5 at ISO 400.

The cause of noise in digital images is the random currents that flow in the sensor and associated electronics, similar to the currents that create the background hiss of audio equipment.

Like hot photosites, noise will increase with higher ISO settings and increasing temperature. Noise will decrease with increasing sensor site size. This is why DSLRs with large sensors can be used at much higher ISO settings than compact digicams with a small sensor.

In a Bayer camera, noise is typically more visible in the red and blue channels than in the green channel.

It is possible to reduce noise with software. Most digital cameras apply noise removal to images to enable higher ISO values than would otherwise be possible. There are also a number of software packages (e.g. Neat Image, Noise Ninja) that let the photographer reduce noise levels through post-processing. Noise reduction works well when applied in moderation. Used excessively, it tends to produce images lacking any real detail, giving skin and hair a wax-like appearance.
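
One simple and common approach, sketched below with the standard JPEG/BT.601 colour-space weights, is to smooth only the chroma channels: colour speckles are suppressed while most of the luminance detail survives. The blur radius is an illustrative parameter, and dedicated noise-reduction packages are considerably more sophisticated than this:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def reduce_colour_noise(rgb, sigma=2.0):
    """Simple colour-noise reduction: blur the chroma, keep the luma.

    The 8-bit RGB image is converted to YCbCr (JPEG/BT.601 weights),
    the two chroma channels are smoothed with a Gaussian filter, and
    the result is converted back to RGB.  Because most perceived
    detail lives in the luminance channel, this suppresses coloured
    speckles while leaving edges and texture largely intact.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b

    cb = gaussian_filter(cb, sigma)   # smooth only the chroma channels
    cr = gaussian_filter(cr, sigma)

    r = y + 1.402 * (cr - 128)
    g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
    b = y + 1.772 * (cb - 128)
    return np.clip(np.dstack([r, g, b]), 0, 255).astype(np.uint8)
```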

Sharpening Halos

Digital images, whether produced by a digital camera or scanned, are at the outset softer than images printed from film. The usual way to remedy this is to apply software sharpening to the image. A cheap compact digicam will do this automatically inside the camera; a more expensive DSLR may leave this task to the photographer.

There are more ways than one to sharpen a photograph, but the most popular one is without doubt the oddly named unsharp mask (USM). The name has its roots in the pre-digital world and originally referred to the use of an unsharp (blurred) positive film mask made from the negative. When this mask was contact printed in register with the negative, the contact print would accentuate the edges present in the image, resulting in an image with a crisper and sharper look. In digital image processing, the film mask is replaced by an inverted mask created using Gaussian blur, which is then subtracted from the original image.

The principle behind the USM is to exaggerate the light-dark contrast between the two sides of an edge transition. Done right, this can have a stunning effect on the apparent sharpness of a digital image. But if the effect is overdone, a visible halo will appear around edges, and texture in flat areas will be artificially exaggerated, making human skin appear pockmarked and noise stand out.
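
In code, the digital unsharp mask boils down to a couple of lines: subtract a Gaussian-blurred copy of the image from the original to isolate the edges, then add that difference back, scaled by an “amount” parameter (a single-channel image is assumed; the radius and amount values in the usage lines are illustrative). Push the parameters too far and the halos described above appear:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, radius=2.0, amount=1.0):
    """Classic unsharp mask: exaggerate local contrast around edges.

    A blurred copy of the image is subtracted from the original to
    isolate the edges, and that difference is added back, scaled by
    `amount`.  Large values of `amount` (or `radius`) produce visible
    halos around edges.
    """
    image = image.astype(np.float64)
    blurred = gaussian_filter(image, sigma=radius)
    sharpened = image + amount * (image - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# mild    = unsharp_mask(photo, radius=1.0, amount=0.6)  # subtle crispening
# haloed  = unsharp_mask(photo, radius=4.0, amount=3.0)  # overdone: halos appear
```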

USM sample USM sample, detail

In the example above, the visible halo around the branches and the spire is the result of oversharpening. The image on the right is a 1000 % crop of the top of the spire.

Edge transitions w. and w/o USM
The images and graphs above visualize how the unsharp mask (USM) works. The image on the left shows an edge transition before sharpening. The lighter side of the transition has a lightness of 120/255, the darker side has a lightness of 60/255. These values are plotted in the graph shown directly under the transition. The result of sharpening is shown on the right. The USM accentuates the transition, increasing the level on the lighter side (160/255) and decreasing it on the darker side (20/255), before tapering off to the original values (120/255 and 60/255 respectively). If this effect is overdone, the accentuated edge transition will be visible as a halo in the image.

Staircase Artefacts

Staircase artefacts (also referred to as “jaggies”) manifest themselves as smooth lines or curves having a jagged edge or staircase-like appearance. In the photo on the left, staircase artefacts are clearly visible in the curved edge in the middle of the frame.

In a digital image, staircase artefacts will appear when a smooth edge at an oblique angle is undersampled. The defect will be amplified by indiscriminate use of the unsharp mask. Like moiré, staircase artefacts can be reduced by the use of a so-called anti-alias-filter on the digital sensor.

Stuck Photosites

Stuck photosites (also referred to as “dead photosites”, or – somewhat inaccurately – as “dead pixels”) are photosites that are locked in a single state. The cause of this defect is tiny flaws in the digital camera's sensor matrix.

Stuck photosites usually appear as a coloured cross, just like hot photosites. A hot photosite, however, only appears as a result of long exposures, while a stuck photosite is permanent, and unaffected by exposure times.

Nearly every digital image sensor manufactured today contains a small number of stuck photosites. If only “perfect” sensor matrixes were used in cameras, the low yield would make each sensor prohibitively expensive. Instead, camera manufacturers tolerate a small number of stuck photosites. As part of the production process, the camera is loaded with a map of affected photosites, and in-camera processing uses interpolation to exclude them (i.e. replacing each stuck photosite with the average of the surrounding photosites of the same colour). As a result, stuck photosites are not normally visible.
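
This kind of remapping can be sketched as follows, assuming the un-demosaiced sensor data and a boolean defect map are available: each flagged photosite is replaced by the average of its nearest same-colour neighbours, which in a Bayer mosaic lie two photosites away horizontally and vertically:

```python
import numpy as np

def remap_defects(raw, defect_map):
    """Map out stuck or hot photosites in a Bayer mosaic.

    `raw` is the undemosaiced sensor data and `defect_map` a boolean
    array marking known-bad photosites (akin to the map the
    manufacturer loads into the camera).  Each bad photosite is
    replaced by the average of its nearest neighbours of the same
    colour, which in a Bayer mosaic sit two photosites away.
    """
    fixed = raw.astype(np.float64).copy()
    h, w = raw.shape
    for r, c in zip(*np.where(defect_map)):
        neighbours = [raw[nr, nc]
                      for nr, nc in ((r - 2, c), (r + 2, c),
                                     (r, c - 2), (r, c + 2))
                      if 0 <= nr < h and 0 <= nc < w and not defect_map[nr, nc]]
        if neighbours:
            fixed[r, c] = np.mean(neighbours)
    return fixed
```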

If such a defect arises later in a camera that uses Bayer interpolation, and the RAW conversion software does not handle it, you'll see not just one defect pixel but several, as the Bayer reconstruction algorithm distributes each photosite value to several pixels. Rectifying this problem will normally entail returning the camera to the manufacturer to have them update the defect photosite map.

Some higher-end Olympus cameras contain software that lets the user re-map stuck photosites. Many RAW converters, including Adobe ACR, are reported to remove hot or stuck photosites automatically behind the scenes during RAW conversion. There is also specialized software, such as PixelZap from TawbaWare or PixelFixer, that can be used to map out hot or stuck photosites.

Vignetting

Vignetting refers to a reduction in image brightness in the image periphery compared to the image center. It may be produced by special filters and used for creative effect, but the term usually describes unwanted darkening of the corners of a photograph.

There are four different types of unwanted vignetting: mechanical vignetting (light blocked by lens hoods, filters or other objects in front of the lens), optical vignetting (light fall-off caused by the physical dimensions of the lens barrel), natural vignetting (the gradual fall-off towards the corners described by the cosine fourth law), and pixel vignetting.

The first three types can appear on images captured on film as well as on a digital sensor. For details about these types of vignetting, see the Wikipedia article on vignetting.

The last type – pixel vignetting – only affects digital cameras. It is caused by the physical depth of the photon wells that capture light in the CMOS or CCD sensor used in a digital camera. Just as more light reaches the bottom of a well when the sun is at its zenith, light hitting a photon well at a right angle will have greater impact than light hitting it at an oblique angle.

Like natural vignetting, pixel vignetting is most prominent with wide-angle lenses and when using cameras with a short register distance (e.g. rangefinders). It can to some extent be mitigated by using lenses based upon telecentric and retrofocus optical designs.

Most digital cameras use built-in image processing software to compensate for natural vignetting and pixel vignetting when converting RAW sensor data to standard image formats such as JPEG or TIFF.
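
One way such compensation can be implemented (a flat-field sketch, not necessarily how any particular camera does it) is to derive a per-pixel gain map from a photograph of an evenly lit, featureless surface and multiply it into every image taken with the same lens, focal length and aperture:

```python
import numpy as np

def build_gain_map(flat_field):
    """Build a vignetting-correction gain map from a flat-field frame.

    `flat_field` is a photograph of an evenly lit, featureless surface
    taken with the same lens, focal length and aperture as the images
    to be corrected.  Dividing the frame's maximum by each pixel gives
    the gain needed to lift the darkened periphery back up to the
    level of the centre.
    """
    flat = flat_field.astype(np.float64)
    return flat.max() / np.maximum(flat, 1.0)   # avoid division by zero

def correct_vignetting(image, gain_map):
    """Apply the gain map to a (linear, 8-bit) image of the same size."""
    return np.clip(image.astype(np.float64) * gain_map, 0, 255).astype(np.uint8)
```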

Acknowledgements: Thanks to Leif Y. Carlstedt, Paul Furman and Michael Schnell for helpful comments and corrections. Any errors that remain are my own.

