
Digital defects

Stuff you don't want to see in your images.
by Gisle Hannemyr
Published: 2004-08-09.

Below is a rogues gallery of some of the more common image defects that may show up in your digital images.

Amp noise

Amp noise is a noise pattern along the edge of the image frame produced by interference (typically heat) from components close to the image sensor. The image to the left shows the characteristic amp noise pattern of the Nikon D80 on a 3 minute exposure with the lens cap on at ISO 1600, boosted with auto levels to make the noise pattern stand out. The noise pattern doesn't show up at “normal” exposure times (if it does, the camera is defective), but will appear at long exposures (along with a number of hot photosites). This means that the Nikon D80 is not best suited for astronomical or long night exposure work. See also the discussion on amp noise in Thom Hogan's D80 review (scroll down the page about 60 % to the heading “Noise”).


Banding

Banding (also known as posterisation and quantisation noise) is a defect characterised by abrupt changes in tone or colour, rather than smooth transitions.

In the image shown to the left, the obvious banding visible in the sky has been introduced deliberately, to demonstrate the effect. When banding occurs as an unwanted image defect, the bands of colour may be less obvious.

When looking at the histogram of an image, banding will appear as gaps within the histogram.

Banding may have several different causes, such as:

  • When a JPEG image is over-compressed, characteristic banding in 8 x 8 pixel squares will appear in areas where one would expect smooth tones, such as the sky.
  • When the photographer uses too shallow a bit depth (i.e. 8 bits/channel) for extensive image processing or for processing wide gamut colour spaces (e.g. Adobe RGB 1998, ProPhoto RGB or EktaSpace), banding may appear, in particular in the shadows.
  • If underexposed areas in a digital image are lifted by applying an S-shaped tone curve to them, banding may appear in the shadows because very few bits are used to represent the darkest tones in a digital image.

The banding effect may also be introduced deliberately, for artistic effect, e.g. to imitate the look of pop-art posters.
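The posterisation mechanism is easy to reproduce: quantising a smooth gradient to too few levels creates exactly the abrupt bands and histogram gaps described above. A minimal Python/NumPy sketch (the bit depth chosen is purely for illustration):

```python
import numpy as np

# Simulate a smooth 8-bit gradient, then posterise it by reducing the
# effective bit depth -- the same mechanism that causes banding.
gradient = np.linspace(0, 255, 1024).astype(np.uint8)

def posterise(channel, bits):
    """Quantise an 8-bit channel down to the given bit depth."""
    step = 256 // (2 ** bits)
    return (channel // step) * step

banded = posterise(gradient, 3)  # only 8 distinct levels remain

# The histogram of the banded gradient shows the characteristic gaps.
levels = np.unique(banded)
print(len(np.unique(gradient)), "levels before,", len(levels), "after")
```

At 3 bits there are only 8 tonal steps left, so the transitions become visible bands; in a histogram, everything between the surviving levels shows up as a gap.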

Bayer artifacts

Bayer filter reconstruction can under certain conditions produce rainbow-like colour artifacts along the edge between two contrasting colours. The image crop on the left shows a typical example of this type of image defect.

A number of different programs do Bayer reconstruction. These are often referred to as RAW-converters. Each RAW-converter uses its own, often proprietary, Bayer reconstruction algorithm. The choice of algorithm influences both sharpness and the presence of Bayer artifacts.

For an illustrated explanation of colour image reconstruction using a Bayer filter mosaic, see the Wikipedia article on Bayer filters and Martin Brown's article on Bayer Masks.


Blooming

Blooming is a defect caused by over-saturation of the individual sensor sites (photosites or sensels) that make up a digital CCD image sensor.

When a photosite is over-saturated, the excess charge spills into neighbouring sensor sites, overexposing them as well. CCD sensors are typically designed to allow easy vertical shifting of the charge, with potential barriers created to reduce flow into neighbouring horizontal sensor sites. Hence the excess charge will preferentially flow into the nearest vertical neighbours, and blooming will therefore be more prominent in the vertical direction (in a landscape oriented photo).

In the example above, taken with a digital camera introduced in 2004, the vertical bar above and below the bright sun is caused by blooming.

Blooming only affects CCD sensors. CMOS sensors do not suffer from this defect. However, CCD sensor technology has improved since 2004. As a result, newer CCD cameras are much less susceptible to blooming than older models.

To avoid blooming during the long exposures associated with night photography and astrophotography, you may instead take several shorter exposures, cutting each exposure just before the brightest object begins to bloom, and then combine (or “stack”) them into a single long exposure (e.g. eight 30-second exposures can be stacked to create one 4-minute exposure).
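The stacking workaround can be sketched numerically: as long as no single short frame reaches the sensor's saturation level, the summed stack behaves like one long exposure without blooming. A Python/NumPy sketch with invented numbers (the frame count, well depth and signal level are illustrative assumptions):

```python
import numpy as np

# Sketch of exposure stacking: sum several short exposures into one
# long one, so that no single frame saturates ("blooms"). The frames
# are synthetic 12-bit sensor readouts with invented signal levels.
rng = np.random.default_rng(0)
full_well = 4095  # saturation level of a hypothetical 12-bit sensor

# Eight short frames whose brightest photosite collects roughly
# 600 counts per frame -- safely below saturation.
frames = [rng.poisson(600, size=(4, 4)) for _ in range(8)]
assert all(f.max() < full_well for f in frames)  # no frame bloomed

stacked = np.sum(frames, axis=0)  # behaves like one 8x-longer exposure
print("stacked maximum:", stacked.max())
```

A single 8×-longer exposure of the same scene would have collected about 4800 counts at the brightest photosite, pushing it past the full well; the stack reaches the same total without any individual frame saturating.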

Chromatic aberration

Chromatic aberration (CA) is an optical distortion caused by the camera's lens. Unlike most other defects listed on this page, CA will appear on images captured on film as well as on a digital sensor. However, because of our tendency to pixel peep, CA is often perceived as more of a problem with digital than with film.

CA manifests itself as a pronounced colour distortion near edge transitions in a photograph. Some people use the term purple fringing for CA, but the colour distortion can be any colour, not only purple.

CA occurs when a bright object or light source containing components from more than one of the basic colours (Red, Green and Blue) does not focus at the same point on the surface of the sensor. Also, because some digital cameras are sensitive to infrared light (which has a different focus point than visible light), CA may be caused by an infrared component exposing the sensor.


One of the things that characterises CA is that the distortion exhibits a radial pattern. As shown in the example above, the off-colour fringe will be to one side of the transition at the left edge of the image, to the other side of the transition at the right edge of the image, and centred around the transition at the centre of the image.

The regular and radial pattern that characterises CA sometimes makes it possible to correct for CA in post-processing, by carefully realigning the three colour planes relative to each other. Tools that let you do this include Helmut Dersch's excellent free software program Panorama Tools, and DxO Optics Pro. Some cameras even feature built-in chromatic aberration correction.
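The realignment can be sketched as a radial rescaling of the red and blue planes about the image centre. This is only an illustration: the function name, the scale factors and the nearest-neighbour resampling are all simplifying assumptions; real tools fit lens-specific correction models and interpolate far more carefully.

```python
import numpy as np

def rescale_plane(plane, scale):
    """Nearest-neighbour radial rescaling of one colour plane
    about the image centre (a crude stand-in for real resampling)."""
    h, w = plane.shape
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    # Sample each output pixel from a point moved towards or away
    # from the centre in the source plane.
    sy = np.clip(np.round(cy + (yy - cy) / scale), 0, h - 1).astype(int)
    sx = np.clip(np.round(cx + (xx - cx) / scale), 0, w - 1).astype(int)
    return plane[sy, sx]

def correct_ca(rgb, red_scale=1.002, blue_scale=0.998):
    """Register the red and blue planes against green with
    hypothetical per-lens scale factors."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.dstack([rescale_plane(r, red_scale), g,
                      rescale_plane(b, blue_scale)])

plane = np.arange(25.0).reshape(5, 5)
assert (rescale_plane(plane, 1.0) == plane).all()  # scale 1.0 is a no-op
```

The magnification difference between the colour planes that causes lateral CA is tiny (fractions of a percent), which is why the fringes only become obvious towards the image corners where the radial displacement is largest.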

CA is conspicuous in many of the cheap “kit” lenses that come with entry level camera bodies at very little extra cost. More expensive lens designs, often built around the use of aspherical elements and low-dispersion glass elements, are much less prone to CA.


Dust on the sensor

Getting dust inside the mirror box of a DSLR with interchangeable lenses is unavoidable, even if you take great care when changing lenses in the field. Dust appears as darkish spots in areas with uniform colour (e.g. grey or blue sky). It becomes more pronounced at shorter focal lengths and smaller apertures. The 100 % crop on the left shows a typical example of what dust on the sensor looks like.

To get rid of dust spots on the sensor, you need to clean it. A number of different cleaning methods, both wet and dry, are available.

Hot photosites

A hot photosite (also referred to as a hot pixel or defect pixel) appears in the photograph as a coloured cross. Usually it is in the colour of one of the colour filters that make up the sensor's Bayer matrix (red, green or blue), but if two or more neighbouring photosites become “hot”, other colours may appear. The image to the left, contributed by photographer Ryan Sinn, shows what hot photosites look like when viewed at 400 %.

Hot photosites typically appear with long exposures of one second or more.

Nikon users may be interested in the discussion about Nikon's policy on defect pixels that took place in the Nikon Forum in May 2006.

A hot photosite is similar in appearance to a stuck photosite. However, a stuck photosite is permanent, and unaffected by exposure times, while a hot photosite only appears as a result of long exposures. Hot photosites can (to some extent) be eliminated by dark-frame subtraction (the camera makes a second exposure with the shutter closed of the same duration as the first, and then subtracts the latter from the former).
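Dark-frame subtraction is simple arithmetic on the two frames. A toy Python/NumPy sketch (the pixel values and the position of the hot photosite are invented for illustration):

```python
import numpy as np

# Sketch of dark-frame subtraction: the "light" frame contains the
# scene plus hot-photosite leakage; the dark frame (same duration,
# shutter closed) contains only the leakage, so subtracting it
# removes the hot photosite. All values are synthetic.
scene = np.full((4, 4), 100, dtype=np.int32)

dark_current = np.zeros((4, 4), dtype=np.int32)
dark_current[1, 2] = 900            # one hot photosite

light_frame = scene + dark_current  # what the camera records
dark_frame = dark_current           # second exposure, shutter closed

corrected = light_frame - dark_frame
print(corrected[1, 2])  # back to the scene value, 100
```

The subtraction works because dark current accumulates at (roughly) the same rate in both exposures; that is why the dark frame must have the same duration (and ideally the same temperature) as the light frame.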

The source of hot photosites is individual sensor sites with a higher than normal rate of charge leakage. This leakage is sometimes referred to as dark current, and is also a major source of noise in digital images. Every sensor site has some dark current. If you expose long enough, any photosite could become a hot photosite.

The defects caused by dark current mean that digital cameras are unable to take very long exposures. To overcome this limitation, one can take several shorter exposures and combine (or “stack”) them into a single long exposure (e.g. eight 30-second exposures can be stacked to create one 4-minute exposure).

High temperature and high ISO are two factors that will increase the number of hot photosites. Astronomers use liquid nitrogen (-196 °C) to cool their sensors to minimise the number of hot photosites during long exposures, but even a drop from, say, 20 degrees Celsius to 10 will make a noticeable improvement. At ISO 800 you'll notice more hot photosites than at ISO 200, simply because the signal, including the dark current, is amplified more.

While some hot photosites must be expected at long exposure times, hot photosites that appear at shutter speeds faster than 1/4 second should be rectified. For possible remedies, see the entry for stuck photosites.

Hot spot

When doing digital infrared photography, you may find a circular blob, similar to the one shown on the left, in the centre of the frame. This is known as a hot spot, and is a result of internal reflections of infrared light produced by the lens' coatings. Some types of coating are not transparent to infrared wavelengths.

The hot spot may be very prominent with some lenses, while other lenses do not display this defect. If you shoot infrared, you may want to look at my page on IR and Lenses to see which lenses are best suited for infrared photography.

JPEG artifacts

The JPEG file format is known as a lossy file format because it saves storage space by compressing digital images, discarding some of the original image data. The algorithm selecting the data to throw away is very clever, making use of scientific knowledge about the way the human visual system works. As a result, the lost data has very little impact on our visual perception of the image. Nevertheless, some changes are introduced, and these changes are known collectively as JPEG artifacts.

The visibility of these artifacts depends on the degree of compression applied when storing a JPEG image. A low degree of compression saves some space, while keeping distortions to a minimum. A high degree of compression saves a lot more space, at the cost of more prominent distortions.

The image below demonstrates what damage too much JPEG compression can do to an image. Note that this is a demonstration using extreme compression settings to illustrate what JPEG artifacts look like. Don't expect to see such visible defects in ordinary JPEG compressed images.


The most visible artifact is the so-called mosquito noise. These are blotches of colour noise surrounding high contrast edges, such as the blades, slightly resembling a swarm of mosquitoes.

Another, often quite subtle, defect is loss of detail in areas with a lot of high-frequency data, such as hair, waves, fur and foliage. In the sample image this defect is most visible in the sea.

A third artifact, that normally only becomes visible at the rather extreme compression rate used in the image sample that accompanies this text, is banding. You can see this in the sky in the sample image above.

Finally, because the JPEG algorithm works on fixed blocks of pixels (usually 8 x 8 pixel squares), a “mosaic” of pixel squares sometimes appears.

The image to the left is a crop blown up to 400 %. The 8 x 8 pixel blocks are clearly visible in the reconstructed image.

JPEG compresses images by transforming them into the YCbCr colour space and then breaking the image into blocks of (normally) 8 x 8 pixels. Each block is analysed with something known as the discrete cosine transform, which turns each 8 x 8 pixel block into an 8 x 8 array of the signal frequencies present in the block. We can now analyse these frequencies and throw away any signal below a certain threshold. The higher we set this threshold, the more aggressive the compression.
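The transform-and-threshold idea can be sketched on a single 8 x 8 block in Python/NumPy. Note the simplifications: real JPEG divides each coefficient by a quantisation table rather than applying a flat threshold, and follows up with entropy coding; this shows only the principle.

```python
import numpy as np

N = 8
k = np.arange(N)
# Orthonormal DCT-II basis matrix: row k holds frequency k.
C = np.sqrt(2 / N) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * N))
C[0, :] = 1 / np.sqrt(N)

def dct2(block):
    return C @ block @ C.T   # forward 2-D DCT

def idct2(coef):
    return C.T @ coef @ C    # inverse 2-D DCT

block = np.outer(np.linspace(50, 200, 8), np.ones(8))  # smooth gradient
coef = dct2(block)

threshold = 5.0
kept = np.where(np.abs(coef) > threshold, coef, 0)  # drop weak frequencies
reconstructed = idct2(kept)

print("coefficients kept:", np.count_nonzero(kept), "of 64")
```

For a smooth block like this gradient, almost all the energy sits in a handful of low-frequency coefficients, so most of the 64 values can be discarded with barely any visible change; that is exactly why JPEG is so effective on skies and other smooth areas, and why the damage only becomes obvious when the threshold is pushed too far.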

One thing you should know about JPEG compression is that the damage it causes is cumulative. Every time a file is re-saved in JPEG format, new artifacts are added to the old. While the artifacts caused by JPEG compression to first-generation JPEG files are (nearly) invisible at normal magnifications, the cumulative damage seen in later generations can be much more prominent. This means that you should not use JPEG if you intend to post-process your images. Instead, use a non-lossy file format such as RAW, PNG, TIFF, or PSD. Even if the original photo is JPEG, you should convert to a non-lossy format for post-processing, and then, if necessary, convert back into JPEG (e.g. for use on the web) after you've finished processing the image.

Line noise

Sometimes, artifacts in the shape of lines or stripes appear in a digital photo. This type of artifact is often referred to as line noise, pattern noise, the corduroy effect, or striping.

An example of this defect is shown in the image to the left: a 100 % crop of the green channel from an image taken with a Nikon D3000 (CCD) with a broken sensor, showing a very obvious case of this defect.

A camera that produces images with line noise this pronounced is clearly broken, and needs to have its image sensor and/or readout amplifier array replaced.

There have also been cases where less severe line noise has affected early-production models of certain cameras, such as the Nikon D200 (CCD), Canon EOS 20D and EOS 7D (both CMOS).

Details about models that may have this problem are given below. The terms “vertical” and “horizontal” refer to landscape oriented images.

    Line noise example
  • Nikon D200: Vertical line noise would appear in images taken at ISO settings higher than ISO 100 (see sample of an ISO 1000 image at 100 % crop, right). Nikon responded to the problem in February 2006 with an advisory on the Nikon European Support Centre website (no longer on-line) that said: “If you experienced this pattern occurring in your images, please contact your nearest Nikon service representative. Nikon will adjust the image output level so that the pattern of lines will become virtually undetectable.”
  • Canon EOS 20D: Horizontal line noise would appear in underexposed areas of images taken at high ISO settings with the built-in flash. Canon resolved the issue in December 2004 with firmware update version 1.1.0 (the latest firmware for the EOS 20D is 2.0.3).
  • Line noise example
  • Canon EOS 7D: Vertical line noise at low ISO has been seen, but not on all cameras (see sample of an ISO 100 image at 100 % crop, right). The problem does not seem to be widespread. There is no response from Canon (yet).

There does not seem to be a single cause behind the different manifestations of the problem. CCD and CMOS image sensors work differently. A CCD image sensor transfers accumulated charge from its photon wells by shifting the contents to a row of charge amplifiers located at the long end of the sensor. A CMOS sensor usually has an individual readout circuit for each photon well.

In CCD sensors, the lines are always vertical, and the main cause of the problem seems to be unbalanced amplifiers in the readout array at the long end of the sensor. I do not know of an explanation for line noise in a CMOS sensor.

Maze artifacts

Maze artifacts, such as those dominating the 400 % crop shown below left (provided by Paul Furman, used with permission), are distortions in a digital image created by (presumably) faulty software de-mosaicing RAW data from a digital sensor.

Maze artifacts

The defect is caused by software and digital signal processing, not by the camera. Different RAW-converters produce this maze pattern in varying degrees. Maze artifacts may be very visible in the output from one RAW-converter, and totally absent in the output from another.


Moiré

In digital images, a moiré pattern is an interference pattern that appears when a regular pattern, such as fine fabric or a row of fence-posts, is under-sampled.

Moiré can be avoided by the use of a so-called anti-alias filter on the digital sensor. This is an optical filter that attenuates spatial frequencies above the Nyquist limit (half the sampling frequency, which is determined by the imager's sensel pitch). But because an anti-alias filter also makes digital images appear softer and less sharp, some manufacturers choose to make cameras without one, or with a weak anti-alias filter. Cameras without an anti-alias filter include the Kodak DCS Pro 14n and Sigma SD10, while the Nikon D1h and the Canon EOS 5D are cameras with a weak anti-alias filter.
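Under-sampling and the Nyquist limit are easy to illustrate in one dimension: a pattern finer than half the sampling rate comes back as a coarser, false pattern. A Python/NumPy sketch (the frequencies and units are made up for illustration):

```python
import numpy as np

# Sketch of why under-sampling creates moiré: sampling a fine pattern
# above the Nyquist frequency makes it reappear as a coarser, false
# pattern (here in 1-D; in an image the alias shows up as moiré fringes).

sample_rate = 10.0            # samples per millimetre, say
nyquist = sample_rate / 2     # 5 cycles/mm
pattern_freq = 9.0            # fabric finer than Nyquist -> aliases

t = np.arange(0, 2, 1 / sample_rate)
samples = np.sin(2 * np.pi * pattern_freq * t)

# The samples are indistinguishable from a 1 cycle/mm pattern:
alias_freq = abs(pattern_freq - sample_rate)   # 9 cycles/mm aliases to 1
alias = np.sin(2 * np.pi * alias_freq * t)
print(np.allclose(samples, -alias))  # → True
```

An anti-alias filter removes the 9 cycles/mm component before sampling, so there is nothing left to masquerade as the false 1 cycle/mm pattern; the price is that genuine fine detail near the Nyquist limit is blurred away too.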

The image below, taken with a Canon EOS 5D, shows the effect of false colour moiré. It is typical of a Bayer sensor with too weak an anti-alias filter.

false colour moire

Different RAW-converters treat moiré differently. Even if moiré shows up in an image converted with a specific RAW-converter, all is not lost. Switching to a different program to de-mosaic the Bayer channels will yield a different, and sometimes better, result.

Because of the special Foveon sensor used in the Sigma SD10, the camera is immune to false colour moiré. There is no anti-alias filter in this camera, and as a result, luma moiré may appear when a regular pattern is sampled at a frequency close to, or above, the limit set by the camera's sensel pitch. In the photograph below (provided by Frits Thomsen, used with permission), the moiré effect is visible in the stretch of railing between the last two lampposts.

monochrome moire


Noise

Noise in digital images is most visible in uniform surfaces (such as blue skies), as monochromatic grain (luminance noise) and/or as coloured speckles (colour noise).

The image to the left shows a typical sample. It is a 100 % crop of the noise pattern produced by a Canon Powershot G5 at ISO 400.

The cause of noise in digital images is the random currents that flow in the sensor and associated electronics, similar to the currents that create the background hiss of audio equipment.

Like hot photosites, noise will increase with higher ISO settings and increasing temperature. Noise will decrease with increasing sensor site size. This is why DSLRs with large sensors can be used at much higher ISO settings than compact digicams with a small sensor.

In a Bayer camera, noise is typically more visible in the red and blue channels than in the green channel.

It is possible to reduce noise with software. Most digital cameras apply noise removal to images to enable higher ISO values than would otherwise be possible. There are also a number of software packages (e.g. Neat Image, Noise Ninja) that let the photographer reduce noise levels through post-processing. Noise reduction works well when applied in moderation. Used excessively, it tends to produce images lacking any real detail, giving skin and hair a wax-like appearance.
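As a toy illustration of software noise reduction, here is a 3 x 3 median filter applied to a noisy flat patch in Python/NumPy. This is only a basic building block; commercial noise reducers such as Neat Image and Noise Ninja use far more sophisticated methods and treat luminance and chroma noise separately.

```python
import numpy as np

def median3x3(channel):
    """Replace each pixel with the median of its 3x3 neighbourhood
    (edges handled by replicating the border pixels)."""
    padded = np.pad(channel, 1, mode="edge")
    h, w = channel.shape
    stacked = np.stack([padded[dy:dy + h, dx:dx + w]
                        for dy in range(3) for dx in range(3)])
    return np.median(stacked, axis=0)

# A uniform grey patch with synthetic gaussian luminance noise.
flat_area = np.full((6, 6), 128.0)
rng = np.random.default_rng(1)
noisy = flat_area + rng.normal(0, 10, size=flat_area.shape)

denoised = median3x3(noisy)
print(noisy.std(), "->", denoised.std())  # the scatter shrinks
```

The median is used rather than the mean because it suppresses outliers (speckles) without smearing genuine edges as badly; even so, applying it repeatedly or with a large window quickly produces the wax-like look mentioned above.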

Sharpening halos

Digital images, whether produced by a digital camera or scanned, are at the outset softer than images printed from film. The usual way to remedy this is to apply software sharpening to the image. A cheap compact digicam will do this automatically inside the camera; a more expensive DSLR may leave this task to the photographer.

There is more than one way to sharpen a photograph, but the most popular is without doubt the oddly named unsharp mask (USM). The name has its roots in the pre-digital world and originally referred to the use of an unsharp (blurred) positive film mask made from the negative. When this mask was contact printed in register with the negative, the contact print would accentuate the edges present in the image, resulting in an image with a crisper and sharper look. In digital image processing, the film mask is replaced by a mask created using gaussian blur, which is then subtracted from the original image.

The principle behind the USM is to exaggerate the light-dark contrast between the two sides of an edge transition. Done right, this can have a stunning effect on the apparent sharpness of a digital image. But if the effect is overdone, a visible halo will appear around edges, and texture in flat areas will be artificially exaggerated, making human skin appear pockmarked and noise stand out.

USM sample

In the example above, the visible halo around the branches and the spire is the result of oversharpening.

Edge transitions w. and w/o USM

The images and graphs above visualise how the unsharp mask (USM) works. The image on the left shows an edge transition before sharpening. The lighter side of the transition has a lightness of 120/255, the darker side has a lightness of 60/255. These values are plotted in the graph shown directly under the transition. The result of sharpening is shown on the right. The USM accentuates the transition, increasing the level on the lighter side (160/255) and decreasing it on the darker side (20/255), before tapering off to the original values (120/255 and 60/255 respectively). If this effect is overdone, the accentuated edge transition will be visible as a halo in the image.
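This 1-D behaviour (overshoot on the light side, undershoot on the dark side, tapering back to the original levels) can be reproduced in a few lines. A sketch assuming a box blur in place of the gaussian blur real USM implementations use; the radius and amount values are arbitrary:

```python
import numpy as np

# 1-D sketch of unsharp masking: blur the signal, then add back the
# difference between original and blur, scaled by an "amount".

def unsharp_mask(signal, radius=2, amount=1.0):
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)  # simple box blur
    padded = np.pad(signal, radius, mode="edge")         # avoid border halos
    blurred = np.convolve(padded, kernel, mode="valid")
    return signal + amount * (signal - blurred)

edge = np.array([60.0] * 8 + [120.0] * 8)  # dark side 60, light side 120
sharpened = unsharp_mask(edge, radius=2, amount=1.5)

# The result undershoots below 60 and overshoots above 120 around the
# transition -- that over/undershoot is the halo.
print(sharpened.min(), sharpened.max())
```

Increasing `amount` or `radius` makes the overshoot larger and wider, which is exactly how over-sharpening turns an invisible micro-contrast boost into the conspicuous halos shown in the sample image.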

Staircase artifacts

Staircase artifacts (also referred to as “jaggies”) manifest themselves as smooth lines or curves taking on a jagged, staircase-like appearance. In the photo on the left, staircase artifacts are clearly visible in the curved edge in the middle of the frame.

In a digital image, staircase artifacts will appear when a smooth edge at an oblique angle is undersampled. The defect will be amplified by indiscriminate use of the unsharp mask. Like moiré, staircase artifacts can be reduced by the use of a so-called anti-alias-filter on the digital sensor.

Stuck photosites

Stuck photosites (also referred to as “dead photosites”, or – somewhat inaccurately – as “dead pixels”) are photosites that are locked in a single state. The cause of this defect is tiny flaws in the digital camera's sensor matrix.

Stuck photosites usually appear as a coloured cross, just like hot photosites. A hot photosite, however, only appears as a result of long exposures, while a stuck photosite is permanent, and unaffected by exposure times.

Almost every digital image sensor manufactured today contains a small number of stuck photosites. If only “perfect” sensor matrixes were used in cameras, the low yield would make each sensor prohibitively expensive. Instead, camera manufacturers tolerate a small number of stuck photosites. As part of the production process, the camera is loaded with a map of affected photosites, and in-camera processing uses interpolation to exclude them (i.e. replacing each stuck photosite with the average of the surrounding photosites of the same colour). As a result, stuck photosites are not normally visible.
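The interpolation step can be sketched directly: for each photosite in the defect map, average the nearest neighbours of the same Bayer colour, which sit two photosites away along each axis. A toy Python/NumPy version (the defect map and pixel values are invented):

```python
import numpy as np

# Sketch of in-camera stuck-photosite mapping: replace each photosite
# in the defect map with the average of its nearest same-colour
# neighbours in the Bayer mosaic (two photosites away on each axis).

def remap_stuck(raw, defect_map):
    fixed = raw.astype(float).copy()
    h, w = raw.shape
    for y, x in defect_map:
        neighbours = [raw[ny, nx]
                      for ny, nx in ((y - 2, x), (y + 2, x),
                                     (y, x - 2), (y, x + 2))
                      if 0 <= ny < h and 0 <= nx < w]
        fixed[y, x] = sum(neighbours) / len(neighbours)
    return fixed

raw = np.full((6, 6), 500.0)
raw[3, 3] = 4000.0                       # one stuck photosite
cleaned = remap_stuck(raw, defect_map=[(3, 3)])
print(cleaned[3, 3])  # → 500.0
```

Because the replacement happens on the raw mosaic, before de-mosaicing, the defect never reaches the Bayer reconstruction stage; as the next paragraph explains, a defect that is not in the map gets smeared across several output pixels instead.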

If such a defect arises later in a camera that uses Bayer interpolation, and the RAW conversion software does not handle it, you'll see not only a single defect pixel but several, as the Bayer reconstruction algorithm distributes any photosite value to several pixels. Rectifying this problem will normally entail returning the camera to the manufacturer to have them update the defect photosite map.

Some Olympus and Pentax cameras come with software that lets the user re-map stuck photosites. Many RAW converters, including Adobe ACR, are reported to remove hot or stuck photosites automatically behind the scenes during RAW conversion. There is also specialised software, such as PixelZap from TawbaWare, or PixelFixer, that can be used to map out hot or stuck photosites.


Vignetting

Vignetting refers to a reduction in image brightness in the image periphery compared to the image centre.

Vignetting may be caused by special filters or introduced deliberately in post-processing of an image for creative effect, but usually the word vignetting is used to describe unwanted darkening of the corners of a photograph, caused by camera settings or by equipment limitations.

A typical example of vignetting can be seen in the image to the left. This particular image was captured with a Holga, a cheap roll film camera made in Hong Kong that is notorious for its optical vignetting.

There are four different types of unwanted vignetting:

  • Mechanical (or physical) vignetting
  • Optical vignetting
  • Natural vignetting
  • Pixel vignetting

The first three types can appear on images captured on film as well as on a digital sensor. For details about these types of vignetting, see the Wikipedia article on vignetting.

The last type – pixel vignetting – only affects digital cameras. It is caused by the physical depth of the photon wells that capture light on the CMOS or CCD sensor used in a digital camera. Just as more light reaches the bottom of a well when the sun is at its zenith, light hitting a photon well at a right angle will have greater impact than light hitting it at an oblique angle.

Like natural vignetting, pixel vignetting is most prominent with wide-angle lenses and when using cameras with a short register distance (e.g. rangefinders). It can to some extent be mitigated by using lenses based upon telecentric and retrofocus optical designs.

Most digital cameras use built-in image processing software to compensate for natural vignetting and pixel vignetting when converting RAW sensor data to standard image formats such as JPEG or TIFF.
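This kind of compensation often amounts to flat-field correction: dividing the image by a normalised map of the combined lens-plus-sensor falloff. A toy Python/NumPy sketch with a synthetic radial falloff (the 40 % corner falloff figure is an arbitrary assumption):

```python
import numpy as np

# Sketch of flat-field vignetting correction: divide the captured
# image by a normalised "flat" frame (a photo of an evenly lit
# surface), lifting the darkened corners back to true brightness.

h, w = 5, 5
yy, xx = np.mgrid[0:h, 0:w]
r2 = ((yy - h // 2) ** 2 + (xx - w // 2) ** 2) / ((h // 2) ** 2 + (w // 2) ** 2)
flat = 1.0 - 0.4 * r2           # synthetic falloff: corners 40 % darker

scene = np.full((h, w), 200.0)  # true, evenly lit scene
captured = scene * flat         # what the vignetting camera records

corrected = captured / (flat / flat.max())  # normalise flat, then divide
print(corrected[0, 0], corrected[2, 2])  # corner and centre both ~200
```

Dividing rather than adding is what makes the correction independent of scene brightness; the cost is that the lifted corners also have their noise amplified, which is why heavy vignetting correction can leave visibly noisier corners.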

Acknowledgements: Thanks to Leif Y. Carlstedt, Paul Furman and Michael Schnell for helpful comments and corrections. Any errors that remain are my own.
Photo to illustrate vignetting is Plymouth Concrete Car Park by boliston. Used under CC BY 2.0.


One response:

Other causes of halos

Great article. Much better than most of the uncritically repeated twaddle that you find all over the Internet.

It could be improved a little by pointing out that over-sharpening is not the only thing that can cause halos in digital images. Changes to the saturation and/or luminance of individual colours can also introduce them.
