


To show how this 2D RGB arrangement (in particular one that has such a simple repeating structure) can lead to false colour, take a look at this crude example. Here we have a Bayer filter attempting to record three vertical black lines that are exactly the same size as the sensor area below (and so do not cover any blue pixels). We record the luminance value at each pixel point. Black has an RGB value of (0, 0, 0) and so the only two values recorded are either 0 or 255. We then run an algorithm to interpolate the RGB for each pixel: for each of the other two colours, each pixel takes the average value from the surrounding 8 pixels of that particular colour (we're only going to do this for the centre pixels surrounded by the yellow line). For example, for the top-middle red pixel in the yellow box: RED: this is a red pixel so it records its own value, 0; GREEN: 4 pixels in the surrounding 8 are green, 2 have a value of 255 and 2 have a value of 0, so the average of 127 is recorded; BLUE: 4 pixels in the surrounding 8 are blue and all have a value of 255, so 255 is recorded. This gives (0, 127, 255). As you can see, we have ended up with a block of blues and greens when actually the camera should have recorded one black line in the middle and white either side. Naturally, such an algorithm is not going to get me a job at Canon or Nikon any time soon and, as you can no doubt imagine, the actual algorithms used are far more complicated (and thus require more processing power for in-camera JPEGs), but nevertheless the aim is the same: to get values for the other two unknown primary colours by interpolating from the data available, and so by their very nature they can be fooled. In digital cameras anti-aliasing filters typically work by splitting a point of light into four individual points, as shown below in these diagrams from Nikon for the D800 and D800E.
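The crude 8-neighbour averaging described above can be sketched in a few lines of Python. This is a minimal toy, not a real demosaicing algorithm: the 5×5 grid size and the choice to put the black lines on the even columns are my own assumptions for illustration, chosen so an interior red pixel reproduces the (0, 127, 255) result worked through above.

```python
# Toy demosaic: even rows are RGRG..., odd rows are GBGB... (assumed layout).
def bayer_colour(row, col):
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

SIZE = 5
# Scene: vertical black lines on the even columns (luminance 0), white elsewhere.
# Blue pixels sit at odd rows and odd columns, so no black line covers a blue pixel.
luminance = [[0 if c % 2 == 0 else 255 for c in range(SIZE)] for r in range(SIZE)]

def demosaic(row, col):
    """Estimate (R, G, B) at an interior pixel: own colour from the pixel
    itself, the other two channels averaged from the 8 neighbours."""
    own = bayer_colour(row, col)
    result = {own: luminance[row][col]}
    neighbours = {"R": [], "G": [], "B": []}
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            neighbours[bayer_colour(row + dr, col + dc)].append(
                luminance[row + dr][col + dc])
    for ch in "RGB":
        if ch != own:
            result[ch] = sum(neighbours[ch]) // len(neighbours[ch])
    return (result["R"], result["G"], result["B"])

# Interior red pixel on a black line: 0 red, averaged green, averaged blue.
print(demosaic(2, 2))  # → (0, 127, 255)
```

Instead of a black line on a white background, the interpolated pixel comes out a saturated blue-green, which is exactly the false colour the worked example predicts.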

In digital photography we're concerned with spatial aliasing; in other words, the density of the sensor (in terms of pixels per area) versus the density of the pattern we are trying to photograph. However, it is perhaps easier to grasp the concept by thinking first of temporal aliasing. Say you film a 4-spoke wagon wheel at a film rate of 24 frames per second and the wagon is moving at such a speed that its wheels rotate 5 times per second. Then at each film frame the wheel has rotated 75° clockwise (5 × 360°/24), but it will appear to the observer watching the film that the wheel has moved backward 15°. Equally, at 6 rotations per second the spokes will move 90° per frame and the wheel will appear to be stationary. This is a common example of temporal aliasing. Aliasing (and thus moiré) is usually most obvious when the frequency of the pattern is close (but not equal) to the sampling frequency. By slightly blurring the image the anti-aliasing filter is lowering the frequency of the pattern to a level below the sampling frequency, but is doing so at the expense of image sharpness. This is why choosing a lower aperture (or higher f-number) can also reduce moiré, as the onset of diffraction helps to reduce image sharpness (equally, moiré is usually most obvious under the best lens conditions for sharpness: ƒ/8 and the centre of the frame). False colour is a result of the structured pattern of the Bayer filter, which consists of rows of RGRGRG or BGBGBG. This is an issue because in areas of fine detail the spacing between pixels of like colour is greater than the spacing in the image. It is especially an issue for the red and blue channels, where the sampling frequency is half that of the green. Note also that all horizontal and vertical lines in this arrangement lack either red or blue pixels. Generally speaking, it is less of an issue in landscape photography, where the presence of fine repeating patterns is rare.
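The wagon-wheel arithmetic above can be checked with a small sketch. The fold-into-one-spoke-spacing logic is my own way of modelling what the eye perceives (an assumption, since the eye simply picks the nearest-looking spoke position), but the numbers match the worked example: 5 rotations per second looks like 15° backward per frame, 6 looks stationary.

```python
# Apparent per-frame spoke movement for an N-spoke wheel on film.
# The eye only registers rotation modulo the spoke spacing.

def apparent_step(rotations_per_sec, fps=24, spokes=4):
    """Degrees the wheel appears to move per frame (negative = backward)."""
    spacing = 360 / spokes                  # 90° between identical spoke positions
    true_step = rotations_per_sec * 360 / fps
    step = true_step % spacing              # fold into one spoke spacing
    if step > spacing / 2:                  # closer to the previous spoke,
        step -= spacing                     # so it reads as backward motion
    return step

print(apparent_step(5))   # 75° true rotation per frame → appears as -15.0
print(apparent_step(6))   # 90° true rotation per frame → appears as 0.0
```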
Moiré and false colour occur in digital photography where the level of detail in a pattern exceeds the resolution of the sensor (e.g. weaving in a fabric, or fine mesh photographed from a distance). Moiré appears in the form of maze-like patterns not native to the original image, and false colour in the form of colour transformations across the surface of the subject (again, not native to the original scene). The presence of false colour is exacerbated by the structured layout of the Bayer filter used in most digital camera sensors. The term aliasing refers to the fact that (1) two different signals can become indistinguishable, or (2) a signal can produce distortions or artifacts when sampled.
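Sense (1) of the definition is easy to demonstrate numerically: two sinusoids whose frequencies differ by the sampling rate produce exactly the same samples. The 8 Hz sampling rate and the 1 Hz / 9 Hz pair below are arbitrary values chosen for illustration.

```python
import math

fs = 8                       # samples per second (arbitrary choice)
f_low, f_high = 1, 1 + fs    # 1 Hz and 9 Hz differ by exactly fs

samples_low  = [math.sin(2 * math.pi * f_low  * n / fs) for n in range(8)]
samples_high = [math.sin(2 * math.pi * f_high * n / fs) for n in range(8)]

# The 9 Hz signal "aliases" onto the 1 Hz one: different signals, same samples.
print(all(abs(a - b) < 1e-9 for a, b in zip(samples_low, samples_high)))  # True
```

From the samples alone there is no way to tell which signal was photographed, which is precisely why a sensor can record a pattern that was never in the scene.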

There is a trend among manufacturers, beginning really with Nikon and the release of the D800E in 2012, to remove the anti-aliasing filter from digital cameras. Anti-aliasing filters are optical low-pass filters which essentially blur the image to a fractional degree in order to avoid moiré and false colour.
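A one-dimensional sketch shows why blurring before sampling helps. The period-2 stripe pattern, the sample-every-3-pixels step, and the two-pixel averaging are all simplified assumptions for illustration; a real optical low-pass filter works by splitting light, not by averaging arrays.

```python
# Fine stripes (period 2 px) sampled every 3 px alias into coarse stripes;
# averaging neighbouring pixels first (a crude "blur") removes the detail
# before it can alias, at the cost of sharpness.

stripes = [255 * (i % 2) for i in range(30)]   # 0, 255, 0, 255, ... fine detail

sampled = stripes[::3]                         # sensor "sees" every 3rd pixel
print(sampled)       # [0, 255, 0, 255, ...] - a coarse false pattern (aliasing)

blurred = [(stripes[i] + stripes[i + 1]) // 2 for i in range(len(stripes) - 1)]
print(blurred[::3])  # [127, 127, ...] - detail gone, but no false pattern
```

The un-blurred samples show stripes three times wider than anything in the scene; the blurred samples show flat grey, which is the same trade the anti-aliasing filter makes: no moiré, but less sharpness.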
