When is HD not HD (and 4k not 4k)?

In 1861, the Scottish physicist James Clerk Maxwell demonstrated the first colour photograph to the Royal Institution. Then (as today) film was fundamentally black and white. Maxwell reasoned that three photographs taken through three filters – red, green and blue – could be projected through the same filters and overlapped, creating a full colour image (actually, he tried a yellow filter too, but we’ll keep quiet about that). It worked, though it turned out his success was an accident. Photographic materials of that era were not sensitive to red light, so the photograph taken through the red filter should have been blank. Fortunately for Maxwell, the red dye in the tartan ribbon he chose as his subject reflected UV light – to which his plates were highly sensitive and which passed right through his red filter. What a lucky genius he was.
The whole red, green, blue thing caught on big time and is, of course, still used today in colour TV. Three-CCD (or, indeed, three-CMOS) video cameras use Maxwell’s principle – three carefully aligned black and white sensors photographing the same subject through red, green and blue filters. Getting the alignment right is tricky, particularly with HD. When the photosites (the individual light-sensitive areas of the sensor) are only a few microns across, you have to be really, really good with glue. It’s very expensive.
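In software terms, the three-sensor principle is trivially simple. A minimal sketch (the array names and sizes here are illustrative, not from any real camera): three monochrome exposures, one per filter, stacked into a single colour image.

```python
import numpy as np

# Hypothetical example: three perfectly aligned black and white exposures,
# one per colour filter, combined into one colour image (Maxwell's principle).
h, w = 4, 6
red_exposure = np.random.rand(h, w)    # captured through a red filter
green_exposure = np.random.rand(h, w)  # captured through a green filter
blue_exposure = np.random.rand(h, w)   # captured through a blue filter

# Stacking the three monochrome planes gives a full RGB value at every pixel.
colour = np.stack([red_exposure, green_exposure, blue_exposure], axis=-1)
print(colour.shape)  # (4, 6, 3): a red, green and blue value for every pixel
```

The hard part, as the paragraph above says, is not the maths but the physical alignment of the three sensors.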
So, increasingly, manufacturers are using a single sensor (just like stills cameras). Of course, the photosites are still, basically, black and white. Fortunately, Dr. Bryce E. Bayer of Eastman Kodak came up with a way of making a black and white sensor suitable for colour image capture. His idea was to put a tiny filter over each photosite. As the human eye is more sensitive to green than any other colour, the Bayer mosaic uses twice as many green photosites as red and blue (see the illustration).
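The mosaic itself is easy to picture in code. A sketch of a full-HD Bayer layout, assuming the common RGGB 2×2 tile (manufacturers vary the exact arrangement):

```python
import numpy as np

# Bayer mosaic for a 1920 x 1080 sensor: each photosite sees only one colour.
# Assumes the common RGGB 2x2 tile; actual layouts vary by manufacturer.
h, w = 1080, 1920
pattern = np.empty((h, w), dtype='<U1')
pattern[0::2, 0::2] = 'R'  # red on even rows, even columns
pattern[0::2, 1::2] = 'G'  # green on even rows, odd columns
pattern[1::2, 0::2] = 'G'  # green on odd rows, even columns
pattern[1::2, 1::2] = 'B'  # blue on odd rows, odd columns

counts = {c: int((pattern == c).sum()) for c in 'RGB'}
print(counts)  # {'R': 518400, 'G': 1036800, 'B': 518400}
```

Note that green gets exactly twice as many photosites as red or blue – Bayer’s nod to the eye’s green sensitivity.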
The first thing to notice is that a 1080 sensor needs to generate 2,073,600 red, green and blue pixels in the output image (1920 x 1080). However, a 1080 Bayer sensor has only 1,036,800 green photosites and 518,400 each of red and blue, i.e. half the required number of green and only a quarter the number of red and blue. Software has to invent the missing colour values (it’s called interpolation, and it’s very clever, but it doesn’t get around the fact that the sensor doesn’t record red, green and blue values for every pixel position in the output image). So, the RED ONE™’s ‘4k’ sensor records green at only half its output pixel count and red and blue at a quarter – in true colour resolution it is much closer to a 2k RGB sensor than a 4k one.
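To make “inventing the missing colour values” concrete, here is a toy illustration of the simplest possible interpolation (real demosaicing algorithms are far more sophisticated, and the function name and data here are made up for the example): at a red photosite the sensor records no green at all, so a green value is estimated by averaging the four green neighbours.

```python
import numpy as np

# Toy demosaic step: in an RGGB mosaic, the four nearest neighbours of a
# red photosite are all green, so the missing green value at that position
# can be estimated as their average (bilinear interpolation).
def green_at_red_site(raw, y, x):
    return (raw[y - 1, x] + raw[y + 1, x] + raw[y, x - 1] + raw[y, x + 1]) / 4.0

raw = np.arange(16, dtype=float).reshape(4, 4)  # stand-in raw mosaic data
print(green_at_red_site(raw, 2, 2))  # 10.0 -- the mean of 6, 14, 9 and 11
```

The invented value is plausible, but it is a guess: fine green detail that fell between the green photosites is gone for good.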
It gets worse. All image sensors are ‘sampling’ the real world. That is, they convert the continuous real image from the lens into discrete pixels. Because of an awkward mathematical thing called the Nyquist-Shannon Sampling Theorem, sensor manufacturers need to put an optical low pass filter in front of the sensor – essentially just limiting the amount of fine detail in the image hitting the sensor. If they don’t, you get a kind of moiré pattern in that fine detail. This optical filter – how much of the fine detail it removes – needs to be related to the ‘sampling rate’, in other words the number of photosites on the sensor. Oops! We already know that there are more green photosites than red or blue, so how do they design their filter? They can’t make one that suits the green channel or the red and blue will alias (as the funny moiré effect is known). Equally, if they make the filter okay for the red and blue, they are losing a lot of the image’s fine detail in the green channel. Most manufacturers seem to choose a compromise – they lose a bit of the detail from the green and put up with some aliasing on red and blue.
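Aliasing is easy to demonstrate in one dimension. A minimal sketch, assuming a made-up sampling rate of 10 samples per second (Nyquist limit: 5 Hz): detail at 9 Hz, too fine for the sampler, comes out indistinguishable from a 1 Hz pattern – the one-dimensional equivalent of moiré.

```python
import numpy as np

# Sampling a 9 Hz sine at 10 samples/second (Nyquist limit: 5 Hz).
# The samples are identical to those of a phase-flipped 1 Hz sine:
# the fine detail masquerades as a coarse pattern, i.e. it aliases.
fs = 10.0                                 # sampling rate
t = np.arange(0, 1, 1 / fs)               # the 10 sample positions
fine_detail = np.sin(2 * np.pi * 9 * t)   # detail above the Nyquist limit
alias = -np.sin(2 * np.pi * 1 * t)        # a 1 Hz sine, phase-flipped
print(np.allclose(fine_detail, alias))    # True
```

An optical low pass filter removes the 9 Hz detail before it reaches the sensor, so there is nothing left to masquerade.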
Do we care about all this? Probably not, but it lends weight to the argument that we should be pushing the sensor photosite count beyond 1920x1080 to get a good HD image. A 4k film scan will contain more true colour resolution than the output of a 4k camera with a Bayer pattern sensor (assuming that the lens/film combination was up to it). Of course, if you cram more photosites onto a sensor the image it produces suffers in other ways (smaller photosites gather less light, so noise goes up), and there are sensors that record RGB values for each photosite position, but that’s a discussion for another time…