How to calibrate your monitor
17th Jul 2011 | 09:00
Adjust your screen for the best colour reproduction
Most monitors aren't correctly adjusted. It's a fact. The result is that you don't see all the detail in onscreen images, and the colours aren't accurate.
At one time this might only have been an issue for professional photographers, but today, when just about everyone stores, edits and views their photographs digitally, we all need a properly adjusted monitor.
It's not just a matter of onscreen viewing either - a poorly adjusted monitor can also result in disappointing hard copies.
Having a correctly adjusted monitor is important in areas other than digital photography, too. Even if you're just browsing the web or playing games, unless your monitor setup is correct, you could be missing out by not seeing images in the best possible light.
Monitor adjustment can be a very complicated and costly process if you use specialised hardware to do it, but it doesn't have to be that way.
Here we'll look at some of the simple, free ways of setting up your monitor.
Certainly a professional-level photographer or a very serious amateur might need more advanced tools, but even the simple methods we look at here are capable of making a vast difference to onscreen images.
Before proceeding, make sure your monitor has been turned on for at least 30 minutes so that it's had time to warm up, and restore its default settings using your graphic card's control program. This will probably force at least 24-bit colour, and in the case of an LCD monitor, will return the panel to its native resolution. If not, make sure to select these settings before continuing.
The easiest, and perhaps most fundamental, thing to adjust is your monitor's colour temperature. Using the official jargon, the colour temperature of light is the temperature of what's known as an 'ideal black body radiator' at which the colour of the light and the colour of the black body are identical.
Although an ideal black body radiator is a theoretical concept that doesn't actually exist in reality, it isn't too far removed from everyday experience. We're familiar, for example, with the fact that an object commonly glows red when it's heated to a sufficiently high temperature, and that if it's heated further it glows orange, then yellow and then white.
What might be less familiar is that at even higher temperatures, objects start to take on a bluish hue. If we restrict ourselves to the middle of this range, where we're talking about shades of white, the lower temperatures represent warm whites and the higher temperatures (paradoxically) correspond to cooler-looking whites.
For example, the colour temperature of a tungsten filament light bulb is about 2,500-2,900K (K stands for Kelvin, a unit of temperature whose degrees are the same size as degrees Celsius, but offset so that 0K equals -273.15°C), while a halogen bulb is somewhat higher and an average sunlit scene is around 6,500K.
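For readers who want to check the offset for themselves, here's a minimal sketch of the conversion described above (our illustration, using the -273.15°C zero point):

```python
# Sketch: converting between Kelvin and degrees Celsius.
# Kelvin uses the same size of degree as Celsius, offset so 0K = -273.15°C.

def kelvin_to_celsius(k):
    return k - 273.15

def celsius_to_kelvin(c):
    return c + 273.15

# An average sunlit scene at 6,500K corresponds to a black body
# glowing at roughly 6,226.85°C.
print(kelvin_to_celsius(6500))
```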
As shipped from the manufacturer, nearly all monitors are set up for a colour temperature of 9,300K.
This is the colour that pure white will appear, and while it might be a perfectly suitable setting for applications such as word processing or engineering drawing, it's far too blue for photography, games or web browsing.
What you need is something closer to the colour of a daylight scene - around 6,500K. This is easily achieved using your monitor's setup menus, which are normally accessed using buttons on the front of the monitor (although details vary between manufacturers and models).
Select the menu entry for colour temperature and select 6,500K (sometimes shown as D65) instead of 9,300K (or D93). If you're used to using monitors set to 9,300K, this new setting will look decidedly dull by comparison, but persevere and your eyes will soon get used to the new and more accurate colours.
Just as most monitors are set to the wrong colour temperature for a large number of applications, it's common to find that the brightness and contrast aren't correctly adjusted either.
If the monitor is too dark, all shades of grey darker than a certain threshold will appear black, and if it's too bright then all shades of grey brighter than a certain threshold will appear white. The former means that you won't be able to see detail in dark areas of photographs such as shadows, while the latter means that detail in brightly lit areas will be lost.
Display a greyscale test chart on screen. You can make your own using a graphics package or choose one of the many charts available online.
If you're making your own chart, you need to make sure it has around 21 levels of grey from pure black to pure white, in steps of five per cent in brightness.
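If you'd rather generate the chart programmatically than draw it in a graphics package, the sketch below writes a plain-text PPM image (chosen only because it needs no extra libraries; any image format would do) containing 21 vertical bands running from pure black to pure white in roughly five per cent steps:

```python
# Sketch: generate a 21-step greyscale test chart as a plain PPM (P3) file.
# 21 bands cover 0% to 100% brightness in 5% increments.

def greyscale_chart(path="greyscale_chart.ppm", band_width=40, height=200):
    steps = 21
    # 8-bit grey levels from 0 (black) to 255 (white) in equal steps
    levels = [round(i * 255 / (steps - 1)) for i in range(steps)]
    width = band_width * steps
    with open(path, "w") as f:
        f.write(f"P3\n{width} {height}\n255\n")
        for _ in range(height):
            row = []
            for level in levels:
                row.extend([f"{level} {level} {level}"] * band_width)
            f.write(" ".join(row) + "\n")
    return levels

levels = greyscale_chart()
print(levels)  # 0, 13, 26, ... up to 255
```

Open the resulting file in any image viewer that understands PPM, and check that every band is distinguishable from its neighbours.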
Ensure you're working in a room with subdued lighting, but not totally dark. The aim is to adjust the brightness and contrast on your monitor so that all the shades of grey can be distinguished from each other, using the lowest brightness setting that will achieve that.
This is very much a matter of trial and error, but as a starting point, try setting the contrast to 100 per cent for a CRT monitor or 40 per cent for an LCD monitor, and adjusting the brightness as necessary.
How easy this proves to be will depend on the quality of your monitor, and you may just find it's not achievable - in which case you'll have to accept a compromise.
If the brightness and contrast have been set up correctly, then a colour defined by the software as (0, 0, 0) will appear as pure black, while (255, 255, 255) will appear as pure white.
These examples assume 24-bit colour. In each case, the first figure in the brackets represents the amount of red in the range 0 to 255, the second figure is the amount of green, and the third the amount of blue.
You might assume, therefore, that a value of (127, 127, 127), for example, would appear as 50 per cent grey, but this isn't necessarily the case. This would only be true if something called the gamma value was equal to one, which means that the relationship between the input bits and the brightness is linear.
In reality, because the human eye doesn't have a linear response to light, this setting wouldn't result in the most accurate rendering of photographs.
If the gamma setting is too low, mid-level tones will appear too light; if the gamma setting is too high, they will appear too dark.
The recommended value for gamma is 2.2, so for optimum results you should adjust the settings of your graphics card until an image appears as expected at this value.
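The power-law relationship described above can be made concrete with a few lines of code. Assuming the simple model brightness = (value / 255) ^ gamma, a mid-level input of 127 displays at only about 22 per cent brightness at gamma 2.2, and a true 50 per cent grey actually needs an input value of around 186:

```python
# Sketch: the gamma power law, assuming brightness = (value / 255) ** gamma.

def displayed_brightness(value, gamma=2.2):
    """Fraction of full brightness produced by an 8-bit input value."""
    return (value / 255) ** gamma

def value_for_brightness(brightness, gamma=2.2):
    """8-bit input value needed to display a given brightness fraction."""
    return round(255 * brightness ** (1 / gamma))

print(f"{displayed_brightness(127):.3f}")  # ~0.216: (127, 127, 127) is far darker than 50%
print(value_for_brightness(0.5))           # 186: the input that displays as 50% grey
```

This is why (127, 127, 127) doesn't look like 50 per cent grey on a correctly adjusted display: the encoding compensates for the eye's non-linear response.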
Test charts used for gamma adjustment feature two squares, one filled in solid at a known grey level, and the other composed of alternating black and white bands so that it appears the same shade of grey as the solid square at the correct gamma setting.
To adjust the gamma value, display a gamma test chart onscreen.
Some are created specifically for a given gamma setting, in which case you should choose one for a value of 2.2, whereas others display a sliding scale: by looking for the closest match, you can read off the gamma value for which your system is currently configured. The sliding-scale type gives potentially better results by allowing you to carry out the adjustment while looking at three different shades of grey.
Look at the chart either from a distance or while squinting, so that the alternate black and white lines merge and you just see each square's average brightness.
Now adjust the graphics card's gamma setting until the grey square and the square with alternate black and white lines appear the same intensity for a gamma value of 2.2. The gamma is now set correctly.
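A two-square chart of this kind is also easy to generate yourself. The sketch below (a plain-text PPM again, our own construction) puts a solid grey at the level that should display as 50 per cent brightness at gamma 2.2 on the left, and alternating full-black and full-white rows, which average to 50 per cent brightness when blurred by distance, on the right:

```python
# Sketch: a minimal gamma-2.2 test chart as a plain PPM (P3) file.
# Left half: solid grey at the gamma-2.2 half-brightness level (~186).
# Right half: alternating black/white rows, averaging 50% brightness.

def gamma_chart(path="gamma_chart.ppm", size=200, gamma=2.2):
    solid = round(255 * 0.5 ** (1 / gamma))  # ~186 for gamma 2.2
    with open(path, "w") as f:
        f.write(f"P3\n{size} {size}\n255\n")
        for y in range(size):
            stripe = 255 if y % 2 == 0 else 0  # alternate white and black rows
            left = f"{solid} {solid} {solid} " * (size // 2)
            right = f"{stripe} {stripe} {stripe} " * (size // 2)
            f.write(left + right + "\n")
    return solid

print(gamma_chart())  # 186
```

When the two halves merge to the same apparent intensity from a distance, your gamma is at (or near) 2.2.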
Note that, confusingly, some graphics cards indicate that their default gamma setting is one. In such cases this isn't the actual gamma level, but a correction factor - the real default setting will probably be 2.2.
While the settings we've seen so far are by far the most important ones when it comes to ensuring colours look accurate on your screen, there's a possibility that your monitor might also exhibit a colour cast, even after the colour temperature has been set to 6,500K.
To fully address this problem, you'll need to use a specialised hardware calibration device, but it may be possible to make an improvement using software tools alone.
In addition to adjusting the overall gamma value, many graphics cards make it possible to adjust the gamma value individually for each of the three primary colours - red, green and blue.
Since incorrectly set gamma values are a possible cause of inaccurate colours, this is something quick and easy to try before going to the expense of the hardware solution.
The process for this is much the same as for adjusting the overall gamma value, except for the fact that instead of using the monochrome gamma test chart, you'll need separate gamma test charts for each primary colour. These are few and far between.
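Since ready-made per-channel charts are scarce, you can generate your own using the same two-square idea restricted to a single primary. This sketch (our own, using the same gamma-2.2 power law as before) writes one chart per channel:

```python
# Sketch: per-channel gamma-2.2 test charts as plain PPM (P3) files.
# Each chart pairs a solid patch of one primary at its half-brightness
# level (~186) with rows alternating that primary at full strength and black.

def channel_gamma_chart(channel, path, size=200, gamma=2.2):
    half = round(255 * 0.5 ** (1 / gamma))  # ~186 for gamma 2.2
    solid, full = [0, 0, 0], [0, 0, 0]
    solid[channel], full[channel] = half, 255
    with open(path, "w") as f:
        f.write(f"P3\n{size} {size}\n255\n")
        for y in range(size):
            stripe = full if y % 2 == 0 else [0, 0, 0]
            left = (" ".join(map(str, solid)) + " ") * (size // 2)
            right = (" ".join(map(str, stripe)) + " ") * (size // 2)
            f.write(left + right + "\n")

for i, name in enumerate(["red", "green", "blue"]):
    channel_gamma_chart(i, f"gamma_{name}.ppm")
```

Use each chart exactly as you did the monochrome one, adjusting that channel's gamma slider until the two halves match.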
Bear in mind, though, that because it's such a dark colour, adjusting the gamma value for blue can be very difficult to do by eye, and it's possible that you could make things worse rather than better.
If this proves to be the case, you'll have to abandon the idea of adjusting the gamma value for each colour individually and return to adjusting the overall gamma setting instead.