Preamble: If this thread is posted in the wrong section, please move it - I wasn't quite sure where best to post it.
I've read the articles at
http://www.steves-digicams.com/knowl...g-intents.html,
https://www.cambridgeincolour.com/tu...conversion.htm and
http://en.wikipedia.org/wiki/Color_m...ndering_intent but I still don't quite understand why relative colorimetric intent is preferable to absolute colorimetric (for monitors).
If I'm displaying a photograph or movie file on my monitor (which has a white point slightly off from the ideal 6504K for sRGB), wouldn't I want to use absolute colorimetric mode to ensure that colors display the way they would on an ideal sRGB monitor?
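To check that I'm picturing the mechanics correctly, here is roughly how I understand the two intents in terms of XYZ values - a minimal sketch of my mental model, not any real CMM. I'm assuming a (linearized) Bradford chromatic adaptation for the relative case, and the function names are just mine:

    import numpy as np

    # Standard Bradford matrix: XYZ -> cone-like response space
    BRADFORD = np.array([
        [ 0.8951,  0.2664, -0.1614],
        [-0.7502,  1.7135,  0.0367],
        [ 0.0389, -0.0685,  1.0296],
    ])

    def absolute_colorimetric(xyz):
        # Absolute intent: reproduce the measured XYZ values as-is,
        # ignoring the destination device's own white point.
        return np.asarray(xyz, dtype=float)

    def relative_colorimetric(xyz, src_white, dst_white):
        # Relative intent: chromatically adapt the colors so that the
        # source white point lands exactly on the destination white point.
        src_cone = BRADFORD @ np.asarray(src_white, dtype=float)
        dst_cone = BRADFORD @ np.asarray(dst_white, dtype=float)
        adapt = np.linalg.inv(BRADFORD) @ np.diag(dst_cone / src_cone) @ BRADFORD
        return adapt @ np.asarray(xyz, dtype=float)
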
In particular, this quote confuses me: “...using the absolute colorimetry rendering intent would ideally [...] give an exact output of the specified CIELAB values. Perceptually, the colors may appear incorrect, but instrument measurements of the resulting output would match the source.”
If the instrument measurements of the actual colors emitted from the monitor match the source material I'm viewing (photograph, movie frame) exactly, why would the colors appear incorrect?
Wouldn't it look *exactly* the same as if I had viewed the same scene in real life (assuming the camera and display are perfect)?
Furthermore, if an image is tagged as sRGB (6504K white point) and I view it on a monitor with a 5500K white point, wouldn't I want the image to display with the actual, bluish hue that I would get on a perfect sRGB monitor? I wouldn't want images to appear with a 5500K white point, because that would not be accurate to the source at all.
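Plugging my situation into the sketch above (using the D65 white for sRGB and D55 as a rough stand-in for my monitor's 5500K white - the exact XYZ values are only illustrative):

    D65 = [0.9505, 1.0000, 1.0888]  # sRGB white, ~6504K
    D55 = [0.9568, 1.0000, 0.9215]  # rough stand-in for my 5500K monitor white

    print(absolute_colorimetric(D65))            # stays at the D65 coordinates -> should look bluish on my monitor
    print(relative_colorimetric(D65, D65, D55))  # mapped onto D55 -> rendered as my monitor's native white
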
Why is it, then, that relative colorimetric is recommended over absolute everywhere I look? It seems counterintuitive to me.
Any explanations would be greatly appreciated.