Hello there,
I have a question about choosing the "right" gamma value, and I am interested in your point of view.
When it comes to color management and calibration, official and unofficial sources offer differing opinions: some say the display gamma can be left "as is" at its native value (2.43 measured in my case), preserving color depth, while others say it should be set to a specific value (e.g. 2.2 or 1.8), which will give more accurate color rendition.
As far as I've read, creating a 1D LUT for the video card (8-bit to 8-bit) that forces the display to 2.2 or any other value instead of its native gamma will cut the colour depth (please see the chapter
Limitations of monitor calibration).
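To illustrate what I mean by cutting colour depth, here is a rough Python sketch of my own (not taken from any calibration software; the 2.43 native and 2.2 target values are just the numbers from my case) that counts how many distinct 8-bit output levels survive an 8-bit video-card LUT correction:

# Sketch: an 8-bit-in / 8-bit-out video-card LUT bending a native gamma
# of 2.43 toward a target of 2.2 cannot keep all 256 tonal levels.

NATIVE_GAMMA = 2.43   # measured native response (my panel)
TARGET_GAMMA = 2.20   # desired overall response

def lut_entry(v: int) -> int:
    """Map an 8-bit input so that the display's native gamma yields the
    target gamma overall: (x**(target/native))**native == x**target."""
    x = v / 255.0
    return round(255 * x ** (TARGET_GAMMA / NATIVE_GAMMA))

lut = [lut_entry(v) for v in range(256)]
unique_levels = len(set(lut))

print(f"Distinct output levels after correction: {unique_levels} of 256")
# Any count below 256 means some input codes collapse to the same output
# value, i.e. tonal resolution is traded away and banding becomes possible.

If I run something like this, the count comes out below 256, which is how I understand the "lost colour depth" argument.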
On the other hand, considering that I'll use the monitor to create/view material destined for print, or to display content on other sRGB-calibrated monitors, I suspect I
should set the gamma to 2.2 (sRGB).
I should mention that my LCD is an 8-bit/channel P-MVA panel with no interactive gamma adjustment and no way to load an internal 1D LUT (the 1D LUT will be implemented on the video card, using a Spyder4PRO or Colormunki Display device for calibration).
How should I approach this? Which is the "right" perceived color: the one at the monitor's native gamma, or at some other value, leaving all other variables aside (room lighting, color temperature, etc., assumed the same in all cases)?
Any thoughts?
Thank you,