
Thread: choosing the "right" gamma value

  1. #1

    choosing the "right" gamma value

    Hello there,
    I have a question about choosing the "right" gamma value and I am interested in your points of view.

    When it comes to color management and calibration, official and unofficial sources offer differing opinions: some say the native display gamma can be left "as is" (2.43 measured in my case), preserving the color depth, while others say the gamma should be set to a specific value (e.g. 2.2 or 1.8), which will give more accurate color rendition.

    As far as I've read, creating a 1D LUT for the video card (8-bit to 8-bit) to force the display to 2.2 or any other value instead of its native gamma will cut the colour depth (please see the chapter Limitations of monitor calibration).
    On the other hand, taking into consideration that I'll use the monitor to create/view material intended for print or for display on other sRGB-calibrated monitors, I suspect I should set the gamma to 2.2 (sRGB).
    I have to mention that my LCD is an 8-bit/channel P-MVA panel with no interactive gamma adjustment and no way to load an internal 1D LUT (the 1D LUT will be applied in the video card, using a Spyder4PRO or Colormunki Display device for calibration).

    How should I approach this? Which is the "right" color to see: the one at the monitor's native gamma or at some other value, leaving aside all the other variables (room lighting, color temperature, etc., assumed to be the same in all cases)?

    Any thoughts?
    Thank you,

  2. #2

    Join Date
    Feb 2012
    Location
    Texas
    Posts
    6,956
    Real Name
    Ted

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post
    Hello there,
    I have a question about choosing the "right" gamma value and I am interested in your points of view.

    When it comes to color management and calibration, official and unofficial sources offer differing opinions: some say the native display gamma can be left "as is" (2.43 measured in my case), preserving the color depth, while others say the gamma should be set to a specific value (e.g. 2.2 or 1.8), which will give more accurate color rendition.
    Where did you read that? As far as I know, neither color "depth" nor color accuracy has anything to do with gamma directly.



    As far as I've read, creating a 1D LUT for the video card (8-bit to 8-bit) to force the display to 2.2 or any other value instead of its native gamma will cut the colour depth (please see the chapter Limitations of monitor calibration).
    On the other hand, taking into consideration that I'll use the monitor to create/view material intended for print or for display on other sRGB-calibrated monitors, I suspect I should set the gamma to 2.2 (sRGB).
    Any thoughts?
    Thank you,
    sRGB gamma is not 2.2

    It is 2.4, with a linear portion near black that has a slope of 12.92.
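
    In code form, the two sRGB transfer functions look like this (just a sketch using the published sRGB constants, not taken from any particular library):

        def srgb_decode(c):
            """sRGB-encoded value (0..1) to linear light: a 2.4 power curve
            with a linear segment of slope 12.92 near black."""
            if c <= 0.04045:
                return c / 12.92
            return ((c + 0.055) / 1.055) ** 2.4

        def srgb_encode(lin):
            """Linear light (0..1) back to the sRGB-encoded value."""
            if lin <= 0.0031308:
                return 12.92 * lin
            return 1.055 * lin ** (1 / 2.4) - 0.055

    Taken as a whole, that curve happens to approximate a plain 2.2 power law quite closely, which is why 2.2 is so often quoted as "the" sRGB gamma.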
    Last edited by xpatUSA; 20th March 2016 at 08:27 PM.

  3. #3
    dje's Avatar
    Join Date
    May 2011
    Location
    Brisbane Australia
    Posts
    4,636
    Real Name
    Dave Ellis

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post
    Hello there,
    I have a question about choosing the "right" gamma value and I am interested to find your point of view.

    When it comes to color management and calibration, official and unofficial sources offer differing opinions: some say the native display gamma can be left "as is" (2.43 measured in my case), preserving the color depth, while others say the gamma should be set to a specific value (e.g. 2.2 or 1.8), which will give more accurate color rendition.

    As far as I've read, creating a 1D LUT for the video card (8-bit to 8-bit) to force the display to 2.2 or any other value instead of its native gamma will cut the colour depth (please see the chapter Limitations of monitor calibration).
    On the other hand, taking into consideration that I'll use the monitor to create/view material intended for print or for display on other sRGB-calibrated monitors, I suspect I should set the gamma to 2.2 (sRGB).
    I have to mention that my LCD is an 8-bit/channel P-MVA panel with no interactive gamma adjustment and no way to load an internal 1D LUT (the 1D LUT will be applied in the video card, using a Spyder4PRO or Colormunki Display device for calibration).

    How should I approach this? Which is the "right" color to see: the one at the monitor's native gamma or at some other value, leaving aside all the other variables (room lighting, color temperature, etc., assumed to be the same in all cases)?

    Any thoughts?
    Thank you,
    Hi bitdepth

    I use the SpyderPRO 5 for monitor profiling and it doesn't have a gamma adjustment; you have to go to the top-of-the-line SpyderElite to get that. This suggests to me that for the average user, adjusting your monitor gamma is not terribly important. If you are viewing an image with an embedded ICC profile using colour-managed software, then the colour management should look after gamma conversion if there is a difference between the gamma in the profile and that of the monitor. In a non-colour-managed situation, you are probably best off having your monitor gamma close to sRGB. With my SpyderPRO, gamma is set to 2.2, which is a reasonable approximation of the sRGB gamma that Ted describes.

    Dave

  4. #4

    Re: choosing the "right" gamma value

    Quote Originally Posted by xpatUSA View Post
    Where did you read that? As far as I know, neither color "depth" nor color accuracy has anything to do with gamma directly.

    sRGB gamma is not 2.2
    It is 2.4, with a linear portion near black that has a slope of 12.92.
    Thank you both for your replies.
    @xpatUSA
    regarding the color "depth" here:
    LIMITATIONS OF MONITOR CALIBRATION

    Unfortunately, there are limits to how accurately you can calibrate your display. With a digital display, the more you have to change your monitor from its native state, the more you will decrease the number of colors/shades that it can display.

    Regarding the sRGB gamma standard being quoted as 2.2 in different places:
    like here (please read the first line), here (please read the sRGB chapter), here, and so on...

    That's why I am confused, because I have indeed also read about 2.4 being the sRGB gamma value.

    @dje
    I was referring to the display hardware's capability to adjust the gamma value / apply a 1D LUT directly in the monitor, rather than via the video card.
    Anyway, I can use the Colormunki Display with DispcalGUI, so there is no problem.

  5. #5

    Join Date
    Nov 2011
    Location
    Brisbane, Australia
    Posts
    1,107
    Real Name
    Tony Watts

    Re: choosing the "right" gamma value

    I use the x-rite i1 display pro to calibrate my monitor, an Apple Mac. I have the option to enter the gamma that is used and the general recommendation is to use 2.2, which I do. However, I have been wondering whether that is optimal.

    One thing that makes me wonder is that in some images, other people (Mike Buckley in particular) see things that I can't see until I use a Levels adjustment in Photoshop to brighten the tones near the black end of the brightness scale.

    An example is the image in this thread. Who can see the texture of the background? I can't without modifying the image. I need some people with well-calibrated monitors to tell me whether I am at fault.

  6. #6
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    22,162
    Real Name
    Manfred Mueller

    Re: choosing the "right" gamma value

    Part of the issue is that you are looking at 10+ year old books.

    So far as I know, the information pertaining to Macs was correct until the release of Mac OS X 10.6 towards the end of 2009, when Apple joined the fold and adopted a gamma of 2.2 as well.

    I will agree with your contention that different computer screens do have different display characteristics, even after calibration and profiling. I routinely use three different computer screens: my laptop, which has an IPS touchscreen, and my desktop, which uses a dual-screen setup. My main screen is a wide-gamut IPS screen with a high level of AdobeRGB compliance (~99%), and the secondary screen is an inexpensive TN display. Side by side, all three screens display colours differently, even though they have been set up identically.

    Of course, I also do all my editing work in a room with subdued lighting (a basement office) and neutral-coloured walls. I have my screens set to the standard 120 cd/m2. I use the X-Rite ColorMunki (I retired my rather ancient i1 Pro last year).

    I personally wouldn't care where the LUT resides, as long as it is loaded correctly when my system starts up.

  7. #7
    dje's Avatar
    Join Date
    May 2011
    Location
    Brisbane Australia
    Posts
    4,636
    Real Name
    Dave Ellis

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post
    @dje
    I was referring to the display hardware's capability to adjust the gamma value / apply a 1D LUT directly in the monitor, rather than via the video card.
    Anyway, I can use the Colormunki Display with DispcalGUI, so there is no problem.
    No, I don't think you can directly access the LUT in most monitors, either manually or via software. It doesn't really matter though, as the profiling process looks at the overall response of the video card and monitor and produces a profile that contains a field called "Video Card Gamma". This is loaded into the video card's LUT to adjust the overall response of the card plus monitor to the desired gamma characteristic, so it doesn't matter what the native gamma of the monitor is.
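
    As a rough sketch (not X-Rite's or DisplayCAL's actual code, just an illustration of the idea), the correction loaded into the video card LUT for a measured native gamma of 2.43 and a target of 2.2 would be something like:

        native_gamma = 2.43   # measured response of the monitor
        target_gamma = 2.2    # desired overall response

        def vcgt_correction(v):
            """Value to load into the video card LUT for input v (0..1), so that
            the chain v -> LUT -> monitor follows the target gamma."""
            return v ** (target_gamma / native_gamma)

    Because the monitor then raises the corrected signal to its native power, the combination (v ** (target/native)) ** native works out to v ** target, whatever the native value is.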

    Dave

    choosing the "right" gamma value

  8. #8
    dje's Avatar
    Join Date
    May 2011
    Location
    Brisbane Australia
    Posts
    4,636
    Real Name
    Dave Ellis

    Re: choosing the "right" gamma value

    Quote Originally Posted by TonyW View Post
    I use the x-rite i1 display pro to calibrate my monitor, an Apple Mac. I have the option to enter the gamma that is used and the general recommendation is to use 2.2, which I do. However, I have been wondering whether that is optimal.

    One thing that makes me wonder is that in some images, other people (Mike Buckley in particular) see things that I can't see until I use a Levels adjustment in Photoshop to brighten the tones near the black end of the brightness scale.

    An example is the image in this thread. Who can see the texture of the background? I can't without modifying the image. I need some people with well-calibrated monitors to tell me whether I am at fault.
    Tony, I had a look at Mike's image (on my calibrated monitor) and I can't see much in the background either. I don't think you should be changing your gamma setting. As I said to bitdepth, the setting doesn't really matter if the image has an embedded colour profile and the viewing software is colour managed; the colour management will sort out any mismatch in gamma. If you are viewing with non-colour-managed software, a gamma of 2.2 is probably the best all-round value, as most images will be gamma-encoded with that value even if they don't have an embedded colour profile.

    Part of the issue might be that your monitor brightness is set too low?

    Dave

  9. #9

    Join Date
    Nov 2011
    Location
    Brisbane, Australia
    Posts
    1,107
    Real Name
    Tony Watts

    Re: choosing the "right" gamma value

    Quote Originally Posted by dje View Post
    . . .

    Part of the issue might be that your monitor brightness is set too low?

    Dave
    Firstly, apologies to bitdepth if I am taking his thread off on a tangent, but I think what I am talking about is relevant to his questions.

    Dave, my monitor brightness is set to 120 (units? cd/m2?), which is what the calibrator asks for, although I can't get fully controlled viewing conditions as Manfred does. I get the same image whether I open it with the Mac's preview program or Photoshop.

    It seems to me that consistency is more important than the actual values, so that the person who prepared an image sees it as the viewer does. I suppose that is what the LUT is supposed to achieve. I still have this nagging feeling that I should be sure I am seeing what others are meant to see, and that my own images are being seen as I intended. If my own system is deficient, would it help to get a different monitor?

  10. #10
    Black Pearl's Avatar
    Join Date
    Jan 2011
    Location
    Whitburn, Sunderland
    Posts
    2,422
    Real Name
    Robin

    Re: choosing the "right" gamma value

    Just a thought - but might it not be an idea to invest in a better monitor if you are concerned about profiling for printing and consistency?

  11. #11

    Re: choosing the "right" gamma value

    @dje,
    creating a profile for gamma 2.2 means that a 1D LUT will be created. This 1D LUT can be applied in two ways:
    - using the video card LUT (most often 8-bit -> 8-bit)
    - using the display's internal LUT (which may have much higher internal precision, e.g. 8-bit -> 14-bit)

    Here comes the problem. Assume that you have an 8-bit LCD which has no gamma correction at all (or does not allow the gamma to be set closely enough to the target) and also has no editable internal 1D LUT. If you calibrate that LCD for gamma 2.2, DisplayCAL will make it lose colour depth, because the panel offers no interactive gamma adjustment. Your LCD therefore won't even be 8-bit anymore (more like 7.5-bit, for example).

    If your entire workflow is 16-bit, there is zero sense in creating a 1D LUT. However, since most LCDs have extra contrast, you may set your LCD's gamma to compensate for it BEFORE profiling and then create a profile for gamma "As measured".
    Creating a 1D LUT for the video card (8-bit to 8-bit) will only cut your colour depth.
    (thanks to Pinhollow Euri)
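
    To make the loss concrete, here is a rough sketch (my own illustration, not Euri's code) that counts how many distinct output levels survive an 8-bit to 8-bit gamma-correction LUT built for a native gamma of 2.43 and a target of 2.2:

        import numpy as np

        x = np.arange(256) / 255.0                               # 8-bit input levels, 0..1
        lut = np.round(255.0 * x ** (2.2 / 2.43)).astype(np.uint8)

        print("distinct output levels:", len(np.unique(lut)))    # fewer than 256

    The endpoints 0 and 255 survive, but some neighbouring inputs round to the same output and some output codes are never produced, so fewer than 256 distinct shades per channel remain.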

    @TonyW, it's my pleasure; we are here to find and discuss things, right?

    Meanwhile I've found an interesting article that approaches this dilemma. Please read the chapter Setting gamma correctly. At this point, I am not sure whether the gamma specification called BT.1886 is taken into consideration only in video editing or can be used in still-image editing as well; are there some limitations?
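
    For reference, my understanding of the BT.1886 formula (a sketch only; the white and black luminance figures below are just example values):

        def bt1886_eotf(v, l_white=100.0, l_black=0.1, gamma=2.4):
            """BT.1886 EOTF: screen luminance (cd/m2) for a normalised signal
            v in 0..1, given the display's measured white and black levels."""
            a = (l_white ** (1 / gamma) - l_black ** (1 / gamma)) ** gamma
            b = l_black ** (1 / gamma) / (l_white ** (1 / gamma) - l_black ** (1 / gamma))
            return a * max(v + b, 0.0) ** gamma

    In other words, it is essentially a 2.4 power law with an offset that accounts for the display's real (non-zero) black level; it was defined as the reference EOTF for HDTV studio displays, hence its association with video work.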
    Last edited by bitdepth; 21st March 2016 at 12:59 PM.

  12. #12

    Join Date
    Nov 2011
    Location
    Brisbane, Australia
    Posts
    1,107
    Real Name
    Tony Watts

    Re: choosing the "right" gamma value

    Quote Originally Posted by Black Pearl View Post
    Just a thought - but might it not be an idea to invest in a better monitor if you are concerned about profiling for printing and consistency?
    Robin, thanks for your comment, but I don't want to spend $3000 on a new monitor unless I am sure I really need it. I have been getting good consistency in printing lately with a reliable local commercial printer, so that side seems OK.

  13. #13

    Join Date
    May 2012
    Location
    northern Virginia suburb of Washington, DC
    Posts
    19,064

    Re: choosing the "right" gamma value

    OFF TOPIC

    Quote Originally Posted by TonyW View Post
    An example is the image in this thread. Who can see the texture of the background? I can't without modifying the image. I need some people with well-calibrated monitors to tell me whether I am at fault.
    Considering that you're still struggling with that image, I'm beginning to feel bad that I posted it. You and I don't live all that far apart, so the next time you're just barely inside the Washington beltway, which is where I live, stop by so I can show you the image on my monitor.

  14. #14

    Join Date
    Nov 2011
    Location
    Brisbane, Australia
    Posts
    1,107
    Real Name
    Tony Watts

    Re: choosing the "right" gamma value

    Quote Originally Posted by Mike Buckley View Post
    OFF TOPIC

    Considering that you're still struggling with that image, I'm beginning to feel bad that I posted it. You and I don't live all that far apart, so the next time you're just barely inside the Washington beltway, which is where I live, stop by so I can show you the image on my monitor.
    You have no reason to feel bad - it's not your fault that I am so obsessive, and it's not the only example. I'm afraid it will be a long time before I get to travel near your place, but there must be some way of sorting out the question.

  15. #15

    Join Date
    Nov 2011
    Location
    Brisbane, Australia
    Posts
    1,107
    Real Name
    Tony Watts

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post

    . . .

    @TonyW, it's my pleasure; we are here to find and discuss things, right?
    Meanwhile I've found an interesting article that approaches this dilemma. Please read the chapter Setting gamma correctly. At this point, I am not sure whether the gamma specification called BT.1886 is taken into consideration only in video editing or can be used in still-image editing as well; are there some limitations?
    Thanks for the link. I have had a quick look and it looks interesting, but I haven't time to study it properly as I am preparing to leave early tomorrow morning for our Easter holiday.

  16. #16

    Join Date
    Feb 2012
    Location
    Texas
    Posts
    6,956
    Real Name
    Ted

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post
    Here comes the problem. Assume that you have an 8-bit LCD which has no gamma correction at all (or does not allow the gamma to be set closely enough to the target) and also has no editable internal 1D LUT. If you calibrate that LCD for gamma 2.2, DisplayCAL will make it lose colour depth, because the panel offers no interactive gamma adjustment. Your LCD therefore won't even be 8-bit anymore (more like 7.5-bit, for example).
    Thanks for the link, I now see the underlying concern of the original post.

    But, I still don't understand how adjusting the gamma of a monitor loses "color depth" - which I assume to mean bit depth (as opposed to color resolution). I guess my difficulty is that, for 8-bit, 0 in = 0 out and 255 in = 255 out irrespective of the chosen gamma. Thus, the input bit depth = the output bit depth, again irrespective of the chosen gamma.

    Could you explain in simple terms?

    I can see that color resolution could be affected in the low or the high tones, whichever is appropriate - I can't figure out which, but it looks like you could.
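
    A quick check (just a sketch, using the same hypothetical 2.43 -> 2.2 correction discussed above) shows where the duplicates and gaps fall:

        import numpy as np

        x = np.arange(256) / 255.0
        lut = np.round(255.0 * x ** (2.2 / 2.43)).astype(int)

        dupes = np.flatnonzero(np.diff(lut) == 0) + 1            # inputs that collide with a neighbour
        skipped = sorted(set(range(256)) - set(lut.tolist()))    # output codes never produced

        print("colliding inputs:", dupes)       # these occur from the mid-tones up to the highlights
        print("skipped outputs :", skipped)     # these occur in the shadows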
    Last edited by xpatUSA; 21st March 2016 at 04:31 PM.

  17. #17

    Re: choosing the "right" gamma value

    @xpatUSA, in my previous post I mentioned the (nick)name Pinhollow Euri. He is the author of this idea, which is mentioned in only a few places that dig deeper into the gamma-setting subject, including the Limitations of monitor calibration section here on Cambridgeincolour.com (as I mentioned above, that is where we find the statement: the more you have to change your monitor from its native state, the more you will decrease the number of colors/shades that it can display).

    Actually, that's why I've opened the subject here: in order to discuss it in detail.
    I'll try to explain in simple terms, quoting Euri again with special thanks.

    The lightness difference is not 1/255 in this case.
    If we look at these two graphs, we see on the right how well the sRGB gamma matches a perceptually uniform tone curve. It is close to uniform all the way from 50 up to 255, which means that almost every pixel value is distinguishable on a properly set-up LCD: 50 and 49 are just as different as 255 and 254. On a perfect sRGB monitor, at least.
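
    As a rough check of Euri's claim (a sketch using the published sRGB and CIE L* formulas, not Euri's own figures), one can compute the L* step between adjacent 8-bit code values:

        import numpy as np

        def srgb_to_linear(c):
            """Decode an sRGB-encoded value (0..1) to linear light."""
            c = np.asarray(c, dtype=float)
            return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

        def lightness(y):
            """CIE L* from relative luminance Y (0..1)."""
            y = np.asarray(y, dtype=float)
            return np.where(y > 216 / 24389, 116.0 * np.cbrt(y) - 16.0, (24389 / 27) * y)

        L = lightness(srgb_to_linear(np.arange(256) / 255.0))
        print("dL* from 49 to 50:  ", L[50] - L[49])
        print("dL* from 254 to 255:", L[255] - L[254])

    The two steps come out broadly similar in size, which is the point Euri is making.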

    choosing the "right" gamma value
    Last edited by bitdepth; 21st March 2016 at 06:24 PM.

  18. #18

    Join Date
    Feb 2012
    Location
    Texas
    Posts
    6,956
    Real Name
    Ted

    Re: choosing the "right" gamma value

    Quote Originally Posted by bitdepth View Post
    @xpatUSA, in my previous post I mentioned the (nick)name Pinhollow Euri. He is the author of this idea, which is mentioned in only a few places that dig deeper into the gamma-setting subject, including the Limitations of monitor calibration section here on Cambridgeincolour.com (as I mentioned above, that is where we find the statement: the more you have to change your monitor from its native state, the more you will decrease the number of colors/shades that it can display).
    I think we are just stuck on terminology. I said "I can see that color resolution could be affected in the low or the high tones, whichever is appropriate" and CiC's simple graph with 5 in and 5 out has a "bit depth" of, I guess, 6 - no matter what the tone curve is - so long as the curve starts at 0 and ends at 6. So, I'm happy with your above and happy to let it rest before other folks' eyes glaze over.

    Actually, that's why I've opened the subject here: in order to discuss it in detail.
    I'll try to explain in simple terms, quoting Euri again with special thanks.

    The lightness difference is not 1/255 in this case.
    If we look at these two graphs, we see on the right how well the sRGB gamma matches a perceptually uniform tone curve. It is close to uniform all the way from 50 up to 255, which means that almost every pixel value is distinguishable on a properly set-up LCD: 50 and 49 are just as different as 255 and 254. On a perfect sRGB monitor, at least.

    choosing the "right" gamma value
    I'm not certain that the graphs are relevant to our discussion, because the x-axis is in CIE Lightness L*, not bits, so I've no comment on those.

  19. #19
    dje's Avatar
    Join Date
    May 2011
    Location
    Brisbane Australia
    Posts
    4,636
    Real Name
    Dave Ellis

    Re: choosing the "right" gamma value

    Hi again bitdepth

    I'm afraid I don't have the detailed knowledge to comment on how much loss of bit depth you might get when you change the gamma setting on your system from say 2.4 to 2.2. Perhaps you could do some visual tests with suitable test images?

    I haven't seen this issue discussed much anywhere, which may indicate it's not much of an issue.

    Dave
