
Thread: Does "Gamma Correction" harm JPEGS quality?

  1. #41
    New Member
    Join Date
    Jul 2021
    Posts
    4

    Re: Does "Gamma Correction" harm JPEGS quality?

    Quote Originally Posted by Manfred M View Post
    Gamma correction is primarily to take the linear output from a digital camera and then to map it to the non-linear human vision.

    I guess I'm trying to figure out why you are changing the gamma off the standard 2.2?
    Gamma correction has nothing to do with the human visual response to light. Gamma correction is simply used to ensure the image is properly displayed (i.e. the same relative intensities that exist in the real world appear in the displayed image). Quite simply, since the display device has a built-in gamma, the image must have a pre-gamma applied so that the overall response is linear. Without gamma correction the system response would not be linear. The eye's response to light is inherently non-linear, and it always sees light in a non-linear manner regardless of gamma.
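
    A quick Python sketch of that "pre-gamma cancels display gamma" idea, assuming a simple 2.2 power law rather than the exact sRGB curve (the sRGB curve has a linear toe, but the principle is the same):

        # Hypothetical relative scene luminances used only for illustration
        scene_linear = [0.0, 0.05, 0.18, 0.5, 1.0]

        encoded = [v ** (1 / 2.2) for v in scene_linear]   # pre-gamma applied to the image
        displayed = [v ** 2.2 for v in encoded]            # display's built-in gamma

        # The two power functions cancel, so the displayed intensities match the scene
        for s, d in zip(scene_linear, displayed):
            print(f"scene {s:.3f} -> displayed {d:.3f}")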

  2. #42
    New Member
    Join Date
    Jul 2021
    Posts
    4

    Re: Does "Gamma Correction" harm JPEGS quality?

    To reply to the original question: gamma correction will not harm JPEG quality (assuming minimal quantization errors), as it is only a mathematical function. The function is undone on the receiving end (the monitor), so the net result is a linear response. The real issue is that JPEG is a lossy compression format, and that in itself will reduce image quality. It is always best to start with the RAW data (increased dynamic range) and go from there, especially if you are planning on doing any type of image enhancement.
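
    To put a number on the "minimal quantization errors" caveat, here is a small Python sketch of my own (assuming a pure 2.2 power law): the gamma transfer function itself is exactly invertible, and the only loss in the round trip comes from rounding to 8 bits. JPEG's lossy compression is a separate matter and is not modelled here.

        def encode(v, gamma=2.2):
            return v ** (1 / gamma)      # pre-gamma (camera/encoder side)

        def decode(v, gamma=2.2):
            return v ** gamma            # display-side gamma

        values = [i / 1000 for i in range(1001)]   # "continuous" linear samples

        # Round trip through the gamma curve with 8-bit quantisation in between
        roundtrip = [decode(round(encode(v) * 255) / 255) for v in values]
        max_err = max(abs(a - b) for a, b in zip(values, roundtrip))
        print(f"worst-case round-trip error with 8-bit quantisation: {max_err:.5f}")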

  3. #43
    Moderator Manfred M's Avatar
    Join Date
    Mar 2012
    Location
    Ottawa, Canada
    Posts
    22,257
    Real Name
    Manfred Mueller

    Re: Does "Gamma Correction" harm JPEGS quality?

    Quote Originally Posted by Ronbo View Post
    To reply to the original question, gamma correction will not harm JPEG quality (assuming minimal quantization errors) as this is only a mathematical function. This function is undone on the receiving end (monitor) so the net result is a linear response. The real issue is that JPEG is a lossy compression format and that in itself will reduce image quality. Always best to start with the RAW data (increased dynamic range) and go from there, especially if you are planning on doing any type of image enhancement.
    If you look at the history of the gamma curve, you will see that human vision was one of the key drivers behind its use. The history goes back to the early days of broadcast television, when engineers were looking for ways to improve picture quality in over-the-air broadcasts.

    They knew that human vision is most sensitive to the mid-tones and less able to discern fine highlight and shadow detail. By applying a gamma curve, they were able to boost the parts of the signal that mattered most to perceived image quality and downplay the parts that human vision could not differentiate as well. Put simply, why waste broadcast bandwidth on tonal detail the human visual system cannot distinguish particularly well when it could be spent on detail that it can?

    I have read that the "undoing" of the gamma curve in analogue CRT television receivers was largely accomplished through the selection of the phosphors used to display the image. I suspect that this may be an over-simplification, especially in later generations of television receivers, but regardless, CRT televisions, and by extension computer screens, had a non-linear response that resembled a typical gamma curve. In the early days Apple used a gamma of 1.8 while everyone else used a gamma of 2.2; everyone now uses a default of 2.2. My understanding is that the 1.8 came from the offset press printing industry, and as we moved away from those processes, Apple hardware moved over to the 2.2 gamma.

    With the introduction of digital displays, which are inherently linear devices, we have a situation where a gamma of 2.2 is applied to the content (for historical reasons) while the screen technology / display drivers have to compensate for that gamma. It is interesting to note that the Adobe "Melissa" colour space is effectively the ProPhoto RGB colour space with a gamma of 1.

    In terms of JPEG compression, I would both agree and disagree with what you have written. Most modern cameras produce 14-bit output (some medium format ones are at 16-bit), while JPEGs are reduced to 8 bits per channel; so much of the file size difference between raw and JPEG, especially at the higher quality settings, is nothing more than the data loss from that reduction. Given that most computer screens use 8 bits per channel (some higher-end ones can handle 10 bits per channel, which matters more for Adobe RGB compliant screens), we can't actually "see" the quality difference unless we start pushing the data hard in post and the resulting artifacts start showing up.
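
    The back-of-the-envelope numbers behind that bit-depth point, as a Python snippet (illustrative only; actual raw and JPEG file sizes also depend on demosaicing and compression):

        raw_bits, jpeg_bits = 14, 8
        print(f"raw levels per channel : {2 ** raw_bits}")    # 16,384 levels
        print(f"JPEG levels per channel: {2 ** jpeg_bits}")   # 256 levels
        print(f"ratio                  : {2 ** (raw_bits - jpeg_bits)}x fewer levels in the JPEG")
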
    Last edited by Manfred M; 6th July 2021 at 06:01 PM.

  4. #44
    New Member
    Join Date
    Jul 2021
    Posts
    4

    Re: Does "Gamma Correction" harm JPEGS quality?

    Thanks for the quick response. Interesting topic, and one that continues to cause a lot of discussion. Just to clarify a bit (I think): historically, pre-gamma was added to the front end of video cameras to increase the S/N ratio, as noise was a significant problem for early cameras when trying to capture low-level light. This pre-gamma did in fact take account of the human visual response to light, since our eyes are more sensitive to changes in low-intensity light than they are to changes in high-intensity light. Of course, to maintain a linear system, gamma was then needed at the back end. By coincidence, the CRT electron gun (not the phosphor) exhibited the necessary non-linear response, such that the overall "system" gamma was approximately one (linear). The designers of early TV systems took advantage of this coincidence since it ultimately reduced the cost of their systems.

    Today, of course, we do not need pre-gamma to increase the S/N ratio of a low-level analogue signal, but 1) we do need pre-gamma to perceptually encode the 8-bit intensity signal to minimize contouring and Mach banding, and 2) we need to be backward compatible with prior technology. Since we are still adding pre-gamma (although for somewhat different reasons), we then need to once again add gamma on the back end to ensure a linear system. Modern displays exhibit "mostly" linear outputs, so we artificially add gamma (~2.2) to our displays. Again, the "system" gamma needs to be approximately one if you want a faithful reproduction of the real world. There are, of course, reasons to vary slightly from a "system" gamma of one, but that is beyond the scope of this thread.
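
    To illustrate point 1), here is a small Python sketch of my own (assuming a pure 2.2 power law): with only 256 code values available, a gamma encoding devotes far more of them to the shadows, where our eyes notice contouring most, than a linear encoding does.

        shadow_limit = 0.01   # linear luminance treated here as "deep shadow" (arbitrary threshold)

        # How many of the 256 8-bit codes decode to a luminance below that threshold?
        linear_codes = sum(1 for c in range(256) if (c / 255) < shadow_limit)
        gamma_codes  = sum(1 for c in range(256) if (c / 255) ** 2.2 < shadow_limit)

        print(f"codes below {shadow_limit} with linear encoding   : {linear_codes}")   # ~3 codes
        print(f"codes below {shadow_limit} with gamma 2.2 encoding: {gamma_codes}")    # ~32 codes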

  5. #45

    Join Date
    Feb 2012
    Location
    Texas
    Posts
    6,956
    Real Name
    Ted

    Re: Does "Gamma Correction" harm JPEGS quality?

    Quote Originally Posted by Ronbo View Post
    ... Always best to start with the RAW data (increased dynamic range) and go from there ...
    Welcome to the forum, Ron.

    I was wondering how raw data has an increased dynamic range compared to a processed output, e.g. a JPEG.

    I am familiar with ISO 15739, but that didn't help much. (That standard has been criticized anyway on the basis that it puts the low end of the SNR characteristic at the point where the signal equals the temporal noise.)

    Trawling through Imatest today got me a link to an EMVA standard which (A) is free and (B) is far better than ISO for camera-related stuff, IMHO, albeit intended for machine vision devices.

    https://www.emva.org/wp-content/uplo...A1288-3.1a.pdf

    It still didn't help me with the subject of this post but it is provided for your information.
    Last edited by xpatUSA; 9th July 2021 at 01:16 PM.
