ajohnw, thanks for the link to Wikipedia. That article contains a sentence that isn't quite right, and feeds a lot of the misunderstanding:
"Gamma encoding of images is required to compensate for properties of human vision, hence to maximize the use of the bits or bandwidth relative to how humans perceive light and color."
(My emphasis.) Some people latch on only to the first part of that sentence, which I've highlighted. It's wrong, anyway. Gamma encoding is not required; it's useful when encoding in 8 bits or fewer, and it's largely irrelevant at 16 bits.
Gamma encoding does not "compensate for the properties of human vision"; it simply encodes images in a way that allows fewer bits to be used for a given signal-to-noise ratio. It's a piece of engineering expediency. A tone curve is applied before encoding, and that tone curve has to be removed (by applying the opposite tone function) before display.
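To make that concrete, here's a minimal Python sketch (my own illustration, assuming a plain power-law gamma of 2.2 rather than any particular standard's exact curve): the tone curve is applied before quantising to 8 bits, and the opposite function is applied to get back to linear values for display. The only point is that the curve spends more of the 256 code values on the shadows, where banding is most visible, so fewer bits are needed for the same perceived quality.

```python
import numpy as np

GAMMA = 2.2  # illustrative power-law exponent; real standards (e.g. sRGB) differ slightly

def encode(linear, bits=8):
    """Apply the tone curve, then quantise to the target bit depth."""
    levels = 2 ** bits - 1
    return np.round(np.clip(linear, 0.0, 1.0) ** (1.0 / GAMMA) * levels).astype(np.uint16)

def decode(code, bits=8):
    """Undo the quantisation and apply the opposite tone function for display."""
    levels = 2 ** bits - 1
    return (code / levels) ** GAMMA

# Six dark linear values: gamma encoding spreads them over codes 0, 31, 43, 52, 59, 65,
# whereas a linear 8-bit encoding would squeeze them into codes 0, 3, 5, 8, 10, 13.
linear = np.linspace(0.0, 0.05, 6)
print(encode(linear))
print(decode(encode(linear)))  # approximately recovers the original linear values
```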
A second purpose for using a tone curve (not usually exactly a gamma function) is to compensate for non-linear response in output devices (monitors).
These two unrelated purposes get confused because the tone curve applied before encoding is usually a gamma curve (though not exactly, in the case of sRGB), and the response of CRT monitors is (quite coincidentally) often approximately a gamma curve.
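For what it's worth, here's the sRGB transfer function written out as a small Python sketch. It's a short linear toe spliced onto a 2.4-power segment; together they behave roughly like a gamma of 2.2, but it isn't a pure gamma curve.

```python
def srgb_encode(linear: float) -> float:
    """sRGB forward transfer: linear toe plus a 2.4-power segment."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """Inverse of the above, applied (conceptually) before display."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4

print(srgb_encode(0.5))               # ~0.735, close to 0.5 ** (1 / 2.2) ≈ 0.729
print(srgb_decode(srgb_encode(0.5)))  # ~0.5
```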
Both are techy, engineering functions. Ideally they would be completely hidden from photographers, and we shouldn't even need to be aware of them. In fact, with colour management, we don't need to know anything about them: colour management deals with them, and we can forget all about them.
PS - I've just edited that sentence on the Wikipedia page, so it now reads:
"Gamma encoding of images is used to optimise the usage of bits when encoding an image, or bandwidth used to transport an image, by taking advantage of the non-linear manner in which humans perceive light and color."
Let's see if that change sticks. I think it's more accurate.