As I recall, there wasn't nearly as much confusion when we used film. :) ;)
Yes - many don't seem to realise that the camera is capturing several more stops of information than any monitor or print can display without dynamic range compression.
Maybe someday someone will do a zone system for a PC screen.
To be honest, I think people using the "zone system" are possibly trying too hard; it's great in theory, but in practice - with the large DR of modern sensors - I feel it's far less relevant today so long as the exposure is relatively appropriate.
Not so sure about that, Colin, as after a fashion someone has already done a zone system for a PC screen; the comments for each zone aren't exactly true, but they do indicate trends. See
http://en.wikipedia.org/wiki/Zone_System
It shows zones on a PC screen. Some software packages use zones as a form of luminosity mask as well.
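As a rough illustration of that luminosity-mask idea - purely a sketch with evenly spaced zone boundaries and assumed luma weights, not what any particular package actually does:
Code:
# Sketch: split an image into Adams-style zones by luminance band.
import numpy as np

def zone_masks(rgb, n_zones=11):
    """Return one boolean mask per zone (0 = black ... X = white) for 0..1 RGB."""
    luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    edges = np.linspace(0.0, 1.0, n_zones + 1)
    masks = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        upper = luma <= hi if hi >= 1.0 else luma < hi  # include 1.0 in the top zone
        masks.append((luma >= lo) & upper)
    return masks

# Example: a random "image", reporting how much of it falls in each zone
img = np.random.rand(4, 6, 3)
for i, mask in enumerate(zone_masks(img)):
    print(f"Zone {i:2d}: {mask.mean() * 100:5.1f}% of pixels")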
Problems come when someone starts talking about stops as zones and, I'm afraid, about modern high dynamic range sensors. They have improved over time, but 11 EV? No way. It's all irrelevant anyway unless all of these quoted dynamic ranges can actually be manipulated sensibly onto a PC screen.
John
-
My personal opinion is that people who do that are getting a wee bit too caught up in the theory, at the expense of "real world" photography. Normally a sensor will capture more than enough shadow detail without needing to blow the highlights -- and at that point they may just as well process the shot visually. If they're shooting into full sun then it may well be a different story, but (in my opinion anyway) that just comes down to a bit of experience - a bit of experimenting - and perhaps a bit of bracketing to get what they're wanting.
In many ways folks can analyse things all they like, but at the end of the day, it is what it is -- and so long as their exposures are in the ballpark then it should be easily adjustable. Can't say that the zone system plays any part in my photography, and my shots seem to all turn out OK.
The real problem is books and the web. They are basically a source of income. They are full of instant fixes - do this and all will be ok.
:) It's interesting to compare now with when just film was around. Most people had taken photographs with some sort of camera at some point. The paperwork that came with the camera, or was in the packet of film, would mention any settings needed if the camera had any settings - cloudy, sunny etc. Some might progress to a camera with an exposure meter. The manual would generally say something like add a stop or two if backlit, etc. For many people the chemist would take care of the rest. DIY - buy the kit needed and read the instructions on the chemicals. I did buy a book at that point as I was interested in toning. Very thin and nothing but examples - no sign of lots of confetti to meet publishers' word count requirements. :rolleyes:
The world has changed a lot since then. There are far more people seeking income in this sort of area now and loads of people prepared to buy or click. Personally I feel that unless people get a bit of a grip on what is going on they are likely to be misled and worry about all sorts of things rather than finding out what their camera can do. Having some understanding of what is actually going on underneath it all can help with that aspect. :) At least then it's possible to realise why shots turn out as they do.
I can remember Colin weakening on the zone system when spot metering was being mentioned - "might take a reading and then correct based on what zone the bird should be in." It's still a useful concept for discussing this sort of thing, for what exposure is meant to achieve, and for the problems involved.
John
-
But where else can you find information? My neighbour doesn't know anything about photography.
Quote:
The real problem is books and the web. They are basically a source of income. They are full of instant fixes - do this and all will be ok.
George
What I am finding of interest is: where has Floridaboy been since the 12th post?
Cheers: Allan
You could try this site.
Thanks. Looks good but is also on the web.
George
Quote:
The real problem is books and the web.
I'm not so sure.
I hope FloridaBoy is coming back. He is for sure not the only one who has problems with this. I have seen this several times on this forum, and not always with good answers. Even in the dynamic range tutorial on the site you mentioned, the stop is used for the tonal range. And the contrast ratio, which is in my opinion a property of the display and not of a file.
I see the copy doesn't work that well. Understanding dynamic range: https://www.cambridgeincolour.com/tu...amic-range.htm
Quote:
Bit Precision of Analog/Digital Converter | Contrast Ratio | Dynamic Range (f-stops) | Density
 8 |   256:1 |  8 | 2.4
10 |  1024:1 | 10 | 3.0
12 |  4096:1 | 12 | 3.6
14 | 16384:1 | 14 | 4.2
16 | 65536:1 | 16 | 4.8
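For what it's worth, the quoted figures are just powers of two dressed up - a quick sketch of the arithmetic behind them (nothing here is measured from any real sensor):
Code:
# The table's columns follow directly from the bit depth:
# contrast ratio = 2^bits, "f-stops" = bits, density = log10(contrast ratio).
import math

for bits in (8, 10, 12, 14, 16):
    contrast = 2 ** bits
    density = math.log10(contrast)
    print(f"{bits:2d}-bit A/D: {contrast}:1 contrast, {bits} f-stops, density {density:.1f}")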
George
If you read all of it, as usual it's better than most - measuring the steps etc. as Colin pointed out somewhere above. The A/D doesn't directly relate to the dynamic range of the sensor. It also quotes some dynamic range limits for cameras. Certain medium format cameras may well have greater dynamic range at the sensor level than any full frame camera will ever reach. Even that has to be qualified. Sony, for instance, have recently gone backwards and produced a 12 MP full frame camera concentrating on ultra low noise for very high light sensitivity. Hope I remembered the pixel count correctly, but lower noise implies higher dynamic range.
The aspect that makes life difficult is the numbers themselves. As 8 bit is nice and short, the power-of-two boundaries are 0, 1, 2, 4, 8, 16, 32, 64, 128 and 255, as there is no 256. There is a huge difference between the counts at the bright end and the dark end, and noise counts far more at the low end than the high, although most modern cameras have chroma noise throughout the tonal range at high ISO settings - say over ISO 1000 on a well regarded crop camera used on an average scene. Shift to 10, 12 and 14 bit A/Ds and all that happens is that the numbers go up by a factor of 4 each time - noise included. This is the sensor, nothing to do with viewing the image.
The net result is easily seen on any camera. Take a shot exposing to avoid clipping a bright cloudy sky when slight shadows or even slightly darker areas can be seen in the scene. Our eyes adapt to this rather well - our f stop changes as we look around. The camera can't. When you PP the shot any darker areas will be a lot darker, and it will often be impossible to brighten them up to the level that was seen by eye as the colours shift etc.
All sorts of odd things are going on when a shot is processed from raw. On my E-M5, for instance, the RGB channels appear to be multiplied by factors of roughly 2.1, 1 and 1.6, as the transmission of the filters on the sensor sites varies, or to obtain correct colouration. Even the readings generated by light are likely to vary with colour. These factors change as the colour temperature is changed. As would be expected, if I brighten dark areas red artefacts usually show up first. It's not something funny Olympus do, as Nikon use similar numbers. On the face of it red has half the dynamic range of green and, as it has been boosted up, twice as much noise. More problems with little numbers from the sensor.
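To put rough numbers on that last point, here's a quick sketch assuming a 12-bit A/D and the approximate multipliers above - illustrative only, not the camera's actual processing:
Code:
# If a channel has to be multiplied up to balance colours, it runs out of
# headroom sooner (relative to green) and its noise is boosted by the same factor.
import math

FULL_SCALE = 2 ** 12 - 1                          # 12-bit raw counts, 0..4095
wb_multipliers = {"R": 2.1, "G": 1.0, "B": 1.6}   # approximate E-M5 figures quoted above

for channel, mult in wb_multipliers.items():
    headroom_lost = math.log2(mult)   # EV of usable range lost relative to green
    clip_count = FULL_SCALE / mult    # raw count beyond which the scaled value exceeds green's full scale
    print(f"{channel}: noise boosted x{mult}, scaled output tops out from raw count "
          f"{clip_count:.0f}, ~{headroom_lost:.2f} EV less range than green")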
:D What I reckon reviews should do is take a photo that needs the highest claimed dynamic range and show it full size, using the raw curve they used to obtain it. Otherwise the information is mostly just comparative, via the other results they give.
Who needs HDR techniques if cameras can really cope with 11 EV, or even 9? Why is there a need to bracket maybe +/-3 EV for HDR? Why does +/-1 EV usually make no useful difference other than maybe getting the exposure more correct? If there really were 11 EV available, many scenes could be recovered without that bracketing, and errors of 1 EV or even more in the right direction wouldn't matter in many cases.
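A back-of-the-envelope sketch of that bracketing point, using nominal per-shot figures rather than anything measured:
Code:
# If one exposure really covered N EV of usable range, a three-shot bracket
# at -step/0/+step would cover roughly N plus the bracket spread
# (assuming the individual ranges overlap).

def bracket_coverage(per_shot_ev, step_ev, shots=3):
    spread = step_ev * (shots - 1)
    return per_shot_ev + spread

for usable in (9, 11):
    for step in (1, 3):
        print(f"{usable} EV per shot, +/-{step} EV bracket -> ~{bracket_coverage(usable, step)} EV combined")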
John
-
I don't know what you are trying to say.
This thread is about dynamic range and the wrong assumption that you can split the tonal range into stops.
Dynamic range is the range of useful lightness the sensor can accept. Photographers use stops to quantify that range.
An A/D converter does nothing other than translate an analogue input into a binary output.
The tonal range is the number of steps the output is divided into, as a power of 2.
Noise is all over the sensor, as much in the darker parts as in the brighter parts. But since noise is bright, it's more visible in the darker parts.
Read that tutorial. On dynamic range it's so wrong. Or I'm missing the point.
I found an article with a wonderful image that visualizes the correlation between dynamic range and tonal range.
http://dpanswers.com/content/tech_zonesystem.php. If people don't want to read it all, here is the image
http://dpanswers.com/images/zonewedge.png
Yes, this is the internet.
George
Spain-Netherlands 1-5.
My last post was about claims that some cameras have 11 EV or more of usable dynamic range. Exposure values correspond to stops - factor-of-2 changes in ISO or speed - and also to a light measurement of sorts. Metering and focusing capabilities are often expressed in EV, and numbers like EV -2 crop up. If EV didn't have some specific meaning light-level-wise, exposures wouldn't be consistent across cameras :rolleyes: - unfortunately, due to different practices, e.g. grey levels, they may not be, but just forget that for a bit.
The original post was more sensible. An EV corresponds to a certain power-of-2 count on the A/D, giving a total range of 9 EV, and it seems this doesn't work to the OP's expectations. I don't find that surprising and have tried to explain why. One of the 'no detail' entries is also rather odd.
I've covered why sensors have problems at the dark end: noise, low number counts, and the profiles needed to actually see the EVs. Another way of looking at this area, and to go further, is to forget sensors and just consider TFT-type screens, as they have problems too.
The 8-bit output of these goes 0, 1, 2, 4, 8, 16, 32, 64, 128, 255. The first problem, thanks to the backlighting, is that black on these screens is above 0. Let's say it corresponds to 4, so black is shown with a count of 4 or lower. The next question is what increase in count produces the visual appearance of a 1 EV increase in light output. It might be 8, but the difference between 4 and 8 is only 3 additional shades. Maybe the next EV is actually 10 - better, there are now 5 additional shades. 12 is better still, with 7 additional shades. The important thing is that whatever the changes in counts are, they need to give an apparent increase in output of 1 EV and show the shades in between realistically. Loosely speaking, the next EV step is twice the previous one. Maybe cameras and screens use actual eye response.
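To put some illustrative numbers on that, a sketch assuming a simple gamma-2.2 display model - not a measurement of any real screen - showing how many 8-bit codes each stop below white gets:
Code:
# How many 8-bit display codes fall inside each stop below white,
# comparing a purely linear mapping with a simple gamma-2.2 model.
MAX_CODE = 255
GAMMA = 2.2

def codes_in_stop(stops_below_white, gamma=None):
    hi_lin = 2.0 ** (-stops_below_white)        # linear light at the top of the stop
    lo_lin = 2.0 ** (-(stops_below_white + 1))  # linear light at the bottom of the stop
    if gamma is None:
        hi, lo = hi_lin * MAX_CODE, lo_lin * MAX_CODE
    else:
        hi, lo = (hi_lin ** (1 / gamma)) * MAX_CODE, (lo_lin ** (1 / gamma)) * MAX_CODE
    return round(hi) - round(lo)

for stop in range(8):
    print(f"stop {stop} below white: linear {codes_in_stop(stop):3d} codes, "
          f"gamma-2.2 {codes_in_stop(stop, GAMMA):3d} codes")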
Switch to a sensor and the same applies. 1 EV could correspond to any count out of the A/D and then increase by powers of 2 to cover whatever dynamic range the camera can cover. The camera is calibrated in EV steps. One EV step at the bottom end could in principle be any number; it doesn't have to be a power of 2. The A/D just produces a number. There is nothing purely binary about that aspect, only the range. These numbers are then scaled to fit in with EVs and colour balance. There is nothing conveniently binary about the E-M5 colour scaling factors I mentioned. If the camera is going to produce colour balanced JPEGs it has to use them as well. It can also pick any scale it likes to represent EVs, just as a camera profile can when raw is developed.
So going back to CinC's staircase: the sensor produces an EV staircase. The difference between the steps is set by the sensor, and the step sizes in all probability even change with exposure level. The A/D measures them and produces some number. Software then corrects for colour and EVs.
Conveniently binary - some numbers and arithmetic methods can be done very, very quickly and easily in binary; some can't. The stops in the OP's post are very binary-convenient, which is probably why this sort of thing is often used to describe this area. Throw in even just colour balance and it all falls on its head, let alone obtaining a calibrated EV response.
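Pulling those pieces together, a minimal sketch of that chain - raw count, colour scaling, a simple gamma curve, 8-bit output - using the illustrative multipliers quoted earlier rather than any camera's actual processing:
Code:
# raw 12-bit count -> white balance scaling -> gamma mapping -> 8-bit code.
# Nothing stays conveniently binary once the multipliers and curve are applied.
FULL_SCALE = 2 ** 12 - 1
WB = {"R": 2.1, "G": 1.0, "B": 1.6}
GAMMA = 2.2

def develop(raw_counts):
    out = {}
    for ch, count in raw_counts.items():
        linear = min(count * WB[ch] / FULL_SCALE, 1.0)  # scale and clip to 0..1
        out[ch] = round((linear ** (1 / GAMMA)) * 255)  # simple gamma to an 8-bit code
    return out

print(develop({"R": 500, "G": 1050, "B": 650}))  # a roughly neutral mid-grey triplet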
John
-