Hello all,
I've been viewing many posts here for a while and I'm confused as to what HDR is.
I've seen all types; some photos are everyday pics and some look more like paintings.
Can someone please explain.
Hi Rodders,
It's a pretty easy concept when you get your head around it ...
There is a limit to the range of brightnesses that our cameras can capture in any one shot (called the "Dynamic Range"). If you want to capture a scene that has an even greater range of brightnesses (say bright sun and shadow detail inside a doorway) then it's not possible to capture the whole scene in 1 go ...
... if you setup your camera so that the brightest parts expose correctly, your shadow detail will be lost - but if you setup your camera so that the shadows expose correctly then the highlights will be too bright, and will "blow out" (go to pure white).
To get around this we take a series of shots - at different exposures - and then combine these into a High Dynamic Range image (or more correctly, we take a high dynamic range source and tone-map it into a normal dynamic range image that we can display or print).
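A minimal sketch of that combining step, assuming a linear sensor response and a hypothetical `merge_exposures` helper (real HDR software also aligns frames and handles noise; the triangle weighting is just one common choice, not the only one):

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge bracketed shots into one high dynamic range radiance map.

    images: list of float arrays in [0, 1] (same scene, different exposures)
    exposure_times: relative exposure time for each shot
    Mid-tone pixels are weighted most heavily, since near-black and
    near-white pixels carry little reliable detail.
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # triangle weight, peak at mid-grey
        num += w * img / t                  # img / t estimates scene radiance
        den += w
    return num / np.maximum(den, 1e-8)      # radiance map (unbounded range)
```

Note that the result has a range wider than any single shot, which is exactly why it then needs tone mapping before it can be displayed.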
Tone mapping is the key to processing the image into something that can look normal, or into something that looks more "arty" - either way it's still HDR. So please don't get into the habit of associating the "arty, ultra-tone-mapped" look with HDR; you can get that look with a normal dynamic range image, and true HDR images can also look like normal range images, so the two aren't really related.
I wrote a bit about tone mapping the other day, which should help to explain it a bit more for you.
Does this help?
HDR or high dynamic range photography is a process where an image contains more luminance detail than traditional photography.
Human vision has a dynamic range of roughly 1:1,000,000,000, yet a JPG image file, which uses only 8 bits per channel, yields about 1:255 - quite far from what your eyes can achieve.
In HDR photography, multiple images with differing exposures, for example -6EV, -3EV, 0EV, +3EV, +6EV, each with a limited dynamic range, are combined by HDR image processing software to achieve a single image spanning the full tonal range of the parent source images. The goal is to achieve a look more in line with the range of human vision.
This is great if you examine and view the HDR image in a software package which can achieve some degree of tone mapping, but since most computer displays and web image formats are limited to 8 bits per channel, the dynamic range ends up being compressed and details lost.
If I posted a HDR image here, it would be rendered with the traditional 1:255 dynamic range. Apart from the level of detail between dark and bright areas of the image showing evidence of all being properly exposed, an HDR image would otherwise look like any other.
I'm starting to wonder if HDR stands for "How Daves Relax", given how many Daves we have around here.
So if I understand this correctly, my monitor is capable of displaying a very high brightness but at the expense of excluding a lower one, because it has to stay within the 0-255 range?
What's the brightest that a monitor can display then?
Whilst technically accurate, in my opinion it's a really dangerous way of looking at it.
Dynamic range is a RATIO, not a level. Specifically, the ratio of the level of the brightest detail that can be displayed or captured compared to the darkest detail that can be displayed or captured, often expressed in stops.
With paper it's usually around 4 stops; with monitors it's usually around 5 or 6 (up to around 11 with very expensive speciality monitors); and with cameras (when capturing RAW) it's often around 11 to 12 (at their base ISO).
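For what it's worth, converting a contrast ratio into stops is just a base-2 logarithm; a quick sketch (the ratios below are illustrative round numbers, not measurements of any particular device):

```python
import math

# Dynamic range is a ratio, not a level; expressed in stops it is
# simply log2(brightest detail / darkest detail).
def stops(brightest, darkest):
    return math.log2(brightest / darkest)

print(stops(100, 1))    # ~6.6 stops, in the ballpark of a typical monitor
print(stops(4096, 1))   # 12 stops, roughly a RAW capture at base ISO
```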
Increasing the brightness of a monitor (by using a more powerful backlight) unfortunately, also increases the level of the "blacks" because at "maximum light blocking" (whatever the technical term for it is), some light still gets through (if all light were absorbed you wouldn't be able to tell that the screen was even there).
In making the reference to 1:256 above, it's just the ratio of the number of steps that each pixel of the monitor can respond to; normally the lowest level corresponds to a dark gray, and the 256 end somewhere in the vicinity of 200 cd/m² - but nothing says that "1" couldn't correspond to a black hole, and 256 to something brighter than the sun.
Does this help, or have I made it worse?
Hi "Reign",
Firstly, welcome to the CiC forums.
Secondly, I'll admit I don't have the fullest understanding of the technicalities, but my simplified explanation may nevertheless help.
One thing to remember is not to confuse the absolute maximum brightness a monitor can display (in say candle power - as you'd measure the brightness of a torch bulb), with anything else. Maybe you were not, but just to dot the i and cross the t, I thought I'd mention it.
The other thing to grasp is that the lowest or dimmest thing it can display - or, more accurately, that you will be able to see - is very dependent upon your viewing conditions. In the extreme, we all appreciate that watching TV with the sun shining on the screen is almost impossible. The same applies to your computer monitor, even in less bright conditions; for the absolute best range, you need to be in almost total darkness to critically assess the shadow areas in an image.
Most people don't go this far though, at least not amateurs. However, the serious ones will take control of the illumination in the room while they are editing pictures, because if there is light, it will have a colour temperature. If it is daylight, this will vary depending on whether it is sunny or cloudy; at night, it may be fluorescent or tungsten. Without such control, they cannot hope to be consistent in their white balancing of images if doing it by eye (rather than using the eye dropper on a known reference in shot).
There is just SO much possible variation dependent upon monitor brand and the interface between computer and monitor (which can be digital or analogue) that any discussion of bit ranges is dangerous territory, for me at least.
Not sure if that helps,
PS Colin is much better at this than I
Last edited by Dave Humphries; 22nd July 2009 at 08:36 AM. Reason: Now I have just seen Colin's reply
Regardless of the dynamic range each medium has, the critical issue is understanding within what range you can see sufficient detail. Everyone says to keep histograms in the middle, or just right of middle, without extending to either extreme. HDR is a method to extend this useful range by simply expanding the number of available brightness values. A trade-off eventually enters along the way, when the HDR image must be scaled or gamut-mapped to the display medium, whether some form of monitor or hard copy.
If you have, or are familiar with, the book The Negative, Ansel Adams explains the Zone System of exposure, where the exposure is chosen to ensure that the desired areas of the scene have sufficient detail.
This may sound redundant and look like two ways of explaining the same thing, but there is a subtle distinction. One is a way to have your cake and eat it too, the other maximizes the capability of the method being used.
Now, for something practical and to look at.
A while back, I shot and processed a series of infrared exposure tests performed with my Sigma SD14. While processing the photos in Sigma Photo Pro (SPP), I noticed that a ±1EV exposure change performed in the SD14 did not correspond to an even and linear step of the SPP's histogram. A histogram peak near 128, middle tone, made a large displacement while peaks towards the upper or lower ends made small displacements for the same ±1EV exposure change. From this behavior, I realized that the mapping of camera exposure to software image space follows a base 2 log scale with 0 being represented by 128.
To get a better sense of how dynamic range is handled by various output media, i.e. monitors, printers, etc, and the mapping between camera exposure to recorded image, I decided to create a brightness scale which utilized base 2 log steps corresponding to the behavior seen in my IR tests:
Exposure.jpg
Here is the Exposure Scale uploaded to the forum in JPG format
Exposure.png
Here is the Exposure Scale uploaded to the forum in loss-less PNG format
(Left Click the thumbnail to call up a full size view. From there, you can save with a right click on the image.)
First off, I doubt anyone will see much distinction between brightness levels at the extremes of this scale, and even within the middle tones near the center, most will not have far to go before a brightness level looks like its neighbors. For this HDR discussion, this is exactly what we want to be able to see.
I created this exposure scale in Paint Shop Pro, version 7.02, since I wanted the ability to work with the exact numbers without any bias contributed by gamut mapping, gamma, or color management. The values used in this scale were calculated as floats and rounded to the nearest integer. This is why the center 0EV square has a value of 128 (the scale maximum 255 ÷ 2 is 127.5, which rounds to 128). Any mathematical inconsistencies are due to this rounding. Note that the rounding was only performed once, as the last step of computing each value, to minimize accumulated round-off errors.
I computed the brightness values by starting from the middle gray center square, with an image value of 128. The scale follows different functions when traversing to the left and to the right. When heading left, each successive square is ½ the brightness of its predecessor, as seen in the values of the Brightness Increment located at the bottom of the scale. This function would not make much sense when heading right from center, since a 2X doubling of 128 would be 256, beyond the highest possible value of 255. Instead, each successive square's brightness is increased by only ½ the brightness of the corresponding square on the negative side of the scale. Just compare the Brightness Increment heading left and right from the center 0EV value.
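As a sketch (not the actual Paint Shop Pro procedure), the square values described above could be computed like this; the halving-left rule makes the positive side a mirror of the negative side about the 0 EV midpoint:

```python
# Hypothetical reconstruction of the Exposure Scale values:
# start at 128, halve for each stop on the negative side, and on the
# positive side add half the value of the matching negative-side square
# (equivalently, mirror the negative-side value about 256).
def scale_value(ev):
    v = 128.0 / (2 ** abs(ev))   # brightness of the -|ev| square
    if ev >= 0:
        v = 256.0 - v            # mirror about the midpoint for +EV
    return round(v)

for ev in range(-7, 8):
    print(f"{ev:+d} EV -> {scale_value(ev)}")
```

Since every intermediate value here is an exact power of two, the rounding step is trivial; the ±7 EV endpoints come out as 1 and 255.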
When viewed through your web browser, or downloaded and viewed with your favorite photo software, this scale will allow you to see the effective dynamic range each medium provides. You will easily see just how small a step in brightness your monitor can resolve, and the ranges where image detail is maximized or lost. In the histogram, the brightness squares are the sharp spikes; the cyan text contributes the other histogram content. The photo software's histogram will also show the positions of the brightness squares, allowing you to "calibrate" the histogram to exposure. Starting from the center 128 brightness square, just traverse left and right until you can no longer discriminate any differences. It's between these lower and upper limits, which may be well within your histogram, that any parts of an image which are brighter or darker end up lacking detail.
On my Acer monitor, viewed with Firefox with Color Management enabled, my limit on the dark left side is 16 and on the bright right side is 247. That means any image values between 0 and 16 end up as black, and any between 247 and 255 end up as white.
I have also viewed this exposure scale in Photoshop and the histogram matches that displayed in SPP. Since I'm brand limited (my only dSLR is a SD14), you'll have to verify how well this scale matches the performance of your equipment. Since the range of software I've used demonstrates similar histogram interpretation, I expect that it will be close.
When you print the Exposure Scale, you will be able to see the same performance parameters of your prints. It does not matter if you adjust a RAW or HDR image just right to maximize dynamic range to ±7EV if your printer will only show sufficient detail for ±5EV.
A surprise that I had when I finished creating this Exposure Scale was seeing an imbalance in my monitor, where the left of center fell off to black faster than the right of center fell off to white. This shows that I need to adjust my monitor's brightness and contrast for a better balanced dynamic range. I suspect this may be a factor in the "Dark Print" problem being discussed in another thread.
Last edited by Steaphany; 22nd July 2009 at 04:01 PM.
Thanks for welcoming me and thanks for taking the time to provide in depth explanations, I really appreciate it. I'm still trying to take it all in but several questions did come to mind.
Colin - I think I may have developed an incorrect understanding of "range", could you elaborate more on the "ratio" concept? You're implying that the brightness of a black hole and a sun can exist within the same image? That really throws me off, lol. I think I'm really missing something here...
Dave - What you see is relative to your surroundings.. very interesting!
Steaphany - I'm still working on yours, lol
Yep - they sure can, although of course we wouldn't have a way to display the image.
There's really two things going on here ...
Think of dynamic range in terms of the ratio of the brightest to the darkest detail that we can capture or display. It's pushed to the limit with images like this one of mine ...
... the setting sun still has a lot of power in it, and yet the faces of the rocks towards the camera are in the shadow area, so the ratio of the brightest detail (sun just below the horizon) to the darkest detail (gaps between rocks in the foreground) is right on the limit of what the sensor can handle (although in this case I bought myself 3 stops of safety by using a GND filter) (keeping in mind though that you're looking at the finished result here, not the unprocessed originals). In a way this range can be compared to the height of a staircase.
The other thing that often sneaks into these types of discussions though is
"the number of steps on a staircase", and frankly I hate it because it's not relevant (it's really only relevant in discussions of analog to digital converter resolutions). So in terms of defining things with 8 bits (as used by traditional JPEGs) there's nothing to stop level 1 representing the level of light emanating from a black hole - level 2 representing the power of the sun - and level 256 representing something with 256 times the power of the sun ... it's just a representation system where 1 and 256 represent each end of some range - what the absolute values actually are isn't important. In terms of photographic display of "real world" images, all you need to know is that the human eye can only differentiate between around 200 levels on a monitor or (presumably) on paper - but it's not relevant to dynamic range discussions.
Again, in real-world terms, HDR is a capture challenge (something we can use a variety of techniques to work around) - it's also a display issue, but nothing us mere photographers can do about that
Reign,
Working with photography, you're probably already familiar, to some extent, with ratios.
For example, if your camera's shutter is set to 1/200th and you shorten it to 1/400th, ½ as long, you have reduced the exposure by 1 Exposure Value (EV). Likewise, an increase from 1/200th to 1/100th is an increase of 1 EV.
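As a sketch, the stop change between two shutter speeds is just a base-2 logarithm of their ratio:

```python
import math

# EV change between two shutter speeds: each halving of the exposure
# time removes one stop of light, each doubling adds one.
def ev_change(old_time, new_time):
    return math.log2(new_time / old_time)

print(ev_change(1/200, 1/400))   # -1.0: half as long, one stop less light
print(ev_change(1/200, 1/100))   # +1.0: twice as long, one stop more light
```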
Our eyes possess a dynamic range which greatly exceeds our cameras'. For example, try to photograph a dark room with bright exterior light from the windows, where neither the room nor what's seen through the windows is under or over exposed. The problem with such scenes is that the darkest and brightest aspects exceed the dynamic range of the camera and/or file format.
Based on the calculations of my posted 8 Bit Exposure to Image Value Scale, I've done some calculations on the dynamic range an image has based upon bit depth:
Code:
8 Bit Integer    ±7 EV    JPG
12 Bit Integer   ±11 EV   Cameras with 12 bit ADCs
16 Bit Integer   ±15 EV
32 Bit Integer   ±31 EV
64 Bit Integer   ±63 EV
Remember, a ±31 EV corresponds to the ratio of brightest, +31 EV, to darkest, -31 EV, that the bit depth can support.
I have qualified these calculations for images containing integer values. Some specialty image processing programs, like AIP4Win, are capable of floating point image manipulation and file storage. For example, AIP4Win performs all calculations with 32 bit floating point math, allowing maximum image values of ±1E32 with an image brightness resolution as small as 1E-32. For stored files, AIP4Win supports up to 64 bit signed integer and 64 bit floating point bit depths.
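The integer rows in the table follow a simple pattern; a tiny sketch of it, assuming the midpoint-at-0 EV convention used in the Exposure Scale:

```python
# An n-bit integer image holds 2**n levels; with the midpoint taken
# as 0 EV (middle grey), that spans ±(n - 1) EV around the center.
def ev_range(bits):
    return bits - 1

for bits in (8, 12, 16, 32, 64):
    print(f"{bits} bit integer -> ±{ev_range(bits)} EV")
```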
Last edited by Steaphany; 23rd July 2009 at 05:47 PM.
Hi,
I published a post about HDR some time back on my blog. I'm definitely not a professional photographer but here's a budding photographer's understanding of the whole concept:
HDR stands for High Dynamic Range, and it is a style of photography that is getting popular in the world of digital photography these days. A regular photo taken with a single shot has a limited dynamic range. This is how I understand it: imagine the darkest and brightest tones possible in this whole universe representing the far left and right ends of a histogram respectively. A camera's sensor is not able to capture the entire range of tones on this histogram. There is a point on the left beyond which everything is recorded as black and, likewise, a point on the right beyond which everything is recorded as white. The range of tones between these two points is the sensor's dynamic range.
In a bright scene the dynamic range shifts to the right, and correspondingly in a dark scene it shifts to the left. The problem occurs when there is a scene with high contrast - some parts are very bright and others are very dark. With regular photos, setting the camera to correctly expose the bright parts will cause the dark parts to be underexposed. On the other hand, setting the camera to correctly expose the dark parts will cause the bright parts to be overexposed. Setting the camera's exposure in between the two was the best solution before the advent of HDR photography; it resulted in slightly overexposed and underexposed parts, but it was the lesser of evils.
Read more...
Hi Joel,
The only "nit picking" comment that I'd like to make (but fundamentally it is very important in a lot of discussions) is that we currently have no way to actually display an HDR image; the techniques of capturing an HDR scene are well understood - but what pops out the other end of the program is always a low dynamic range image (4 to 6 stops, ie the same as every other "normal" image).
So in essence it's a 2 part process: In part one we capture the required dynamic range with multiple shots (ignoring other techniques like GND filters) and then in part two we take this computer representation of the high dynamic range capture and tone-map / compress it into a "normal" (ie "low") dynamic range output that can be printed (about 4 stops max) or displayed on a monitor (about 6 stops max) (unless you happen to own something like a BrightSide DR37-P HDR display).
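As an illustration of part two, here is a minimal global tone-map sketch using the Reinhard operator L/(1+L) - chosen here purely for illustration; real HDR packages use far more sophisticated, often local, operators, and the gamma value is an assumption:

```python
import numpy as np

# Minimal global tone mapping: compress an unbounded radiance map
# into [0, 1) for an ordinary low dynamic range display.
def tone_map(radiance):
    radiance = np.asarray(radiance, dtype=np.float64)
    mapped = radiance / (1.0 + radiance)   # Reinhard: compresses highlights smoothly
    return mapped ** (1.0 / 2.2)           # rough display gamma (assumed 2.2)

print(tone_map([0.01, 1.0, 100.0]))
```

The key property is that no input, however bright, ever reaches pure white, while relative ordering of tones is preserved - which is the "compress into a normal range" step described above.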