I don't mean to make complications, but as informative as your response was, it was also confusing on this matter. I don't think I'll seriously try to print images myself any time soon, yet I have been saving my files in aRGB (though I have an sRGB screen, I assume), because this printing service seemed to say that files should be sent to them in aRGB: http://www.fineprintimaging.com/abou...rint_ready.htm. Besides, it seems odd that pro services use sRGB while individuals can print in aRGB and get 50% more colours, as you said here:

"From a printing standpoint, most commercial printers are sRGB only, so unless you are planning to print your own on a colour photo printer, you are going to have to output your images as sRGB, or, again, AdobeRGB images will look muddy printed by a commercial printer. I generally print my own, so I don't have this restriction, and yes, you can see the difference in the final product. The AdobeRGB print can produce roughly 50% more visible colours than sRGB and these tend to be the more brilliant colours, so it can be quite apparent in your end product."

If you can save files as aRGB even on an sRGB screen, then the only advantage to having an aRGB screen would be being able to work with those extra colours during editing. On the flip side, if you had an sRGB monitor and had a service print in sRGB, you wouldn't be missing the aRGB, because you couldn't make use of it, right?
I hope I am making enough sense that you can understand where I'm getting these questions.
I recently went wide gamut with a Dell U2413 (IPS) and placed my Dell 2320L (TN) next to it for a dual monitor setup. The aRGB screen is noticeably better: brighter colors, more saturation, and better resolution. The viewing angle of an IPS screen actually lives up to its claims, unlike a TN screen. If you print to your own inkjet, you should have a wide gamut monitor.
In that case you have found one of the less common commercial printers that can and do handle AdobeRGB. If you like their work and prices, great; your prints will look better.
This is an area where a lot of people have different views; let me give you mine.
If you can take the data that your camera has recorded, and it includes 100% of the colours that a person can see, why would you go ahead and throw out half of that data (AdobeRGB) or two-thirds of it (sRGB)? To me it's a bit like buying a full-frame camera and shooting in crop-frame mode. The excuse from many photographers is that (a) you can't see those extra colours, so why bother, and (b) because you can't see them, you might make a mistake. Garbage, in my opinion (I do all my edits in ProPhoto).
The basic reasoning is that you should not throw away any data until the very last step or so of your workflow; once you have thrown it out, that data is gone forever. So the things I do last in my edits are cropping (to screen or print size), output sharpening, and downsampling to my output colourspace. I will view my downsampled files and may end up doing a bit of colour tweaking to partially compensate for the loss of colours (generally by increasing the vibrance or saturation, sometimes both).
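To make that "convert last" ordering concrete, here is a minimal Python sketch using Pillow's ImageCms module (the LittleCMS bindings). It assumes a ProPhoto master file; the ICC profile path and file names are hypothetical, and this is one way to script the habit, not Manfred's actual tooling:

```python
# Minimal sketch: keep the wide-gamut master untouched, do geometry and
# sizing first, and convert to the output colourspace as the final step.
from PIL import Image, ImageCms

PROPHOTO_ICC = "/path/to/ProPhotoRGB.icm"   # hypothetical path; not shipped with Pillow

img = Image.open("edited_master.tif").convert("RGB")   # master edited in ProPhoto

# Build the working->output transform; an sRGB profile is built into LittleCMS.
prophoto = ImageCms.getOpenProfile(PROPHOTO_ICC)
srgb = ImageCms.createProfile("sRGB")
to_srgb = ImageCms.buildTransform(
    prophoto, srgb, "RGB", "RGB",
    renderingIntent=ImageCms.Intent.PERCEPTUAL,
)

# Crop and resize first (illustrative sizes), convert colourspace last.
out = img.crop((0, 0, 1800, 1200)).resize((900, 600))
out = ImageCms.applyTransform(out, to_srgb)
out.save("for_web.jpg", quality=92)
```

The master file keeps every colour the camera recorded; only the derived output loses data.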
The main arguments I've seen against this workflow are that one might make a mistake and forget to convert to the proper colourspace, or that one may get colours from downsampling that are somehow "wrong". Good workflow habits prevent the first issue, and I haven't seen the second (although I do have the example in a previous response where I don't like the image as much due to colour loss).
The colour aspect of aRGB and sRGB is a bit odd, Nick. The tutorial on CinC explains it well. Both have the same number of colours, but the aRGB ones are more widely spread. This relates to the bit depth of the colour channels. The way the specifications are laid out means the bit depth can be changed, but as far as images that can currently be posted on the web are concerned, both use three 8-bit colour channels.

There are two views about what aRGB was intended to do. One is to maximise the colours that can be printed using CMYK printing. The other, which goes back to another standard it was developed from (complete, it seems, with a mistake), is that it covers the largest gamut three 8-bit channels can cover without noticeable colour banding. For instance, this means there can be 255 steps of each pure colour. The algorithms used could also cope with three 4-bit channels, but that would only provide 16 steps, and banding would be obvious.
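To put rough numbers on that banding point, a quick Python sketch (the 1000-sample ramp is arbitrary):

```python
# An 8-bit channel gives 256 levels per primary, a 4-bit channel only 16,
# so a smooth ramp quantised to 4 bits shows obvious banding.
import numpy as np

ramp = np.linspace(0.0, 1.0, 1000)            # ideal smooth gradient, 0..1

for bits in (8, 4):
    levels = 2 ** bits                        # 256 vs 16 representable steps
    quantised = np.round(ramp * (levels - 1)) / (levels - 1)
    print(f"{bits}-bit: {len(np.unique(quantised))} distinct values")

# 8-bit: 256 distinct values -> steps too small to see in most images
# 4-bit: 16 distinct values  -> visible bands in any smooth gradient
```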
The colour steps in sRGB are smaller than those in 3x8-bit aRGB, so essentially it contains some colours that aRGB can't reproduce. sRGB was specifically designed for display on a monitor, mainly the cathode ray type.
Currently it seems that people can display 3x10-bit colour channels on an aRGB monitor, but as far as I am aware they can't post such images on the web, though there are image standards that would allow it. There are numerous image standards: TIFF, JPEG, JPEG 2000, etc., and even lossless JPEG. The one that may come to the fore eventually is JPEG XR, as lots of things seem to support it, though cameras don't produce it directly. Adobe products and some others already can. Windows Explorer, it seems, can also display them, as can Linux. There is also another format that uses 32-bit floating point colour channels. One thing's for sure: this area is still up in the air and might change in all sorts of ways at some point.
Most monitors are sRGB, so if someone posts an aRGB image, colour management in browsers converts it, with a catch: the aRGB image can contain colours that are out of the sRGB gamut, so those extra colours have to disappear. Conversely, if someone displays sRGB images on an aRGB system, some colours have to disappear as well; however, if the aRGB system had 10-bit colour channels this aspect would probably be irrelevant, though colour management would still have to do its stuff.
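A hedged sketch of what that conversion step looks like numerically, using the published D65 matrices for the two spaces and a simplified gamma treatment (pure 2.2 decode for AdobeRGB). This illustrates the clipping; it is not the exact code any particular browser uses:

```python
# Decode aRGB, rotate primaries via the standard matrices, then clip anything
# that lands outside 0..1 -- this is where out-of-gamut colours "disappear".
import numpy as np

ADOBE_TO_XYZ = np.array([[0.5767309, 0.1855540, 0.1881852],
                         [0.2973769, 0.6273491, 0.0752741],
                         [0.0270343, 0.0706872, 0.9911085]])
XYZ_TO_SRGB = np.array([[ 3.2404542, -1.5371385, -0.4985314],
                        [-0.9692660,  1.8760108,  0.0415560],
                        [ 0.0556434, -0.2040259,  1.0572252]])

def adobe_to_srgb(rgb):                       # rgb: floats in 0..1, aRGB-encoded
    linear = rgb ** 2.2                       # AdobeRGB decode (nominal gamma 2.2)
    srgb_linear = XYZ_TO_SRGB @ (ADOBE_TO_XYZ @ linear)
    clipped = np.clip(srgb_linear, 0.0, 1.0)  # out-of-gamut values clip here
    return np.where(clipped <= 0.0031308,     # piecewise sRGB encode
                    12.92 * clipped,
                    1.055 * clipped ** (1 / 2.4) - 0.055)

# A fully saturated AdobeRGB green lies outside sRGB, so channels hit the rails.
print(adobe_to_srgb(np.array([0.0, 1.0, 0.0])))   # R and B clip to 0, G to ~1
```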
If someone works in aRGB on an sRGB screen, strange things happen, as the aRGB numbers map to totally different colours in the sRGB gamut. It may be possible to PP the image to have the correct appearance in sRGB, but usually, even done well, it can result in a sort of Constable-painting effect. The term "flat images" is more usually used to describe it.
My personal feeling is that the whole area is a mess, and most monitors are sRGB by a huge margin. The mess may get sorted out at some point, but it will in all probability mean new equipment for many, and I have a feeling that the "standard" may not be aRGB in its current form. Most display panels cannot produce anything other than 8-bit colour channels without what is called dithering, which puts me off: basically they flash two colours to obtain the appearance of another. A personal thing; once, 8-bit colour was produced the same way. I saw mention of going wide gamut from a TN display. A huge difference would be seen moving from TN to a decent modern display even without the gamut increase.
Most commercial printers only accept sRGB images. As I am unlikely ever to buy a printer that can match their gear, I don't see any point in aRGB.
People sometimes mention throwing info away. The aim of PP is to move a deeper colour gamut from a camera into another, shallower gamut that has even less dynamic range. PP doesn't ever need to throw that away; the image is adjusted to give the required appearance in the chosen colour gamut. In that respect it doesn't really matter if it's aRGB or sRGB. This touches on why things are likely to change: it's perfectly possible to accurately display a gamut at all sorts of bit depths. All that happens is that the colour steps will be greater on displays with lower bit depth than on others. Providing that doesn't give objectionable results, it doesn't matter. The limit at the moment seems to be 3x8-bit covering an aRGB gamut, but the way things work, the input could equally well be 3x16-bit and still display "correctly" on a 3x8-bit display that covered the same gamut. It would also display on a 3x16-bit display, if that were possible.
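As a small illustration of that bit-depth-versus-gamut independence (the scale factor 257 comes from 65535/255; the function name is mine):

```python
# Requantising 16-bit channel values to 8 bits changes only the step size,
# not the range of colours covered -- the gamut endpoints map exactly.
import numpy as np

def requantise_16_to_8(ch16):
    # ch16: uint16 channel values 0..65535, in whatever colourspace
    return np.round(ch16 / 257.0).astype(np.uint8)   # 0..255, same gamut

print(requantise_16_to_8(np.array([0, 32768, 65535], dtype=np.uint16)))
# -> [  0 128 255]  (the gamut boundary values are preserved exactly)
```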
John
-
John - let's agree to disagree on this one. Your latest posting is full of hyperbole and inaccuracies. Please note AdobeRGB IS NOT JUST 8-BIT. Read the spec: 8-bit, 16-bit (both integer) and 32-bit (floating point).
http://www.color.org/chardata/rgb/adobergb.xalter
Your comments on commercial printers are also not quite accurate; Nick has found one that does do AdobeRGB. My bottom-of-the-line pro printer (Epson Stylus Pro 3880) handles AdobeRGB quite nicely, and while it is not blazingly fast (I'm not trying to crank out a print every couple of seconds), the difference in colours is noticeable when I print an sRGB file versus an AdobeRGB file, especially when the colours are out of gamut in sRGB (as per the example I posted previously on this thread). These commercial printing labs' machines are expensive to buy because of their speed and automation, not just because of the colour accuracy of their output.
I'm not the only one to claim that there is more colour depth in an AdobeRGB screen; Richard Lundberg has found the same as I have, according to his posting on this thread. While both you and I have some questions about dithering to create two additional bits of data, I have no question at all that it is better than what a pure 8-bit panel can display. That being said, the company that makes my screen, Dell, is headquartered in the USA, perhaps the most litigious country on the planet. While you and I may have some questions about how well dithering works, the fact that there hasn't been a lawsuit claiming false advertising suggests that the claims of AdobeRGB compliance are viewed by the company as legally defensible, and to some extent this assures me that these claims are more than just marketing points.
As I said in the post, it can be any bit depth, and even floating point colour in practice. Where I mentioned 24-bit aRGB I also mentioned IMAGES THAT CAN BE POSTED ON THE WEB, so while people can view 30-bit colour on a monitor, that's about it.
There are a number of other grey areas. How does a printer's gamut relate to the colour gamut used? When soft proofing aRGB, is 30-bit or 24-bit being used? Does the printer accept 10-bit, and if so, does it use the profile to convert to 24-bit or even some other number? The way the gamuts are specified, any bit depth could be used, even moving from, say, 24-bit to 30-bit or 32-bit floating point.

E.g., from the wiki, under "Color profile":

"Many JPEG files embed an ICC color profile (color space). Commonly used color profiles include sRGB and Adobe RGB. Because these color spaces use a non-linear transformation, the dynamic range of an 8-bit JPEG file is about 11 stops; see gamma curve."
Another example: someone linked to a ViewSonic monitor that accepts 36-bit colour over HDMI but displays it in dithered 30-bit colour. That 36-bit HDMI signal could be aRGB or sRGB or something else.
Yet another: what happens when the 48-bit colour in TIFFs or PNGs is displayed on a monitor with a lesser colour space? That one is pretty obvious, which is why I mention it.
John
-
These are only grey areas if you want them to be; I am trying to provide Nick with answers so that he can figure out what is right for him.
1. An AdobeRGB screen is going to be a better choice for editing, as you can always resample to an sRGB image. If you have an sRGB screen you can never view the full range of colours that an AdobeRGB display can deliver. Somehow my screen drivers have always found a way of displaying things, regardless of the numerous variables involved.
2. Safest to downsample to sRGB when displaying files on the internet.
3. If you print your own and let your software driver take care of the issues, the prints will come out fine. Unless a commercial printer tells you that they can handle AdobeRGB, assume sRGB. Frankly, I've only ever printed directly out of Photoshop and did not convert to any other format, and somehow the printer managed the ProPhoto file I was using. I've never printed a TIFF or PNG. For commercial printing, I've never used anything other than JPEGs.
Deliberately obfuscating things does not help Nick make an informed decision, which, after all, is the reason he started this thread.
I'm afraid I feel you are misleading Nick, Manfred, whereas I am trying to clear things up rather than hide them. The only interest I would currently have in aRGB is if I wanted to buy and use a higher-end printer. If someone wants their images to appear as intended on an sRGB monitor, they need to PP for that gamut. The colour management in browsers cannot do anything with colours that are out of the gamut being displayed; bang go the extra colours. Have it both ways if you like, but there is no way a monitor can display colours that are out of its gamut.
Perhaps there is a JPEG format that allows 30-bit colour images to be passed around on the web and viewed on a screen, but I am not aware of one.
Printing is an entirely different matter, especially DIY. I thought I had made that clear. That, to my mind, is the only reason for using aRGB, and it also means preparing sRGB images for the web with just as much care - unless of course someone doesn't care about that aspect.
John
-
Thank you all for your attention to helping inform me about these details in choosing monitors. It is great to have direct answers to particular questions which are hard to find via articles, and to get real users' experience on such matters.
If you shoot RAW: open your images in 16-bit ProPhoto on a wide gamut monitor, do your edits, print from Photoshop using "Photoshop Manages Colors" with the appropriate printer settings, and use perceptual rendering.
Most quoted numbers are a crock anyway, but to answer the question: no, higher isn't better. The ultimate case in point: the "duck's nuts" of monitors for post-processing are made by a company called Eizo. Their top-of-the-line units cost many thousands, and last time I checked, the contrast ratio on their top-of-the-line model was ...
... 800:1
I think you'll find that Relative Colorimetric rendering is used 90% of the time. Generally it's better to have a few (usually unnoticeable) colours clipped to the nearest reproducible gamut boundary than to have the colours of the entire image shifted to accommodate what are generally only a few out-of-gamut (OOG) colours.
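For anyone scripting this choice rather than using Photoshop's print dialog, here is a hedged sketch of the same decision via Pillow's ImageCms; the profile and file names are illustrative, and newer Pillow exposes the intents as the ImageCms.Intent enum (older versions use INTENT_* module constants):

```python
# The same aRGB->sRGB conversion built with the two intents under discussion.
# Relative colorimetric clips only the out-of-gamut colours; perceptual
# compresses the whole image to make room for them.
from PIL import Image, ImageCms

src = ImageCms.getOpenProfile("AdobeRGB1998.icc")   # hypothetical profile path
dst = ImageCms.createProfile("sRGB")

img = Image.open("print_master.tif").convert("RGB")

relcol = ImageCms.buildTransform(src, dst, "RGB", "RGB",
                                 renderingIntent=ImageCms.Intent.RELATIVE_COLORIMETRIC)
percep = ImageCms.buildTransform(src, dst, "RGB", "RGB",
                                 renderingIntent=ImageCms.Intent.PERCEPTUAL)

ImageCms.applyTransform(img, relcol).save("relcol.jpg")   # OOG colours clip
ImageCms.applyTransform(img, percep).save("percep.jpg")   # all colours shift slightly
```

Comparing the two output files side by side on a mostly in-gamut image makes the trade-off described above easy to see.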
Google "Duck Dynasty". Unfortunately it's an example of the modern day version of living the American Dream.
Thanks for the answer about the contrast ratio.