No. As a matter of fact, I just called and chatted with them. Their opinion is that 300 is the industry standard, that's what they print at, and they don't think it makes any difference whatsoever. Perhaps if you were using a very special paper, but they said they do all their printing at 300. I just made a print at 360 of one of the images I'd done at 300, and I can't see ANY difference. I'm sticking with my settings.
Thanks to all for the input.
and by doing so you may save some ink
One can also change the number of pixels in a file by upsampling or downsampling and saving it. I understand where you are coming from, but frankly I know of very few instances where I view ALL the data in an image file. When I view it on my screen, the image is downsampled, and for the most part, when I make a print, it is upsampled. The original pixels the camera has recorded are totally irrelevant other than when we have taken the shot and imported them to our computer for PP work.
Please re-read what I have written in #10. Think of it this way:
1. With very few exceptions (Sigma and Leica Monochrom cameras), a real world scene is recorded as RGGB data.
2. This data is converted into an image file with each data point being represented by a single RGB value.
3. The computer's display driver directs a single RGB value to three discrete display elements to display the value.
4. The printer uses a variant of the CMYK process to produce a print. It takes a minimum of four discrete dots to "emulate" the single RGB element of an image file. More advanced colour printers use additional colours; the Epson Stylus Pro 3880 that I use, for example, uses 8 different inks to represent a single RGB value, and other high-end inkjet printers can go to 10 (and possibly more) individual inks to create a single colour value. Likewise, advanced inkjet printers can vary the size of the ink dot they deposit (full-size and half-size dots), so there is no "standard" that printers work to; each manufacturer and each specific printer can perform differently. Printer resolution can be varied as well: my Epson 3880 can print at either 2880 dpi (high resolution) or 1440 dpi (low resolution). It takes 8 dithered ink dots to create a single "colour" on my printer, just as it takes 3 "dots" to create a single colour on my computer display. So while the printer has a resolution of 2880 dpi, it takes 8 ink depositions to make a colour, which gives me a total of 360 distinct colours per linear inch of paper. As Ted has put it so clearly, "dpi" seems to have taken on two different meanings, and in my view this certainly adds to the confusion.
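The arithmetic above can be checked with a short sketch (Python; the helper name is mine, the numbers are the 3880 figures just quoted):

```python
# Effective colours-per-inch on an inkjet: the printhead resolution
# divided by the number of dithered ink dots needed to build one colour
# value. The 2880 dpi / 8 dot figures are from the Epson 3880 example
# above; the function name is illustrative, not any real driver API.

def colours_per_inch(head_dpi: int, dots_per_colour: int) -> float:
    """Distinct colour values the printer can lay down per linear inch."""
    return head_dpi / dots_per_colour

# 2880 dpi head, 8 dithered dots per colour -> 360 colours per inch,
# which is why 360 ppi is often quoted as the Epson "native" resolution.
print(colours_per_inch(2880, 8))   # 360.0
# The same arithmetic at the low-resolution setting:
print(colours_per_inch(1440, 8))   # 180.0
```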
I'm not 100% sure either, but the resolution in the direction of the nozzle travel is twice as high as in the direction of the paper travel, so the ink seems to be deposited in an oblong pattern. The 3880 printer specs suggest a high resolution of 2880 x 1440 and a normal resolution of 1440 x 720. There is some dithering involved and there is also some "bleed" when the ink is deposited, and that seems to vary by paper type; more on matte papers than on glossy papers.
When printing, the size of the final output, based on the paper size, is the key factor. The image must be scaled to the appropriate size to fit the print medium, and this will almost never be the native resolution of the camera output.
This means the photographer needs to resize the image to fit the paper. There are two factors in play here: first, the width and length of the image file will rarely have the same ratio as the width and length of the paper the image is being printed on, so some adjustment (usually cropping) will have to be done for the fit. The second is scaling the image to fit the print size. Depending on the camera that was used and the print size, the image will have to be either upsampled or downsampled to fit. The printer driver can handle both of these operations, but the result will probably not be optimal, so we tend to do this using our PP tool of choice.
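The two steps described above (cropping to the paper's aspect ratio, then scaling to the print size) can be sketched like this; all names and numbers are illustrative, not any particular tool's API:

```python
# Step 1: crop the image to the paper's aspect ratio.
# Step 2: compute the pixel dimensions needed at the target ppi.

def crop_to_ratio(px_w: int, px_h: int, paper_w: float, paper_h: float):
    """Return the largest crop matching the paper's aspect ratio."""
    target = paper_w / paper_h
    if px_w / px_h > target:           # image too wide -> trim width
        return round(px_h * target), px_h
    return px_w, round(px_w / target)  # image too tall -> trim height

def pixels_needed(paper_w: float, paper_h: float, ppi: int):
    """Pixel dimensions required to print at the given ppi."""
    return round(paper_w * ppi), round(paper_h * ppi)

# A 6000x4000 (3:2) file printed on 10x8 paper must lose some width:
print(crop_to_ratio(6000, 4000, 10, 8))   # (5000, 4000)
# At 360 ppi a 10x8 print wants 3600x2880 pixels, so the cropped file
# would then be downsampled from 5000x4000.
print(pixels_needed(10, 8, 360))          # (3600, 2880)
```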
The other issue is that the printer does have a native mode, as was previously discussed. If the final image is not at the correct resolution for the specific printer, the driver will have to interpolate to bring the image to the correct resolution. This is, of course, on top of any resizing done in post. Again, in theory, the printer should be able to produce a sharper image if the image is set to the native resolution, but in practice I find that the print drivers are extremely good, so I am not sure anyone is going to notice the difference between a 300 dpi or 360 dpi resolution in the final print.
My Photoshop settings are Adobe RGB (1998), and I let Photoshop manage my color using the ICC profile of the paper I'm using, usually Moab Juniper Baryta.
You might wish to test working in the ProPhoto colour space and printing directly from the Photoshop files, especially if you have vivid colours in your shot. You might find the results pleasantly surprising. The main "problem" is that there can be surprises, even if you use a high end AdobeRGB compatible display, as your printer's gamut will be wider than your computer screen's gamut.
Dem's remark on printing at the original size got me playing with the printer. I just used IView and a Brother laser printer. I believe it's about the same for all printers.
Firstly, a printer can only print at its own settings. This printer can switch between 300 and 600 dpi(?).
The image I took was a jpg. I don't know whether it was created by CaptureNx or taken out of the NEF by IView. Anyway, the so-called resolution field was not filled in; it was blank.
Opening the image, the printing procedure gave me the chance to choose between several options, among which a) original size (from image dpi) and b) best fit to page (keeping aspect ratio). The last one is the one commonly used to print pictures.
a) Printing the image at original size with the dpi setting blank gave me blank pages. Changing the dpi in IView, easily done, changed the print format. The printer driver is correcting the image size by a factor of printer-resolution/image-resolution. That's new to me.
b) The image dpi settings are ignored; just the printer setting and the wanted size in mm.
a) assumes you didn't physically resize the image. Resampling?
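The scaling described in (a) can be checked with a short sketch (Python; function names are mine, the 300/600 dpi numbers match the laser printer mentioned above):

```python
# In "original size" mode the driver prints the image at
# pixels / image_dpi inches, i.e. it scales the pixel data by a factor
# of printer_dpi / image_dpi to get printer dots. Illustrative helpers:

def printed_inches(pixels: int, image_dpi: int) -> float:
    """Physical print length implied by the file's dpi tag."""
    return pixels / image_dpi

def dots_on_paper(pixels: int, image_dpi: int, printer_dpi: int) -> int:
    """Printer dots covering that length after the driver's scaling."""
    return round(pixels * printer_dpi / image_dpi)

# A 1200-pixel-wide image tagged 300 dpi prints 4 inches wide;
# at the printer's 600 dpi setting that span is 2400 printer dots.
print(printed_inches(1200, 300))      # 4.0
print(dots_on_paper(1200, 300, 600))  # 2400
```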
I learned something.
George
That is definitely not a correct assumption. Laser printers and inkjet printers use totally different technologies and cannot be compared directly.
Here are two screen shots of the Epson Stylus Pro 3880, showing the two quality levels it can use. The maximum quality setting is the upper limit, but given that dot size can be varied, high end photo inkjet printers can give different quality outputs. Likewise, I can set the print speed (faster = lower quality) that the print heads move and I believe I can also set uni-directional versus bi-directional printing for the print heads.
1. Best Quality setting
2. Good quality setting
My thought exactly. I don't see any advantage in using a smaller working color space if you intend to print. If you intend to display on a computer and have a wide-gamut monitor, it's another issue. Assuming you have your monitor properly calibrated, the software will convert from the working space to the space of your monitor as you work. The only surprise arises if your output device uses a different space than your monitor. One such case, as Manfred pointed out, is printing, so you may get surprises whatever you do when you print. The other case is if you have a wide-gamut monitor and are preparing images for display on the web, which will mostly be seen by people with sRGB monitors. In that case, I would want to reduce the image to sRGB before posting to see what others will see and possibly to make compensating edits.
I'm not discussing the technology, just how one comes to a result. The interface for my laser printer and the HP Photosmart are the same; only the printer properties differ. In both I can set different options, among them quality.
To me the basic idea is that one pixel is one pixel, either on the screen or on the paper. And when it's not a 1-to-1 situation, a recalculation has to be done before showing or printing.
This discussion was first about the dpi setting in the image. Now I know it's used to correct the calculations the printer does when printing at "real size". It's not exactly real size but the wanted metric size.
You and others too must distinguish between the image as a file and the printer as a piece of hardware.
George
Quote: "You and others too, must distinguish between the image as a file and the printer as a piece of hardware."

DPI is not a measure of hardware, although a given printer may work best at a given DPI. DPI is simply a scaling factor, mapping the metric used to describe files (a simple count, the number of pixels, which carries no information about physical size) to the physical dimensions of a print: pixels / DPI = length of print in inches.
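As a quick check of that formula, the same pixel data maps to different print sizes depending only on the dpi tag (a minimal sketch, not any driver's actual code):

```python
# pixels / DPI = length of the print in inches. The tag is pure
# metadata: changing it alters the implied size, not the pixels.

def print_width_inches(pixels: int, dpi: int) -> float:
    return pixels / dpi

print(print_width_inches(3600, 300))  # 12.0 inches at a 300 dpi tag
print(print_width_inches(3600, 360))  # 10.0 inches at a 360 dpi tag
```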
However, that wasn't the original question:
Quote: "Will I notice a difference if I print at 360?"

I am not an expert on print technology, but I think the answer for inkjet printers is "yes, there will be a difference, but not always enough that you will notice."
Is that last generalization correct?
George - I think you often get hung up on the file itself, which is nothing other than the medium used to craft the image. It's a bit like Michelangelo getting hung up on the piece of marble he used to carve the statue of David. The file is a means to an end, rather than an end in itself.
The only thing that really counts is what you do with the file, so your skills with the editing tool and, if you are printing, the capabilities of the printer itself are probably more important than the image file.
DPI is what the printer is able to do; it's a hardware setting. That knowledge is used by us to calculate the final dimensions under those circumstances. That you can play with that value in a calculator doesn't mean the printer can. It can't.
My reaction was to the introduction to the question.

George

Quote: "Most of my files are at 300dpi and I have printed a few of these with the new printer. They seem fine. Will I notice a difference if I print at 360? I really don't want to have to convert all my files (at least those I'm going to print)."
That is correct. I recently had to prepare some artwork for use on a 150 dpi printer. Some of the decals that we see on commercial vehicles are printed at 75 dpi. These are limitations of the hardware, and the dpi drives the scaling of the image. An image done on a high-end 360 dpi Epson photo printer might have to be upsampled to make the final print. The same image going on the side of a truck, printed at 75 dpi with UV-resistant inks, might have to be downsampled. Both of these requirements will affect how the image is prepared for final print output.
The generalization is correct, Dan. The differences might be negligible for photographs, but this is not necessarily true for other types of images. If you print a vector graphics image (i.e. one done with a tool like Adobe Illustrator), the edges of the graphics are quite hard, and one can see the differences in resolution between the quality modes. These printers are not just used for reproducing photographs.
Sorry, Manfred, I can't help but disagree with the analogy. The piece of marble is more analogous to the sensor prior to exposure, IMHO.
Quote: "The file is a means to an end, rather than an end in itself."

Assuming that, by "the file", you mean the raw image data therein, do we not strive to make that data as "good" as possible rather than relying on PP to make it good? Perhaps I misunderstand, but that's how it reads . . .
Quote: "The only thing that really counts is what you do with the file, so your skills with the editing tool and if you are printing, the capabilities of the printer itself are probably more important than the image file."

Got to disagree with those too, sorry again. Perhaps I'm old-fashioned in preferring a capture that needs a minimum of PP.
You can use any analogy you like Ted, but in this case, I prefer mine to yours. The sensor (and camera), lens, etc. are all tools in my view, and the file that my camera has captured is my raw material to craft an image. Michelangelo's piece of marble gets a lot more credit than the hammers and chisels he used to carve with.
No Ted, I meant file. The data is not accessible to me in any other way, unless I want to run around looking at the image on my camera screen. The file I download from my camera is what I start with when I prepare my images.
I also try to shoot that way, but the image created from raw data does need a bit of sharpening and contrast adjustment at a minimum. I also do a bit of straightening and cropping, when required, especially when printing, because my camera's sensor height-to-width ratio is not the same as any commercial paper size. I will also do a bit of dodging and burning; to quote Ansel Adams, "Dodging and burning are steps to take care of mistakes God made in establishing tonal relationships." Usually that is all I do with an image.
Dan, what you described is PPI (a software setting in an image file*). DPI is a hardware property of the printer - if the printer is set to print at, say, 360 dpi, it will always print at 360 dpi whatever file we throw at it.
Manfred made the same mistake at the beginning of this thread and used the term DPI when he meant PPI, until he showed a screenshot saying "Resolution: 360 Pixel Per Inch" as an image property.
* Computer screens and camera sensors have pixels, so it is PPI for them. Printers don't have pixels, they produce dots on paper, so it is DPI for them.
People in Adobe understand this difference and use "pixel per inch" in image settings. Unfortunately, people in Microsoft and Yahoo don't care about this difference, that's why "File properties" in Windows Explorer and "EXIF" on flickr both say "dpi" instead of "ppi". The fact is, there is neither ppi nor dpi supplied in the EXIF. There are two numbers "X Resolution", "Y resolution" (usually identical) and "Resolution Unit" (usually set to "inch" or "inches"). Then it is the responsibility of the EXIF viewer how to interpret and display these numbers - whether to show them as two numbers or as one, whether to put "dpi" after the number, "ppi" or just leave it there.
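A minimal sketch of that interpretation step (Python; the unit codes 2 = inch and 3 = cm come from the EXIF specification, but the function itself is my own illustration, not any viewer's actual code):

```python
# EXIF stores XResolution/YResolution as rationals plus a ResolutionUnit
# code (2 = inches, 3 = centimetres). The "ppi"/"dpi" label is added by
# whichever viewer displays the numbers, not stored in the file.

from fractions import Fraction

UNIT_NAMES = {2: "inch", 3: "cm"}

def describe_resolution(x_res: Fraction, y_res: Fraction,
                        unit_code: int) -> str:
    """One possible way a viewer might render the three EXIF fields."""
    unit = UNIT_NAMES.get(unit_code, "unknown unit")
    if x_res == y_res:
        return f"{x_res} pixels per {unit}"
    return f"{x_res} x {y_res} pixels per {unit}"

# Typical camera JPEG: 300/1 in both axes, unit code 2 (inches).
print(describe_resolution(Fraction(300, 1), Fraction(300, 1), 2))
# -> 300 pixels per inch
print(describe_resolution(Fraction(2880), Fraction(1440), 2))
# -> 2880 x 1440 pixels per inch
```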
https://99designs.co.uk/blog/tips-en...he-difference/
Dem - you are correct that PPI relates to screen resolution and DPI relates to printer resolution.
However, there is a "special case" that when you set the image resolution to the same value as the printer's native resolution, the printer driver does not have to scale / interpolate the image and the print quality will be maximized. I (and others) tend to get a bit sloppy with our language at times and don't always go through this full explanation. If you are using a Canon or HP 300 dpi printer, you should be working your image at 300 ppi and for Epson 360 dpi printers you should be working at 360 ppi.
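That special case can be expressed as a one-line check (a sketch; the function name is mine):

```python
# If the file's ppi already matches the printer's native value, the
# driver needn't resample; otherwise the factor shows how much
# interpolation the driver would have to do.

def driver_resample_factor(image_ppi: int, native_ppi: int) -> float:
    """1.0 means no driver interpolation; anything else means scaling."""
    return native_ppi / image_ppi

print(driver_resample_factor(360, 360))  # 1.0 -> Epson-native, no scaling
print(driver_resample_factor(300, 360))  # 1.2 -> driver upsamples by 20%
```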