Howdy folks, Is there a difference between PPI & DPI? If so how does that affect the quality of a print or the size of the images? Thanks in advance.
They are very often, incorrectly, used to describe the same thing.
ppi = Pixels Per Inch is a variable number and a way of showing the number of pixels a digital file will have at a given print size.
e.g. a 3000x6000 (18 million pixels) file sized at 300ppi will print at 10x20 inches. If you were to print this at 20x40", the density would be 150ppi; at 5x10" it would be 600ppi...and so forth.
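The arithmetic above is just pixel count divided by print dimension in inches; a minimal sketch (the function name `ppi` is mine, not from any library):

```python
# Pixels per inch = pixel dimension / print dimension in inches.
def ppi(pixels: int, inches: float) -> float:
    return pixels / inches

# The 3000x6000-pixel file from the example, at various print widths:
print(ppi(3000, 10))  # 300.0 ppi when printed 10x20"
print(ppi(3000, 20))  # 150.0 ppi when printed 20x40"
print(ppi(3000, 5))   # 600.0 ppi when printed 5x10"
```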
dpi = Dots Per Inch is a fixed number, as it is the physical number of ink dots a printer uses to make up an image. Inkjet printers will often lay down thousands of these per inch, and the figure is one means of measuring the quality of the printer itself.
Not the only one, mind: printer A might have 9600x2400dpi but only use 3 colours and a black, whereas printer B may have 4800x1200dpi but use 6 colours, a grey, and a black cartridge, giving more realistic colours.
Have a look at this thread
DPI vs. PPI
It depends on the technology you're referring to.
As already stated, PPI refers to the number of image pixels (which can also be described as a dot) per inch. PPI is used when describing the physical rendition of an image on a device. That is...when we want to describe how much physical space is being used by an image, be it on a monitor or printed paper, we say that the image is being displayed or printed at 100, 300, or even 537.25 PPI.
DPI has two meanings. In relation to printers, DPI defines the number of physical dots of ink that the printer can produce in an inch. Within this meaning, there are two ways to view the dots. Some printers, typically large production "wet" printers, will have dots that match image pixels one-to-one. That is, one printer dot equals one image pixel. For example, a Durst Theta 76 printer has a DPI rating of 254 DPI (and it produces great prints.) Other printers, such as typical inkjets, will use multiple dots to recreate a single pixel from an image. Such printers will have an unqualified rating of 2400x9600 DPI (for example) but will typically also list a "black" rating (such as 600x600 DPI.) The black rating is usually the best one-to-one (one image pixel to one printer "dot") resolution you can get for color images as well. And again, it's because multiple printer dots are required to create a single pixel from an image.
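A rough way to see why the unqualified inkjet rating overstates the one-to-one resolution: divide the printer's dot grid by the image's PPI to get how many printer dots are available to render each image pixel (the function name and the numbers are just the example figures from above):

```python
# How many printer dots land inside one image pixel, per axis and in total.
def dots_per_pixel(dpi_x: int, dpi_y: int, image_ppi: int) -> float:
    return (dpi_x / image_ppi) * (dpi_y / image_ppi)

# An inkjet rated 2400x9600 DPI printing an image at 300 PPI
# has an 8x32 grid of dots per pixel to dither colours with:
print(dots_per_pixel(2400, 9600, 300))  # 256.0
```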
The second meaning of DPI is in relation to scanners. When a scanner scans a document it does so at a certain rate, such as 300 or 600 dots per inch (or higher.)
Image files have a parameter called DPI. Different software packages give this parameter different names, such as "resolution", PPI, or DPI. The correct label is DPI. The original purpose of this parameter is to store the scanning resolution. So if a document is scanned at 600 DPI, then the DPI parameter is set to "600" by the scanning software. This allows the document to be printed at its exact original size in the future, recreating the document.
When you capture an image, the camera may set the DPI value to 72, 100, 200, or even 300. In truth, it doesn't matter what it is. The camera is just entering in a default value so that there's something there. On an image from a camera, the DPI parameter is meaningless. All that matters is the actual X:Y values of the image.
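Putting those two posts together: the stored DPI tag only matters when you want "print at original size", where physical size = pixel dimension / resolution. A small sketch (the helper name `print_size` is mine):

```python
# Physical print size in inches = pixel dimensions / resolution.
def print_size(px_w: int, px_h: int, dpi: float) -> tuple:
    return (px_w / dpi, px_h / dpi)

# A 600 DPI scan of a US-letter page yields 5100x6600 pixels;
# dividing by the stored DPI recreates the original 8.5x11" size:
print(print_size(5100, 6600, 600))  # (8.5, 11.0)

# For a camera image, the same pixels print at whatever size you
# choose -- the DPI tag is just a default and carries no information.
```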
It's also worth noting that DPI and PPI are both units, not dimensions or quantities. The quantities they are usually used to measure are spatial printing-dot density and spatial sampling density; both have the dimension of inverse length, but DPI and PPI are by no means the only valid units for them. Lines per millimetre has the same dimension and is in common use for MTF. The association of a particular unit with a quantity is merely a custom, and in these cases has no particular physical basis.