Yup, I have a Nikon, and yup, it works fine at f/8 and it shouldn't... so nope, no roll of tape required.
Thank you.
More data along the same lines about Nikon AF limitations.
WW
The concept of "phase detection" has long been a pet peeve of mine, and I think that we may change it with very little fuss.
As pointed out, it does not detect phase, it has nothing whatsoever to do with phase, and mentioning phase in the same sentence is about as misleading as suggesting that focusing depends on the phase of the Moon.
Phase has a well-established definition in optical theory. Optical theory has three branches: wave theory, particle theory and ray theory. The phase of an electromagnetic wave governs some properties that are important to us, but sometimes, as for example in the realm of focusing systems, it is completely irrelevant. When working out the size of the Airy disk and interference patterns, phase is a crucial factor, and phase is also responsible for the formation of Newton's rings and the colours in the feathers of mallards.
But a slight change in the acronym definition makes everything crystal clear, and I think we could change it.
PDAF, the focusing method in many cameras, should NOT mean "Phase Detection Auto Focus". It is a much simpler concept: Parallax Detection Auto Focus. Triangulation uses parallax to find the angle that the triangulation base subtends as seen from the target. It has been used for longer than any of us can remember, and we need not change the acronym, only the explanation, from a severely erroneous one to a technically correct one.
So the PDAF acronym can remain, but it should not be confused with phase detection. Detecting "phase" is in fact a failure mode: when the system sees a repetitive pattern, it cannot find the exact coinciding points and locks onto the wrong ones, because instead of employing parallax it is effectively detecting the phase of the pattern. When pointed at a brick wall or a set of tiles, it cannot tell which brick is which and may superimpose the wrong ones upon one another. By using phase it can fail; what the PDAF system should always use is not phase, but parallax.
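To make the triangulation idea concrete, here is a minimal sketch of my own (the baseline, angles and numbers are invented purely for illustration, not taken from any camera):

```python
import math

def distance_by_parallax(baseline_m, angle_left_rad, angle_right_rad):
    """Classic triangulation: two sight lines from the ends of a known
    baseline converge on the target; their angular difference (the
    parallax) determines the range."""
    parallax = angle_left_rad - angle_right_rad
    return baseline_m / math.tan(parallax)

# A 1 cm baseline (about the separation of the patches on a DSLR's
# entrance pupil) and 0.0005 rad of parallax put the subject at ~20 m.
print(distance_by_parallax(0.01, 0.00025, -0.00025))  # ~20.0
```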
Urban - while what you have written is quite correct regarding the mechanics of how this form of focusing / autofocusing works, my electrical engineering friends tell me that "phase detect" is a good way of describing how the electronic side works in determining whether or not the lens has hit focus.
When the signals from each of the offset detectors do not match, the signals are out of phase. When they do match, they are in-phase (and the lens has been focused), so there is some definite logic in the nomenclature. That being said, I do agree with you, as it really does not do a good job in explaining how the measurements are actually made. Obfuscation seems to be the intent here.
I think it should be pointed out that the AF data sorted out by the program in the camera are not simply signals that could be in or out of phase; they are images, compared by image-analysis software. What we are doing is triangulation, just as in artillery rangefinders, although the base is very short, mostly only about one centimetre. The two images are compared, and the system moves the lens so that image elements coincide. In this process it is important that the very same image elements from each image coincide, which might not be the case if the system only detected phase, as when focusing on tilework with a uniformly repeating pattern. In that case it may aim for different points that give the same response, and the image would be out of focus. Phase is therefore misleading: the system should superimpose the same image elements over one another in order to find focus.
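As a toy illustration of that comparison (my own sketch, not camera firmware), one can slide one one-dimensional strip across the other and score each offset; a repeating pattern produces exactly the ambiguity described above:

```python
# Toy model of comparing two 1-D AF "images": slide one strip across the
# other and score each offset by mean absolute difference; the best
# offset is where the image elements coincide.
def best_offset(strip_a, strip_b, max_shift):
    scores = {}
    for shift in range(-max_shift, max_shift + 1):
        pairs = [(strip_a[i], strip_b[i + shift])
                 for i in range(len(strip_a))
                 if 0 <= i + shift < len(strip_b)]
        scores[shift] = sum(abs(a - b) for a, b in pairs) / len(pairs)
    return scores

# A distinct edge coincides at exactly one offset:
edge = [0, 0, 0, 9, 9, 9, 0, 0]
shifted = edge[2:] + [0, 0]            # same edge, displaced two sensels
scores = best_offset(edge, shifted, 4)
print(min(scores, key=scores.get))     # -2: an unambiguous match

# A repeating pattern (the tilework / brick-wall case) coincides equally
# well at several offsets, so the system may lock onto the wrong pair:
bricks = [0, 9] * 8
scores = best_offset(bricks, bricks, 4)
print([s for s in scores if scores[s] == min(scores.values())])  # [-4, -2, 0, 2, 4]
```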
As pointed out in an earlier post in the thread, a lens cap with two holes in it, at equal distance from the centre, will display the same type of image in the viewfinder as a rangefinder camera: two images that do not coincide when out of focus. I have done such a demonstration, which I will display here. First, the image of the lens cap. Note that the lens used must have a large enough entrance pupil to accommodate both holes and form images through them. It could also be noted that the patches on the entrance pupil used by a DSLR with PDAF are about the same size as these holes in the lens cap, corresponding to about f/22, which is the reason it is very difficult to find focus in dim light.
The setup used to demonstrate the effect seen through the viewfinder is shown below. A few objects are spread over a surface at different distances from the camera, and focus is set on one of them. The object in focus appears single, while those out of focus appear double-exposed. In the AF system they are not mixed as in this image, but analysed individually, so that the lens can be moved until the two images of the object at the focus point coincide.
And the captured image shows a double exposure of the unfocused objects, while the focused one appears single.
Digital cameras process DATA not IMAGES. The data is turned into images. In phase detect (unlike contrast detect), the sensor is not involved in the focusing operation.
A good explanation of phase detect can be found here and it really does NOT work with images, but rather with electronic data:
http://graphics.stanford.edu/courses...tofocusPD.html
There is no contradiction between the Stanford page and my account, and we all know that computers process data. But the AF sensors of DSLR cameras do produce images, albeit only thin strips. The applet in the link presents the matter in a confusing way, because the two sensor strips are not offset but interleaved in the same position, and the microlenses really are microlenses, not as large as in the sketch: there is one microlens for each individual sensel in the AF strip. They are drawn in offset positions only for clarity.
The directional elements that point each AF sensel at a spot on the main lens of the camera are these microlenses, one in front of each sensel. Some microlenses point at the left spot and others point at the right spot. Both sets of microlenses/sensels are effectively in the same position, separated only by one sensel width and interleaved, and the AF sensors produce images that are analysed by image-processing software. When the images match, the object is in focus; when they do not, the software can calculate both the direction and the distance to move the lens in order to reach focus.
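For what that last step might look like in principle, here is a deliberately simplified sketch; the gain constant and the sign convention are invented for illustration and are not taken from any real camera:

```python
# Hypothetical sketch: once image analysis has measured the offset
# between the two AF strips, its sign gives the direction to drive the
# lens, and its magnitude, scaled by a calibration constant, gives how
# far to drive it in one jump.
def focus_command(measured_shift_px, gain_um_per_px=25.0):
    if measured_shift_px == 0:
        return "in focus"
    direction = "toward near" if measured_shift_px > 0 else "toward far"
    travel_um = abs(measured_shift_px) * gain_um_per_px
    return f"drive lens {travel_um:.0f} um {direction}"

print(focus_command(-3))   # drive lens 75 um toward far
print(focus_command(0))    # in focus
```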
I didn't intend my account to be misunderstood. I never proposed that the image sensor would be used for focusing in a DSLR, but I did point out that it is the way the AF sensors work.
Urban - we are having a silly argument about what something is called, no more, no less.
I think we both understand that this works much like the old rangefinder cameras, from the models of the 1930s to current ones, which measure parallax and use trigonometry to determine the distance to the subject. This is the underlying principle of the autofocus technique that is commonly referred to as "phase detect".
You have stated that you think this description is not accurate and that something like "parallax detection" would be a more precise way of explaining it. I would tend to agree, but would also suggest that calling it phase detect (from a signal-processing rather than an optics viewpoint) is also correct (though I agree the terminology is rather obtuse). The autofocus mechanism would have been designed largely by electrical engineers, not optical or mechanical designers, so it is not surprising that electrical engineering jargon made it into the naming convention.
Am I sitting on the fence?
I have to agree with Manfred that you are just arguing over the terminology used. However, even electrical engineers are trained to think of phase in terms of waveforms or cyclic events, which is not really what is going on in this type of autofocus. Waves are dynamic, and this autofocus system is evaluating an essentially static comparison of the positions of two light paths.
Poor choice of naming at the very beginning. (Bright idea: who cares what it is called?)...
P.S. Maybe it should have been called ADAF: Alignment Detect Auto Focus.
Yes, I think Urban is viewing the term "phase" in the traditional sense, which applies to time-variant signals (such as EM light waves), and that has nothing to do with the optical process in AF. When the AF system compares the two outputs from the sensors, it is looking at spatial variation rather than time variation. As suggested in the Stanford article, the comparison probably uses a form of digital signal processing called cross-correlation, which determines the spatial displacement (call it phase if you like!) between the two sensor signals.
Actually there is a huge overlap between digital signal processing in the time domain (as used in communications) and DSP in the spatial domain, as used in digital imaging. One significant difference is that digital imaging is 2D, whereas time-variant signals are usually 1D.
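To show what that cross-correlation might look like, here is a small sketch (the sensor profiles are made up, and NumPy's correlate is just one convenient way to do it): the lag at which the correlation peaks is the spatial displacement between the two outputs.

```python
import numpy as np

# Two hypothetical sensor outputs: the same intensity profile, displaced
# by three samples. The lag where their cross-correlation peaks recovers
# that displacement.
a = np.array([0., 1., 4., 9., 4., 1., 0., 0., 0., 0.])
b = np.roll(a, 3)                       # same profile, shifted 3 samples

corr = np.correlate(a - a.mean(), b - b.mean(), mode="full")
lags = np.arange(-(len(b) - 1), len(a))
print(lags[np.argmax(corr)])            # -3: the measured displacement
```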
Dave
You are right (or possibly sort of right). I seem to remember seeing or reading somewhere that this technology started out in motion picture gear, and if that is true, the situation would have been dynamic, as frame rate would have come into play. The logic sort of makes sense, as putting AF or IS into gear that costs an order of magnitude or two more than even a high-end still camera would justify the development effort.
Yes, even after all these years I remember looking at phase diagrams in Chemistry 110 and then, of course, in metallurgy. Phase changes (solid, liquid and gas, especially the latter two) were a mainstay of thermodynamics (steam tables, the behaviour of non-ideal gases, etc.). Then of course the electrical engineering profs (who all seemed to have heavy accents) confused things totally with three-phase electrical supply for motors, power factors, etc. One really has to be careful about which field of science or engineering one is working in, in order not to confuse the issue too much.
Then of course, there was that part about having to be able to work in both British Engineering and SI units of measure (I was in the last class that had to work in both; the year after mine was 100% SI).
I wouldn't worry about it too much. It is just a phase, and it will pass soon enough.
Well, the issue could of course be confused by the various usages of the word phase. Among hackers, the expression "phase of the moon" is common for things that are difficult to explain for one reason or another, or, more likely, where one simply does not know why.
And it is not a silly argument over terminology; the fact is that the current terminology obscures how the system works. You yourself tried to present it as if it were a simple signal-processing issue, given that name because it falls within the realm of electrical engineering, and now the whole matter is being papered over as silliness.
What I actually tried to do was explain the workings of PDAF in a pedagogical way that does not obscure its nature, but reveals it. In the age of rangefinder cameras, I never once heard the coincidence seen in the viewfinder referred to as "phase". That erroneous term was introduced along with AF in DSLR cameras.
And it is indeed erroneous, because it has nothing at all to do with phase as used in other contexts, much less in the realm of optics. The AF system is indeed an optical one. Whether the images are processed by the human visual system or by a computer is beside the point; the system is conceptually identical to the optical rangefinder.
So if we are careful regarding the field of science as well as engineering, I think we should not use the term phase for what is not phase in those fields.
Last edited by Inkanyezi; 1st April 2016 at 05:15 AM.
Unfortunately, I suspect that you have lost the battle, as this technique (the combination of a classical rangefinder and electronic interpretation of the data collected by the detector) is commonly known as "phase detect". There are many things out there with illogical names, if we try to break them down into scientific, engineering or linguistic terms. I suspect they exist in every language, and sometimes we probably wonder why we ever started calling things by certain names, but today we just accept them because they have come into common use.
If someone refers to a rangefinder, I will think of a Leica M series camera, both film and digital, as these truly use a classical mechanical rangefinder mechanism. That is something I have known since some of the earliest days I can remember my father taking pictures with a rangefinder camera. If I were not familiar with how this technology works, I probably would not think of it as a way to describe the autofocus mechanism in a DSLR (even though you have eloquently explained that this is the case).
Your PDAF I could also understand, if someone explained it to me. But if I started suggesting that my cameras use that to focus automatically, I'm sure I would get strange looks from knowledgeable people, and someone would "correct" me and tell me that the proper name is "phase detect".
And while we are at it, we should jump all over the term "contrast detect" autofocus as well, as we could probably say that this is quite misleading. It determines correct focus when contrast is maximized, not when it is detected. I'm sure we could come up with a far better name for that technology as well.
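For what it's worth, the mechanics behind that name are easy to sketch. Here is a toy hill-climbing loop (the quadratic "contrast curve" and the step sizes are invented for illustration; a real camera measures contrast from the sensor) that stops at maximum contrast rather than at any "detection" of it:

```python
# Toy contrast-detect loop: step the simulated focus position, measure a
# contrast score, and once the score starts falling, reverse with a
# smaller step until the step is fine enough to declare focus.
def contrast_at(position, peak=42.0):
    return 100.0 - (position - peak) ** 2   # invented contrast curve

def hill_climb(start=0.0, step=5.0):
    pos, score = start, contrast_at(start)
    while True:
        nxt = pos + step
        nxt_score = contrast_at(nxt)
        if nxt_score < score:                # passed the peak
            if abs(step) < 0.5:              # fine enough: declare focus
                return pos
            step = -step / 4                 # reverse and refine
        else:
            pos, score = nxt, nxt_score

print(hill_climb())  # converges near 42.0, the contrast maximum
```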
Time to shut down this rather ancient thread. I think we have gone on long enough....