There is probably a pretty simple answer to this question. I have discussed it over the years and have never been able to resolve it to my satisfaction. Maybe someone with a more scientific mind can help...
I have added numbers to the illustration in the excellent tutorial on exposure and metering to show what I am talking about...
If the distance between the light source and the subject (#1) is increased, the light intensity falling on the subject decreases with the square of that distance, and the exposure must be increased to compensate for that decrease.
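As I understand it, the relationship behind distance #1 is the inverse square law. A quick sketch of the arithmetic, assuming a small, point-like light source:

$$E \propto \frac{1}{d^2}$$

so doubling distance #1 leaves only a quarter of the light falling on the subject, roughly two stops, which I would compensate with two stops more exposure.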
However, the distance between the subject and the camera (#2) seems to be irrelevant to exposure. For example: if distance #2 is one meter and the correct exposure for that subject is f/16 @ 1/100 second, then even if distance #2 is increased to, say, 10 meters, the exposure remains f/16 @ 1/100 second.
Why doesn't the intensity of the reflected light decrease as distance #2 increases, and why does the exposure remain the same?