I'm a bit of a newcomer to digital photography, but old enough to have done some photography using film. As I recall, with film, ISO was a property of the film itself that indicated its speed. Slower film, with a lower ISO value, was more fine-grained, which meant the resulting photos could be sharper, whereas faster film, with a higher ISO value, was coarser, meaning pictures might not be as sharp as with slower film. So there were trade-offs involved in choosing to shoot with faster film.
I've done some web browsing trying to learn how my digital camera works, including reading articles such as this one about sensors. I'm having a hard time grasping how the ISO concept, originally intended to adjust a camera for different films (i.e., different "sensors"), applies to the sensor in my digital camera, which is always the same. The instructions for my camera, as well as various articles on the subject, suggest that the concept carries over, which seems to imply that the camera can somehow alter the sensor. However, some limited testing with my new camera does not seem to support such a direct analogy with film-based photography. I accept that my experiments may not be conclusive, but I'd very much like to learn what actually happens inside the camera when the ISO setting changes.

In particular: can the camera alter the sensor itself, and if so, how? It also seems plausible that the camera computationally alters the data captured by the sensor in a way that mimics the film-speed concept. Or maybe there is another explanation, or some combination of these methods.
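To make concrete what I mean by "computationally alters the data", here is a rough Python sketch of the kind of thing I imagine might be going on: the raw sensor readings get multiplied by a gain tied to the ISO setting. The names and numbers (`BASE_ISO`, `FULL_WELL`, `apply_iso_gain`) are entirely made up for illustration; I'm not claiming any real camera works this way, it's just my mental model.

```python
import numpy as np

# Hypothetical illustration only: scale raw sensor counts by a gain
# proportional to the selected ISO. I don't know whether real camera
# firmware does anything like this.

BASE_ISO = 100    # assumed "native" sensitivity of the sensor
FULL_WELL = 4095  # assumed 12-bit raw ceiling

def apply_iso_gain(raw_counts: np.ndarray, iso: int) -> np.ndarray:
    """Multiply raw counts by iso / BASE_ISO and clip to the raw ceiling."""
    gain = iso / BASE_ISO
    return np.clip(raw_counts * gain, 0, FULL_WELL)

# A dim scene captured at the sensor's native sensitivity...
raw = np.array([[40, 55], [70, 90]], dtype=float)

# ...brightened by selecting ISO 800: both signal and any noise in the
# raw counts would be scaled up by the same factor of 8.
print(apply_iso_gain(raw, iso=800))
```

Is it something along these lines, does the camera actually change how the sensor itself responds, or is it a mix of both?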