Was lying in bed this morning thinking about perception of color, light, and white balance... Everything in our physical reality has two colors: the actual color and the perceived color. The camera sees actual color; the eye sees perceived color. The color of anything is generally a combination of two things: the surface and the light hitting that surface. Each plays a role in the color of that object.

Since the eye changes the color of what we see to something besides the actual color, we need a way for the images our camera captures to also be changed to match, as closely as possible, the perceived colors in the scene. But why? If the photo is an exact replica of the scene's actual colors, then won't our eyes play the same trick on us when we view it as they did when we looked at the scene, and adjust accordingly, making the photo look the same? Obviously this doesn't happen, but why not?
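For what it's worth, the kind of correction cameras (or raw editors) apply is often some variant of chromatic adaptation. A minimal sketch of one common heuristic, the gray-world white balance, is below: it assumes the scene averages out to neutral gray and scales each channel so its mean matches the overall mean, crudely mimicking what the eye does on its own. This is just an illustrative assumption about how white balance might work, not what any particular camera actually does; it assumes NumPy and an 8-bit RGB image.

```python
import numpy as np

def gray_world_white_balance(img):
    """Gray-world white balance: scale each RGB channel so its mean
    matches the image's overall mean, roughly imitating the eye's
    chromatic adaptation to the scene's light source."""
    img = img.astype(np.float64)
    channel_means = img.reshape(-1, 3).mean(axis=0)  # per-channel average
    gray = channel_means.mean()                      # target neutral level
    gains = gray / channel_means                     # per-channel correction
    balanced = img * gains
    return np.clip(balanced, 0, 255).astype(np.uint8)
```

Feed it an image with a warm cast (red channel boosted, blue suppressed) and the output's channel means come out nearly equal, i.e. the color cast from the light source is removed even though the "actual" recorded colors were tinted.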