Magnitude scale theory
How bright a star appears is measured by its apparent magnitude.
The absolute magnitude of a star is defined as the apparent magnitude that it would have if placed at a distance of 10 parsecs from the Earth.
Consider two stars, where star A appears brighter than star B. Let the apparent magnitudes of stars A and B be $m_A$ and $m_B$ respectively, and let $I_A$ and $I_B$ be the corresponding observed intensities.
The magnitude scale is defined so that a difference of 5 magnitudes corresponds to an intensity ratio of exactly 100. Hence
\[ \frac{I_A}{I_B} = 100^{(m_B - m_A)/5}. \]
Taking logarithms (base 10) of both sides, and using $\lg 100 = 2$:
\[ m_B - m_A = \frac{5}{2}\,\lg\!\left(\frac{I_A}{I_B}\right). \]
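As a quick numerical check, here is a minimal Python sketch of this relation (the function names are illustrative, not from any standard library):

    import math

    def intensity_ratio(m_b, m_a):
        """Intensity ratio I_A / I_B for stars of apparent magnitude m_a and m_b."""
        return 100 ** ((m_b - m_a) / 5)

    def magnitude_difference(i_a, i_b):
        """Magnitude difference m_B - m_A from the intensity ratio I_A / I_B."""
        return 2.5 * math.log10(i_a / i_b)

    # A difference of 5 magnitudes is exactly a factor of 100 in intensity:
    print(intensity_ratio(6.0, 1.0))         # 100.0
    print(magnitude_difference(100.0, 1.0))  # 5.0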
Now let $m_A$ be the magnitude the star would have at 10 parsecs, in other words its absolute magnitude $M$, and let $m_B$ be its apparent magnitude $m$ at some other distance $d$ (also measured in parsecs).
Therefore:
\[ m - M = \frac{5}{2}\,\lg\!\left(\frac{I_A}{I_B}\right). \]
Since intensity falls off as the inverse square of distance, $I_A/I_B = (d/10)^2$, and the relation becomes the distance modulus:
\[ m - M = 5\,\lg\!\left(\frac{d}{10}\right) = 5\,\lg d - 5. \]
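Under the same assumptions, a minimal Python sketch of the distance modulus (again with illustrative function names):

    import math

    def absolute_magnitude(m, d_parsecs):
        """Absolute magnitude M from apparent magnitude m and distance d in parsecs."""
        return m - 5 * math.log10(d_parsecs / 10)

    def distance_parsecs(m, M):
        """Distance in parsecs recovered from the distance modulus m - M."""
        return 10 ** ((m - M) / 5 + 1)

    # At exactly 10 parsecs the apparent and absolute magnitudes coincide:
    print(absolute_magnitude(4.0, 10.0))  # 4.0
    # Each +5 in m - M corresponds to a factor of 10 in distance:
    print(distance_parsecs(9.0, 4.0))     # 100.0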