##### Magnitude scale theory

How bright a star appears is described by its apparent magnitude.
The absolute magnitude of a star is defined as the apparent magnitude that it would have if placed at a distance of 10 parsecs from the Earth.

Consider two stars where star A appears to be brighter than star B.

Let the apparent magnitude of star A be m_{A} and the apparent magnitude of star B be m_{B}. Their intensities I_{A} and I_{B} are then related by:

I_{A}/I_{B} = 100^{(m_{B} – m_{A})/5}

Therefore, taking logs of both sides (and using lg 100 = 2): m_{B} – m_{A} = (5/2) lg(I_{A}/I_{B})
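The two forms of the relation above can be sketched numerically. The function and variable names below are illustrative, not from the source; the check that a 5-magnitude difference corresponds to an intensity ratio of exactly 100 follows from the definition of the scale.

```python
import math

def intensity_ratio(m_A, m_B):
    """I_A/I_B from the apparent magnitudes: I_A/I_B = 100**((m_B - m_A)/5)."""
    return 100 ** ((m_B - m_A) / 5)

def magnitude_difference(ratio):
    """m_B - m_A from I_A/I_B, via m_B - m_A = (5/2) lg(I_A/I_B)."""
    return 2.5 * math.log10(ratio)

# A 5-magnitude difference corresponds to a factor of exactly 100 in intensity.
print(intensity_ratio(1.0, 6.0))      # → 100.0
print(magnitude_difference(100.0))    # → 5.0
```

The two functions are inverses of each other, which is just the statement that the second equation is the log of the first.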

Now let m_{A} be the magnitude the star would have at 10 parsecs, in other words the absolute magnitude of the star (M), and let m_{B} be its magnitude (m) at some other distance d (also measured in parsecs).

Therefore: m – M = (5/2) lg(I_{A}/I_{B})
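Rearranging the result for M gives the absolute magnitude directly. A minimal sketch, assuming the intensity ratio I_{A}/I_{B} (the star's intensity at 10 parsecs relative to its intensity at distance d) is already known; the function name is illustrative:

```python
import math

def absolute_magnitude(m, ratio):
    """M = m - (5/2) lg(I_A/I_B), rearranged from m - M = (5/2) lg(I_A/I_B)."""
    return m - 2.5 * math.log10(ratio)

# A star that would appear 100 times more intense at 10 pc than at its
# actual distance is 5 magnitudes brighter in absolute terms.
print(absolute_magnitude(7.0, 100.0))  # → 2.0
```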