
The magnitudes of stars

The magnitude of a star is a measure of its brightness.
In the second century BC the Greek astronomer Hipparchus devised an approximate scale of stellar magnitudes.

Comparing the brightness of two stars, he decided that if one star was 2.5 times brighter than the other, the difference of magnitude between them was 1.

Two stars with a difference of 5 magnitudes therefore differ in brightness by a factor of 100, since 2.5 x 2.5 x 2.5 x 2.5 x 2.5 = 2.5^5 = 100. [In actual fact the precise ratio per magnitude is 2.512, since 2.512^5 = 100.] The unaided human eye can just detect stars of magnitude six in good seeing conditions.

A lower intensity means a larger positive magnitude. That means that a star of magnitude -1.0 is much brighter than a star with a magnitude of +5.0. In fact a difference of 5 magnitudes means a change in intensity by a factor of 100 (by definition).
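The relation above can be sketched as a short Python function. (The function name brightness_ratio is an illustrative choice; the factor 100^(1/5) ≈ 2.512 follows from the definition that 5 magnitudes correspond to a factor of 100.)

```python
def brightness_ratio(m1, m2):
    """Intensity ratio I1/I2 of two stars with magnitudes m1 and m2.

    By definition a difference of 5 magnitudes is a factor of 100 in
    brightness, so each magnitude step is a factor of 100**(1/5) ~ 2.512.
    """
    return 100 ** ((m2 - m1) / 5)

# A magnitude -1.0 star compared with a magnitude +5.0 star:
print(round(brightness_ratio(-1.0, 5.0)))   # 251, i.e. roughly 250 times brighter
print(round(100 ** (1 / 5), 3))             # 2.512, the exact factor per magnitude
```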

Apparent magnitude and absolute magnitude

How bright a star looks when viewed from the Earth is given by its apparent magnitude. However this does not give a true impression of the actual brightness of a star: a nearby faint star may well look brighter than another star that is intrinsically brighter but more distant. (A good example of this is the pair Rigel and Sirius in the following table. Sirius looks brighter than Rigel when seen from the Earth, but it is actually the intrinsically fainter of the two; it only appears brighter because it is much closer.)

The actual brightness of a star is measured by its absolute magnitude. The absolute magnitude of a star is defined as the apparent magnitude that it would have if placed at a distance of 10 parsecs from the Earth.

The apparent and absolute magnitudes of a number of stars are given in the following table.

Object              Apparent magnitude   Absolute magnitude   Distance (light years)
Sun                 -26.7                n/a                  n/a
Venus               -4.4                 n/a                  n/a
Jupiter             -2.2                 n/a                  n/a
Sirius              -1.46                +1.4                 8.7
Rigel               -0.1                 -7.0                 880.0
Arcturus            -0.1                 -0.2                 35.86
Proxima Centauri    +10.7                +15.1                4.2
Vega                0.0                  +0.5                 26.4
Betelgeuse          +0.4                 -5.9                 586
Deneb (α Cygni)     +1.3                 -7.2                 1630
Andromeda galaxy    +5                   -17.9                2 200 000
Our galaxy          n/a                  -18.0                n/a
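The definition of absolute magnitude gives the distance-modulus formula M = m - 5 log10(d/10), with d in parsecs. As a check on the table, here is a minimal Python sketch; the helper name and the light-year-to-parsec conversion factor are assumptions introduced for the example, not from the original.

```python
import math

LY_PER_PARSEC = 3.2616  # 1 parsec is about 3.26 light years (assumed conversion)

def absolute_magnitude(m, distance_ly):
    """Absolute magnitude M = m - 5*log10(d / 10), with d in parsecs."""
    d_pc = distance_ly / LY_PER_PARSEC
    return m - 5 * math.log10(d_pc / 10)

# Sirius: apparent magnitude -1.46 at a distance of 8.7 light years
print(round(absolute_magnitude(-1.46, 8.7), 1))   # 1.4, matching the table
```

The other table entries agree with this formula to within rounding of the quoted magnitudes and distances.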
© Keith Gibbs 2010