Glossary of Astronomical Terms/apparent magnitude

Return to Contents

Apparent magnitude, or apparent visual magnitude, is the numerical value given to the brightness of a star as it appears (hence the word "apparent") from Earth. Stars were first assigned apparent magnitude values by the Greek astronomer Hipparchus (c. 190-120 B.C.). He cataloged the stars, assigning a value of 1 to the brightest and progressively higher numbers to dimmer stars, ending at 6, the faintest magnitude the human eye can detect. This reversed scale confuses students to this day, but it is easier to remember once you realize that Hipparchus considered the brightest stars to be the stars of first importance, the next-brightest to be of second importance, and so on.

Some apparent magnitudes are: -26.8 for the Sun, -13 for the full Moon, -1.47 for Sirius, and 11.05 for Proxima Centauri. Apparent magnitude is related to absolute magnitude; knowing both values allows you to figure out the distance to the star.
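As a quick illustration of that last point, the standard distance modulus <math>m - M = 5 \log_{10}(d / 10\,\mathrm{pc})</math> relates the two magnitudes to distance. Below is a minimal Python sketch of that relation; the absolute magnitude of Sirius (roughly +1.42) is supplied here only as an illustrative input, not a value from this glossary.

    def distance_parsecs(apparent_mag, absolute_mag):
        # Distance modulus: m - M = 5 * log10(d / 10 pc)  =>  d = 10 ** ((m - M + 5) / 5)
        return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

    # Sirius: m = -1.47 (from the list above), M ~ +1.42 (assumed here for illustration)
    print(distance_parsecs(-1.47, 1.42))   # ~2.6 parsecs, about 8.6 light-years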

Today we recognize that this scale measures light intensity logarithmically: a star of magnitude 1.0 is 2.512 times more intense than a star of magnitude 2.0. To find the intensity ratio between two stars, use <math>I_a = 2.512^{(m_b - m_a)} I_b</math>. This equates the intensity of star a to the intensity of star b times 2.512 raised to the difference of their magnitudes. Note that the exponent is the magnitude of star b minus the magnitude of star a. If star a has a magnitude of 1.2 and star b has a magnitude of 5.2, then star a is about 40 times brighter than star b.
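To make the worked example concrete, here is a small Python sketch of that ratio formula (the function name intensity_ratio is just illustrative):

    def intensity_ratio(mag_a, mag_b):
        # I_a / I_b = 2.512 ** (m_b - m_a): lower magnitude means greater intensity
        return 2.512 ** (mag_b - mag_a)

    print(intensity_ratio(1.2, 5.2))   # ~39.8, so star a is about 40 times brighter than star b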


Magnitude difference    Intensity multiple
0                       1
1                       2.5
2                       6.3
3                       16
4                       40
5                       100
6                       250
10                      10,000

Note that <math>2.512^{6} = 2.512^{1+5} = 2.512^{1} \cdot 2.512^{5}</math>, so once you know the first five multiples, you can figure out any other whole-number difference.
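The same identity is easy to check numerically. This short Python sketch regenerates the table above and confirms that the multiple for a difference of 6 is the product of the multiples for 1 and 5:

    # Reproduce the intensity multiples in the table above
    for diff in (0, 1, 2, 3, 4, 5, 6, 10):
        print(f"magnitude difference {diff}: intensity multiple ~{2.512 ** diff:,.1f}")

    # Whole-number differences combine by multiplying their multiples: 2.512^6 = 2.512^1 * 2.512^5
    assert abs(2.512 ** 6 - 2.512 ** 1 * 2.512 ** 5) < 1e-9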