We talk about magnitudes frequently in this blog, so I thought it would be good to put that concept into perspective. The magnitude (M) of a celestial object is a measure of its brightness, and it's measured on a logarithmic scale. By definition, a difference of 5 magnitudes corresponds to a brightness ratio of exactly 100, so a difference of 1 magnitude corresponds to a brightness ratio of 100^(1/5) ≈ 2.512. That is:
B2/B1 = 2.512^(M1 − M2)

where B1 and B2 are the brightnesses of two objects with magnitudes M1 and M2. Note the direction: the object with the lower magnitude is the brighter one.
The range of magnitudes accessible to current astronomy runs from M = -26 to M = +34 (if we include the James Webb Space Telescope scheduled to launch in 2018). That span of 60 magnitudes is a brightness ratio of 2.512^60 ≈ 1×10^24 (a trillion trillion) between the brightest object we can see with our unaided eyes and the faintest objects state-of-the-art telescopes can detect.
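If you'd like to play with these numbers yourself, the ratio formula is easy to code. Here's a minimal sketch in Python (the function name `brightness_ratio` is my own, not from any astronomy library):

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Brightness ratio B2/B1 of two objects with magnitudes m1 and m2.

    By definition, 5 magnitudes = a factor of 100 in brightness,
    so 1 magnitude = 100**(1/5), approximately 2.512.
    """
    return 100 ** ((m1 - m2) / 5)

# Vega (M = 0) vs. the faintest unaided-eye stars (M = +6):
print(round(brightness_ratio(6, 0)))       # Vega is about 251x brighter

# Full range of modern astronomy, M = -26 to M = +34 (60 magnitudes):
print(f"{brightness_ratio(34, -26):.3g}")  # about 1e+24
```

Plugging in any two magnitudes from a star chart gives you their brightness ratio directly.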
One might wonder why the magnitude scale includes both negative and positive values, and why lower numbers mean brighter objects. Early astronomers arbitrarily assigned the star Vega magnitude zero, choosing it as the reference standard; Vega is the 5th brightest star in the night sky. The faintest stars visible to the unaided eye were assigned magnitude +6.0 (another arbitrary choice). Modern photometric measurements extended that historically defined scale to the current range of -26 to +34.
Magnitudes can be equated to an exact amount of light flux measured in units of energy, but that’s only important if you’re a professional astronomer. For our purposes, magnitude is simply a way to compare the brightness of two celestial objects.
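For the curious, the flux connection works in the other direction too: given two measured fluxes, the standard Pogson relation gives the magnitude difference. A minimal sketch (again with a made-up function name):

```python
import math

def magnitude_difference(f1: float, f2: float) -> float:
    """Magnitude difference m1 - m2 for objects with fluxes f1 and f2.

    Pogson relation: m1 - m2 = -2.5 * log10(f1 / f2).
    Brighter objects (larger flux) get smaller magnitudes.
    """
    return -2.5 * math.log10(f1 / f2)

# An object 100x brighter than another is 5 magnitudes lower:
print(magnitude_difference(100, 1))   # -5.0
```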
You might want to check out my Dec 9, 2013 post for another take on magnitude comparisons. Likewise my older Feb 15, 2010 post about the number of stars visible to the unaided eye at various magnitudes.
Obviously, the unaided human eye is not as sensitive to light as modern telescopes with CCD imaging systems. Still, under a dark, clear sky, close to 5000 stars are visible to the unaided eye on any given night. With even a cheap pair of binoculars, you can double that number.
Next Week in Sky Lights ⇒ Santa’s Sleigh Sighted?