Tuesday, 4 October 2011

observation - How can apparent magnitude be negative?

Apparent magnitude is a measure of how bright an object appears to an observer on Earth, meaning it's a function of both the object's intrinsic luminosity and its distance from us. The concept of magnitudes dates back to the ancient Greeks, who categorized the stars in the sky into six magnitudes (the brightest being 1 and the faintest being 6). Each successively lower magnitude was roughly 2.5 times as bright as the one before, meaning the scale was logarithmic. We still use magnitudes for historical reasons, though the scale was later standardized (so that a difference of five magnitudes corresponds to a factor of exactly 100 in flux) to use the formula

$m_x - m_{x,0} = -2.5 \log_{10}\left(\frac{F_x}{F_{x,0}}\right)$

where $m_x$ and $F_x$ are the magnitude and flux of the object of interest and $m_{x,0}$ and $F_{x,0}$ are the magnitude and flux of a reference object (by convention, Vega is used to define the zero point of the magnitude scale). This means any object that appears brighter than Vega has a negative magnitude. There is no limit to how bright an object can appear, so there is no lower bound on magnitude. The Sun, for example, being the brightest object in our sky, has an apparent magnitude of roughly -27.
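
To see why a flux ratio greater than 1 forces the magnitude negative, here is a minimal Python sketch of the formula above (the function name and the Sun's flux ratio are illustrative assumptions, not measured values):

```python
import math

def apparent_magnitude(flux_ratio, mag_ref=0.0):
    """Apparent magnitude from the flux ratio F_x / F_x0 relative to a
    reference object (by convention Vega, whose magnitude defines 0)."""
    return mag_ref - 2.5 * math.log10(flux_ratio)

# A flux ratio above 1 (brighter than Vega) gives a negative magnitude.
print(apparent_magnitude(100.0))  # -5.0

# The Sun appears on the order of 4e10 times brighter than Vega
# (an illustrative round number), giving a magnitude near -26.5.
print(apparent_magnitude(4e10))   # about -26.5
```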
