How do you read a magnitude scale? A difference of one magnitude corresponds to a brightness factor of about 2.512, with smaller magnitudes meaning brighter objects.
The scale is defined so that a difference of 5 magnitudes is exactly a factor of 100 in brightness: a 1st-magnitude star is 100 times brighter than a 6th-magnitude star, or, conversely, a 6th-magnitude star is 100 times dimmer than a 1st-magnitude star. One magnitude therefore corresponds to the fifth root of 100, that is 100^(1/5) ≈ 2.512.
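Written out as a general relation (using b for apparent brightness and m for magnitude, matching the notation in the table below), this is:

$$\frac{b_1}{b_2} = 100^{(m_2 - m_1)/5} = (2.512)^{\,m_2 - m_1}$$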
How much brighter is a 3rd-magnitude star than a 4th-magnitude star? About 2.512 times, since the two differ by exactly one magnitude.
Comparing the magnitudes of different objects:
| Apparent magnitude difference (m2 − m1) | Ratio of apparent brightness (b1/b2) |
|---|---|
| 2 | (2.512)² = 6.31 |
| 3 | (2.512)³ = 15.85 |
| 4 | (2.512)⁴ = 39.82 |
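As a quick check, the table values can be reproduced directly from the definition. This is a minimal sketch; the function name brightness_ratio is just illustrative:

```python
def brightness_ratio(delta_m: float) -> float:
    """Brightness ratio for a given magnitude difference.

    A difference of 5 magnitudes is defined as exactly a factor of 100,
    so one magnitude corresponds to 100**(1/5), about 2.512.
    """
    return 100 ** (delta_m / 5)

# Reproduce the table above:
for dm in (1, 2, 3, 4):
    print(f"delta m = {dm}: {brightness_ratio(dm):.2f}x brighter")
# delta m = 1: 2.51x brighter
# delta m = 2: 6.31x brighter
# delta m = 3: 15.85x brighter
# delta m = 4: 39.81x brighter
```

The last entry prints 39.81 here because the exact value 100^(4/5) is used; the table's 39.82 comes from rounding to 2.512 first and then raising that to the 4th power.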
Who invented the magnitude scale in astronomy? The idea of a magnitude scale dates back to Hipparchus (around 150 BC), who invented a scale to describe the brightness of the stars he could see. He assigned an apparent magnitude of 1 to the brightest stars in the sky, and he gave the dimmest stars he could see an apparent magnitude of 6.
Why is the magnitude scale backwards? The magnitude scale runs backwards from what you might expect: brighter stars have smaller magnitudes than fainter stars. If you do not understand the math, this just says that the magnitude of a given star (m) differs from that of some standard star (m0) by 2.5 times the logarithm of their flux ratio.
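Written as an equation (a standard form of this relation, with F and F0 standing in for the fluxes of the star and the standard star):

$$m - m_0 = -2.5\,\log_{10}\!\left(\frac{F}{F_0}\right)$$

The minus sign is what makes the scale run backwards: a larger flux F yields a smaller (or more negative) magnitude m.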