An object's surface brightness is its brightness per unit solid angle as seen in projection on the sky, and the measurement of surface brightness is known as surface photometry. [9] A common application is the measurement of a galaxy's surface brightness profile: its surface brightness as a function of distance from the galaxy's center.
While apparent magnitude is a measure of the brightness of an object as seen by a particular observer, absolute magnitude is a measure of the intrinsic brightness of an object. Flux decreases with distance according to an inverse-square law, so the apparent magnitude of a star depends on both its absolute brightness and its distance (and on any extinction of its light along the way).
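The relation between the two magnitudes is the distance modulus, m − M = 5 log₁₀(d / 10 pc). A minimal sketch in Python (the function name is mine, and extinction is ignored):

```python
import math

def apparent_magnitude(absolute_mag, distance_pc):
    """Apparent magnitude from absolute magnitude and distance in parsecs,
    via the distance modulus m - M = 5*log10(d / 10 pc).
    Interstellar extinction is ignored."""
    return absolute_mag + 5 * math.log10(distance_pc / 10.0)

# At exactly 10 pc, apparent and absolute magnitude coincide by definition.
print(apparent_magnitude(4.83, 10.0))   # 4.83
# Ten times farther away (100 pc) adds 5 magnitudes of dimming.
print(apparent_magnitude(4.83, 100.0))  # 9.83
```

The value 4.83 used here is the Sun's approximate absolute visual magnitude.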
Prior to photographic methods of determining magnitude, the brightness of celestial objects was determined by visual photometric methods. This was achieved simply with the human eye, by comparing the brightness of an astronomical object with other nearby objects of known or fixed magnitude: especially stars, planets and other bodies in the Solar System, and variable stars. [1]
The Pogson logarithmic scale is used to measure both apparent and absolute magnitudes, the latter corresponding to the brightness of a star or other celestial body as it would appear if located at a distance of 10 parsecs (3.1 × 10¹⁷ metres). Beyond this dimming with increased distance, there is additional dimming due to extinction by intervening interstellar dust.
Consequently, a magnitude 1 star is about 2.5 times brighter than a magnitude 2 star, about 2.5² times brighter than a magnitude 3 star, about 2.5³ times brighter than a magnitude 4 star, and so on. This is the modern magnitude system, which measures the brightness, not the apparent size, of stars.
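On the Pogson scale, each magnitude step corresponds to a factor of 100^(1/5) ≈ 2.512 in brightness, so the ratio between any two magnitudes follows directly. A short sketch (the function name is mine):

```python
def brightness_ratio(m1, m2):
    """Factor by which a magnitude-m1 object outshines a magnitude-m2 one.
    Each magnitude step is a factor of 100**(1/5), roughly 2.512."""
    return 100 ** ((m2 - m1) / 5.0)

print(brightness_ratio(1, 2))  # one step: about 2.512
print(brightness_ratio(1, 6))  # five steps: exactly a factor of 100
```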
In astronomy, a phase curve describes the brightness of a reflecting body as a function of its phase angle (the arc subtended by the observer and the Sun as measured at the body). The brightness usually refers to the object's absolute magnitude, which, in turn, is its apparent magnitude at a distance of one astronomical unit from the Earth and Sun.
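For a Solar System body, the inverse-square law applies twice: once for sunlight reaching the body and once for reflected light reaching the observer, giving m = H + 5 log₁₀(d_BS · d_BO) with both distances in AU. A sketch under that assumption (the function name is mine; a real phase curve adds a phase-angle-dependent term on top):

```python
import math

def reduced_magnitude(H, d_body_sun_au, d_body_obs_au):
    """Apparent magnitude of a reflecting body at zero phase angle, from
    its absolute magnitude H and its distances (in AU) to the Sun and to
    the observer. The phase-angle correction is omitted here."""
    return H + 5 * math.log10(d_body_sun_au * d_body_obs_au)

# At 1 AU from both the Sun and the observer, the result is H itself,
# matching the definition of absolute magnitude for reflecting bodies.
print(reduced_magnitude(3.3, 1.0, 1.0))  # 3.3
```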
The Greek astronomer Hipparchus established a numerical scale to describe the brightness of each star appearing in the sky. The brightest stars in the sky were assigned an apparent magnitude m = 1, and the dimmest stars visible to the naked eye were assigned m = 6. [7] The difference between them corresponds to a factor of 100 in brightness.
A truly dark sky has a surface brightness of 2 × 10⁻⁴ cd m⁻², or 21.8 mag arcsec⁻². [9] The peak surface brightness of the central region of the Orion Nebula is about 17 mag/arcsec² (about 14 millinits), and the outer bluish glow has a peak surface brightness of 21.3 mag/arcsec² (about 0.27 millinits). [10]
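The two units above can be related by anchoring a conversion on the dark-sky pair (21.8 mag arcsec⁻² ↔ 2 × 10⁻⁴ cd m⁻²) and applying the 2.512-per-magnitude factor. A sketch under that assumption (the function name is mine; it reproduces the quoted nebula luminances only roughly, since those figures are rounded and depend on the exact calibration used):

```python
def sb_to_nits(mag_per_arcsec2):
    """Convert surface brightness in mag/arcsec^2 to luminance in cd/m^2
    (nits), calibrated on the dark-sky anchor point
    21.8 mag/arcsec^2 = 2e-4 cd/m^2."""
    return 2e-4 * 10 ** (-0.4 * (mag_per_arcsec2 - 21.8))

print(sb_to_nits(21.8))          # 2e-4 cd/m^2 by construction
print(sb_to_nits(17.0) * 1000)   # ~17 millinits, same order as the quoted ~14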