M–sigma relation
From Wikipedia, the free encyclopedia
The M–sigma (or M–σ) relation is an empirical correlation between the stellar velocity dispersion σ of a galaxy bulge and the mass M of the supermassive black hole at its center.
The M–σ relation was first presented in 1999 during a conference at the Institut d'Astrophysique de Paris in France. The proposed form of the relation, which was called the "Faber–Jackson law for black holes", was[1]

M ≈ 3.1 × 10^8 (σ / 200 km s^−1)^4 M_⊙,

where M_⊙ is the solar mass. Publication of the relation in a refereed journal, by two groups, took place the following year.[2][3] One of many recent studies,[4][5] based on the growing sample of published black hole masses in nearby galaxies, gives[6]

M ≈ 1.9 × 10^8 (σ / 200 km s^−1)^5.1 M_⊙.
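Any power-law fit of this form can be evaluated directly. The sketch below computes a black hole mass from a bulge velocity dispersion; the normalization (1.9 × 10^8 M_⊙) and slope (5.1) are illustrative fit parameters quoted from one study, and other published fits differ in sample and method.

```python
def black_hole_mass(sigma_kms, m0_solar=1.9e8, beta=5.1):
    """Black hole mass in solar masses for a bulge velocity dispersion
    sigma_kms (in km/s), using a power-law M-sigma fit of the form
    M = M0 * (sigma / 200 km/s)**beta.

    The default M0 and beta are illustrative values from one published
    fit, not a definitive calibration.
    """
    return m0_solar * (sigma_kms / 200.0) ** beta

# Example: a bulge with sigma ~ 100 km/s yields a mass of a few
# million solar masses, the order of magnitude measured for the
# Milky Way's central black hole.
print(f"{black_hole_mass(100.0):.2e}")
```

Because the slope is steep (roughly 4 to 5), modest differences in measured dispersion translate into large differences in inferred black hole mass, which is why the exact fit parameters matter.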
Earlier work demonstrated a relationship between galaxy luminosity and black hole mass,[7] which nowadays has a comparable level of scatter.[8][9] The M–σ relation is generally interpreted as implying some source of mechanical feedback between the growth of supermassive black holes and the growth of galaxy bulges, although the source of this feedback is still uncertain.
Discovery of the M–σ relation was taken by many astronomers to imply that supermassive black holes are fundamental components of galaxies. Prior to about 2000, the main concern had been the simple detection of black holes; afterward, interest shifted to understanding the role of supermassive black holes as a critical component of galaxies. This led to the relation's main uses: estimating black hole masses in galaxies that are too distant for direct mass measurements, and assaying the overall black hole content of the Universe.