I have to introduce the concept of magnitudes. But first let's talk about rational things, like fluxes. You have a source of radiation, a thermal black body, synchrotron, whatever, and it emits a flux. Flux is a function of frequency, and if you integrate over all frequencies, or all wavelengths, you get the total luminosity. Usually what you measure is how much flux you get at a given frequency or wavelength, whether it's radio or optical, it doesn't matter. Radio astronomers introduced the unit of the jansky, in honor of the first radio astronomer, and its value is given here. The first radio sources were measured in kilojanskys; now people talk about microjanskys or even nanojanskys. Technology marches forward. So this is flux density: specific flux per unit frequency, or sometimes per unit wavelength. To get the total power you have to integrate over the spectrum, but you never integrate over the entire spectrum, because you can't observe from zero wavelength to infinite wavelength. There is always some finite bandwidth over which you do that. In visible light it's different filters, in radio there is again a bandwidth, and so on.

Now we come to magnitudes. Sometimes I say that magnitudes were invented by astronomers to keep physicists out of the field, because they're so utterly irrational. But the historical reason for them is that the ancient Greeks, being scientifically minded, quantified the brightness of stars. They decided that with the naked eye, good vision, a dark site like a Greek island or something, not too much wine, you can distinguish six different levels of brightness among the visible stars. The very brightest they called zero or first magnitude, and the faintest sixth. Now it turns out the human eye is not a linear detector; it's actually a logarithmic detector. So steps of one magnitude correspond not to a linear shift by a certain amount but to a multiplicative shift.
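The idea of a flux density integrated over a finite bandwidth can be sketched in a few lines of code. This is a minimal illustration, not anything from the lecture slides: the flat 2-jansky source and the 1 GHz band centered near 1.4 GHz are made-up example numbers.

```python
# 1 jansky = 1e-26 W m^-2 Hz^-1 (the standard definition).
JY = 1e-26

def band_flux(f_nu, nu_lo, nu_hi, steps=10000):
    """Integrate a specific flux f_nu(nu) [W m^-2 Hz^-1] over a finite
    bandwidth [nu_lo, nu_hi] in Hz, using a simple midpoint rule."""
    dnu = (nu_hi - nu_lo) / steps
    return sum(f_nu(nu_lo + (i + 0.5) * dnu) for i in range(steps)) * dnu

# Hypothetical example: a flat 2 Jy source observed over a 1 GHz band.
flat = lambda nu: 2.0 * JY
total = band_flux(flat, 0.9e9, 1.9e9)
print(total)  # 2 Jy * 1e9 Hz = 2e-17 W m^-2
```

In real life the source spectrum is not flat and the filter has its own transmission curve, which would enter as a weighting function inside the integral.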
One magnitude corresponds to a factor of roughly 2.5; precisely, it is a factor of 10 to the 0.4 power. This was quantified in the 19th century. Another way to look at it, if you're minded like an engineer: a magnitude is minus four decibels, a factor of 10 to the minus 0.4 power. So it's a logarithmic scale, and that means there is a zero-point. And that zero-point is where things get really weird, because in principle it can be anything you want. Before we could do absolute calibrations, it was simply decided, as a convention, that Vega, which is one of the brightest stars in the sky, would have a magnitude of zero regardless of which filter you use. Now the spectrum of Vega is not flat, not at all, but that was the convention, so the zero-point varies with wavelength. To make life even harder, some other people introduced different zero-points, but let's not even go there.

At any rate, here is what happens. You have a detector like a CCD, taking a picture through a given filter, ultraviolet, or blue, or something, so you're integrating flux over a certain range of wavelengths. You measure that total amount, take the log of it, multiply by minus 2.5, add the zero-point, and you've got the magnitude. So it's an upside-down logarithmic scale with a weird unit and an even weirder zero-point. But a handy number to remember: for Vega in visible light, zero magnitude is roughly a thousand photons per second per square centimeter per angstrom. And if you like janskys, it's about 3,500 janskys. All these things are tabulated in the proper places, so if you ever actually need them, you look them up.

So how do we use this? Since this is a logarithmic measure, it deals with ratios of fluxes expressed as a logarithm: it tells you how much brighter one thing is than another. And because of that minus 2.5 log10, a factor of 100, which is two orders of magnitude, is five magnitudes. So don't confuse magnitudes with orders of magnitude.
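The recipe above (log of the flux, times minus 2.5, plus a zero-point) can be written down directly. A minimal sketch, in arbitrary flux units with an illustrative zero-point of zero:

```python
import math

def magnitude(flux, zeropoint=0.0):
    """m = -2.5 * log10(F) + ZP. The zero-point is a per-filter
    convention (classically set so that Vega has m = 0)."""
    return -2.5 * math.log10(flux) + zeropoint

# A factor of 100 in flux is exactly 5 magnitudes:
m_bright = magnitude(100.0)
m_faint = magnitude(1.0)
print(m_faint - m_bright)  # 5.0

# And one magnitude is a flux ratio of 10**0.4, about 2.512:
print(10 ** 0.4)
```

Note the sign convention: the fainter source gets the larger magnitude.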
An astronomical magnitude is 0.4 orders of magnitude. Right. And it goes upside down: higher magnitude means a fainter object. The Sun is the brightest source in our sky, with an apparent magnitude of minus 26 and change. That means it's really, really bright. And the very faintest things we can observe, with, say, the Hubble Space Telescope, are 28th or 29th magnitude. You can figure out how many orders of magnitude that spans; that would be a good thing to work out during section.

Okay, now let's make life a little more complicated and introduce the absolute magnitude, which is a measure of luminosity, not flux. Apparent magnitude is a measurement of flux: it doesn't tell you how far something is, just how bright it appears to you. But you'd really like to know how luminous things are, and for that you need to know how far away they are. Before any of that could actually be calibrated, though, people figured out that we can still use magnitudes by introducing the concept of the absolute magnitude. The absolute magnitude is the apparent magnitude your source would have if you put it ten parsecs away. Now why ten and not one is anybody's guess; it just makes life more complicated. The formula is given here. Obviously it has to involve the distance, and there is a term of 5 times the log of the distance. Why that? Because it's the square of the distance that matters, and in the log a square becomes a factor of 2: minus 2.5 times 2, so it's minus 5 log of the distance. And then there is the plus 5 offset to account for the silly ten-parsec zero-point. Be that as it may, the conversion is given here. If you put a star just like our Sun ten parsecs away, it will have an apparent magnitude of around plus five, so a pretty dim star, but still visible with the naked eye. Now Vega is zero magnitude; I forget how far Vega is, but it's not very different from ten parsecs. So roughly speaking, how much more luminous is Vega than our Sun? Anybody have a guess?
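The absolute-magnitude formula, and the dynamic-range exercise mentioned above, can be checked numerically. A sketch under the stated conventions; the Sun's apparent magnitude of -26.7 and its distance of 1 AU = 4.848e-6 parsecs are the only inputs:

```python
import math

def absolute_magnitude(m_apparent, d_parsec):
    """M = m - 5*log10(d) + 5, with d in parsecs: the apparent
    magnitude the source would have at 10 pc."""
    return m_apparent - 5.0 * math.log10(d_parsec) + 5.0

# The Sun: m ~ -26.7 at d = 1 AU = 4.848e-6 pc.
M_sun = absolute_magnitude(-26.7, 4.848e-6)
print(M_sun)  # close to +4.9, i.e. "around plus five" at 10 pc

# Dynamic range from the Sun (m ~ -26.7) to the faintest HST
# sources (m ~ +29), converted to orders of magnitude in flux:
print(0.4 * (29.0 - (-26.7)))  # about 22.3
```

So the observable brightness range in astronomy spans roughly 22 orders of magnitude in flux.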
Or not? Think about it for a minute. And now we can express the distance as a function of the absolute and apparent magnitudes. Why do we do such a silly thing? Because for certain types of objects you can calibrate what their absolute magnitudes are, say stars of a given type: from the color and the spectrum you somehow know it's that kind of star, and say we have established it is 365 times the luminosity of the Sun. Then, by measuring the apparent magnitude, you can infer from the known absolute magnitude how far away it is. The difference between the apparent and absolute magnitudes, m minus M, is called the distance modulus; you will sometimes encounter it. So ideally you'd like to quote absolute luminosities in solar luminosities, or ergs per second, or whatever, and distances in parsecs or centimeters. But instead of that, we use these somewhat strange units.
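Inverting the distance modulus gives distance directly from the measured and calibrated magnitudes. A minimal sketch; the star with M = +2.0 observed at m = +12.0 is a made-up example:

```python
def distance_parsec(m_apparent, M_absolute):
    """Invert the distance modulus m - M = 5*log10(d) - 5
    to get the distance d in parsecs."""
    return 10 ** ((m_apparent - M_absolute + 5.0) / 5.0)

# Hypothetical standard candle: known M = +2.0, measured m = +12.0.
# Distance modulus m - M = 10, so d = 10**3 pc.
print(distance_parsec(12.0, 2.0))  # 1000.0
```

This is the whole logic of "standard candles": calibrate M once for a class of objects, then every measured m for that class becomes a distance.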