Emission measure is (usually) used in X-ray and EUV astronomy, though I suppose also in cases of optically thin radio emission. It is defined as the square of the free-electron number density integrated over the volume of the plasma, ${\rm EM} = \int n_e^2\, dV$.
The flux of optically thin emission from a plasma (e.g. thermal bremsstrahlung) is directly proportional to the emission measure of the plasma multiplied by a temperature-dependent radiative loss function.
In other words, when you measure the flux of X-rays from an unresolved optically thin emitter, there is a degeneracy between the electron number density and the overall plasma volume.
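This degeneracy can be illustrated with a short sketch. The numbers below are purely illustrative (they are not taken from the answer above); the point is only that two very different sources share one emission measure:

```python
import math

# Sketch (illustrative numbers only): two uniform plasmas with different
# electron densities and volumes, but the same emission measure
# EM = n_e^2 * V, produce the same optically thin flux.

def emission_measure(n_e, volume):
    """EM = n_e^2 * V for a uniform plasma (n_e in cm^-3, V in cm^3)."""
    return n_e**2 * volume

# A dense, compact source...
em_compact = emission_measure(n_e=1e10, volume=1e30)
# ...and a source 100x less dense but 10^4x larger in volume.
em_extended = emission_measure(n_e=1e8, volume=1e34)

# Flux alone cannot distinguish them.
assert math.isclose(em_compact, em_extended)
print(f"both have EM ~ {em_compact:.1e} cm^-3")
```

A measured flux therefore constrains only the product $n_e^2 V$; breaking the degeneracy needs extra information, such as a size estimate from imaging.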
When you fit an X-ray spectrum with an optically thin model, the emission measure (divided by $4\pi d^2$, where $d$ is the distance to the object) is a multiplicative free parameter.
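As a concrete example of this convention: in XSPEC, the normalization of thermal plasma models (e.g. apec or mekal) is defined as $10^{-14}/(4\pi d^2)$ times $\int n_e n_H\, dV$, with $d$ in cm. Assuming that convention and a nearby (negligible-redshift) source, the fitted normalization can be inverted for the emission measure:

```python
import math

# Sketch assuming the XSPEC thermal-model convention
#   norm = 1e-14 / (4*pi*d^2) * EM
# for a low-redshift source, with d in cm and EM in cm^-3.

def em_from_norm(norm, distance_cm):
    """Invert the XSPEC-style normalization to recover EM (cm^-3)."""
    return norm * 4.0 * math.pi * distance_cm**2 / 1e-14

d_cm = 3.086e20   # 100 pc in cm (1 pc ~ 3.086e18 cm)
norm = 1e-3       # illustrative fitted value, not a real measurement
print(f"EM ~ {em_from_norm(norm, d_cm):.2e} cm^-3")
```

Note that the distance enters squared, so a distance uncertainty propagates directly into the inferred emission measure.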
Your question about how to calculate this is extremely difficult to answer in general. Suppose I measure a count rate of $N$ X-ray counts per second using some X-ray telescope (I can only assume that is what you mean by an "X-ray light curve").
The count rate received at the telescope depends on:

- the emission measure (as defined above), multiplied by a term that depends on the temperature (or temperatures) of the source, its chemical composition, and the adopted emission process (free-free thermal bremsstrahlung, a thermal plasma, or something else);
- attenuation by any absorption intrinsic to the source and any absorption between us and the source;
- the distance to the source (assuming the radiation is isotropic);
- and finally, the response of the X-ray detector to X-ray photons as a function of energy.
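The chain above can be sketched as a toy forward model. Everything here is a placeholder (the emissivity shape, the absorption cross-section, the flat effective area, and all the input numbers are invented for illustration, not real atomic physics or a real instrument response):

```python
import math

# Toy forward model for the chain described above: emission measure and
# temperature set the emitted spectrum, line-of-sight absorption and
# distance attenuate it, and the detector effective area converts the
# arriving flux into a count rate.

def count_rate(em, kT, n_h, d_cm, energies_keV, a_eff_cm2):
    """Sum a toy bremsstrahlung-like spectrum through absorption and response."""
    rate = 0.0
    dE = energies_keV[1] - energies_keV[0]
    for E, area in zip(energies_keV, a_eff_cm2):
        emissivity = math.exp(-E / kT) / math.sqrt(kT)  # toy free-free shape
        sigma = 2e-22 * E**-2.6                         # toy photoelectric cross-section (cm^2)
        absorbed = math.exp(-n_h * sigma)               # column density n_h in cm^-2
        rate += em / (4 * math.pi * d_cm**2) * emissivity * absorbed * area * dE
    return rate

energies = [0.5 + 0.1 * i for i in range(95)]  # 0.5-9.9 keV grid
areas = [500.0] * 95                           # flat 500 cm^2 effective area
r = count_rate(em=1e53, kT=1.0, n_h=1e21, d_cm=3.086e20,
               energies_keV=energies, a_eff_cm2=areas)
print(f"toy count rate: {r:.3g} (arbitrary normalization)")
```

The structure is the point: real analyses do exactly this kind of folding, but with tabulated plasma emissivities, measured absorption cross-sections, and the instrument's actual response files, which is why tools like spectral-fitting packages are needed rather than a back-of-the-envelope formula.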