The effect of radiative cooling on the X-ray properties of galaxy clusters
Abstract
In this paper, we investigate the effect of cooling on the X-ray properties
of galaxy clusters. We have performed N-body, hydrodynamical simulations both
with and without radiative cooling, in both cases neglecting star formation
and feedback. We show that radiative cooling produces an
inflow of high-entropy gas from the outer parts of the cluster, thus
\emph{raising} the cluster temperature and \emph{decreasing} the X-ray
luminosity. With radiative cooling, a cluster is on average three to five
times less luminous in X-rays than the same cluster simulated without cooling.
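The sense of this effect follows from the density-squared weighting of the
bolometric X-ray luminosity (we quote the standard expression for orientation
only; it is not specific to our simulations),
\[
L_{\mathrm{X}} = \int n_{e}\, n_{\mathrm{H}}\, \Lambda(T)\, \mathrm{d}V ,
\]
where $n_{e}$ and $n_{\mathrm{H}}$ are the electron and hydrogen number
densities and $\Lambda(T)$ is the cooling function. Cooling selectively
removes the densest, lowest-entropy gas, so $L_{\mathrm{X}}$ falls even as the
emission-weighted temperature rises.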
However, we do not produce a large constant-density core in either the gas or
the dark matter distributions. Our results contradict previous work in which
cooling raises the X-ray luminosity and deposits an unreasonably large amount
of mass in the central cluster galaxy. We avoid both of these problems by
choosing a numerical resolution at which a reasonable fraction of the baryonic
material cools, and by decoupling the hot and cold gas in our simulations, a
first step towards modelling multiphase gas (see the sketch below). We
emphasise that cooling a sensible fraction of the material globally is vital:
the presence or absence of massive central concentrations of cold baryonic
material has a dramatic effect upon the resultant X-ray properties of the
clusters.
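As a minimal illustration of the hot/cold decoupling referred to above, the
sketch below excludes cold, condensed particles from the SPH density estimate
of hot particles, so that cooled material cannot artificially inflate the
density (and hence the cooling rate) of neighbouring hot gas. The temperature
threshold \texttt{T\_COLD}, the fixed smoothing length and the cubic-spline
kernel are illustrative assumptions for this sketch, not the parameters of our
simulations.

\begin{verbatim}
import numpy as np
from scipy.spatial import cKDTree

T_COLD = 1.0e5  # illustrative decoupling threshold [K]; an assumption,
                # not the value used in the simulations

def w_cubic_spline(r, h):
    """Standard 3D cubic-spline SPH kernel with support radius h."""
    q = r / h
    sigma = 8.0 / (np.pi * h**3)
    w = np.where(q < 0.5,
                 1.0 - 6.0 * q**2 + 6.0 * q**3,
                 np.where(q < 1.0, 2.0 * (1.0 - q)**3, 0.0))
    return sigma * w

def sph_density(pos, mass, temp, h, decouple=True):
    """SPH density at each particle; with decouple=True, hot particles
    ignore cold neighbours (the decoupling step)."""
    tree = cKDTree(pos)
    rho = np.zeros(len(pos))
    for i in range(len(pos)):
        for j in tree.query_ball_point(pos[i], h):
            # Decoupling: a hot particle does not "see" cold, condensed
            # material, so cooled blobs cannot raise its density and
            # drive runaway cooling of the hot phase.
            if decouple and temp[i] >= T_COLD and temp[j] < T_COLD:
                continue
            r = np.linalg.norm(pos[j] - pos[i])
            rho[i] += mass[j] * w_cubic_spline(r, h)
    return rho
\end{verbatim}

Running with \texttt{decouple=False} recovers the standard estimate; comparing
the two on a toy particle distribution shows how a single cold clump raises
the inferred density, and hence the cooling rate, of its hot neighbours.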