Journal article

The effect of radiative cooling on the X-ray properties of galaxy clusters

Abstract

In this paper, we investigate the effect of cooling on the X-ray properties of galaxy clusters. We have performed N-body, hydrodynamical simulations both with and without radiative cooling, neglecting the effects of star formation and feedback. We show that radiative cooling produces an inflow of high-entropy gas from the outer parts of the cluster, thus raising the cluster temperature and decreasing the X-ray luminosity. With radiative cooling, clusters are on average three to five times less luminous in X-rays than the same clusters simulated without cooling. However, we do not produce a large constant-density core in either the gas or the dark matter distributions. Our results contradict previous work in which cooling raises the X-ray luminosity and deposits an unreasonably large amount of mass in the central cluster galaxy. We achieve this by selecting our numerical resolution such that a reasonable fraction of the baryonic material cools, and by decoupling the hot and cold gas in our simulations, a first step towards modelling multiphase gas. We emphasize that globally cooling a sensible amount of material is vital, and that the presence or absence of massive central concentrations of cold baryonic material has a dramatic effect on the resultant X-ray properties of the clusters.
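
The luminosity and temperature trend described in the abstract can be illustrated with a short, heavily simplified sketch. The code below is not the authors' simulation code and uses no data from the paper: it assumes a hypothetical particle population, a bremsstrahlung-like emissivity per particle proportional to m ρ √T, and an entropy proxy T/ρ^(2/3), and simply shows that excising the densest, lowest-entropy gas from the hot phase lowers the summed X-ray luminosity and raises the emission-weighted temperature.

```python
import numpy as np

# Toy illustration (not the authors' simulation code, and not derived from the
# paper's data): hypothetical gas particles are assigned densities and
# temperatures, and a bremsstrahlung-like emissivity per particle,
# L_i ~ m_i * rho_i * sqrt(T_i), is summed to give the X-ray luminosity and the
# emission-weighted temperature. Removing the lowest-entropy (densest, coolest)
# particles from the hot phase, a crude stand-in for gas that has radiatively
# cooled out, lowers L_X and raises the emission-weighted temperature.

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical particle properties in arbitrary units.
rho = 10.0 ** rng.normal(0.0, 1.0, n)                  # lognormal gas density
temp = rho ** -0.3 * 10.0 ** rng.normal(0.0, 0.1, n)   # cooler where denser
mass = np.ones(n)                                      # equal-mass particles


def xray_properties(hot):
    """Bremsstrahlung-like L_X and emission-weighted T of the selected hot gas."""
    emissivity = mass[hot] * rho[hot] * np.sqrt(temp[hot])  # ~ rho^2 * V * sqrt(T)
    lum = emissivity.sum()
    t_ew = (emissivity * temp[hot]).sum() / lum
    return lum, t_ew


# Entropy proxy s ~ T / rho^(2/3); radiative cooling removes low-s gas first.
entropy = temp / rho ** (2.0 / 3.0)

all_hot = np.ones(n, dtype=bool)
lx_nocool, t_nocool = xray_properties(all_hot)

# Drop the 10 per cent lowest-entropy particles from the hot phase, as if they
# had cooled out and decoupled from the X-ray-emitting medium.
cooled = entropy < np.quantile(entropy, 0.10)
lx_cool, t_cool = xray_properties(~cooled)

print(f"L_X lower by a factor of {lx_nocool / lx_cool:.1f}; "
      f"emission-weighted T higher by {100 * (t_cool / t_nocool - 1):.0f} per cent")
```

Because the assumed emissivity scales with the square of the density, removing even a small mass fraction of the densest gas removes a disproportionately large share of the luminosity, which is the qualitative point the abstract makes about cooled clusters being several times less X-ray luminous.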

Authors

Pearce FR; Thomas PA; Couchman HMP; Edge AC

Journal

Monthly Notices of the Royal Astronomical Society, Vol. 317, No. 4, pp. 1029–1040

Publisher

Oxford University Press (OUP)

Publication Date

October 1, 2000

DOI

10.1046/j.1365-8711.2000.03773.x

ISSN

0035-8711
