Oxford University Press (OUP), Monthly Notices of the Royal Astronomical Society, 2020
Abstract

It has recently been suggested that (i) nuclear rings in barred galaxies (including our own Milky Way) form at the radius where the shear parameter of the rotation curve reaches a minimum; (ii) the acoustic instability of Montenegro et al. is responsible for driving the turbulence and angular momentum transport in the central regions of barred galaxies. Here we test these suggestions by running simple hydrodynamical simulations in a logarithmic barred potential. Since the rotation curve of this potential is scale-free, the shear-minimum theory predicts that no ring should form. We find that, in contrast to this prediction, a ring does form in the simulation, with morphology consistent with that of nuclear rings in real barred galaxies. This proves that the presence of a shear minimum is not a necessary condition for the formation of a ring. We also find that perturbations that are predicted to be acoustically unstable wind up and eventually propagate off to infinity, so that the system is actually stable. We conclude that (i) the shear-minimum theory is an unlikely mechanism for the formation of nuclear rings in barred galaxies; (ii) the acoustic instability is a spurious result and may not be able to drive turbulence in the interstellar medium, at least in the case without self-gravity. The question of the role of turbulent viscosity remains open.
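The scale-free argument in the abstract can be made explicit with a short worked example. The sketch below assumes the axisymmetric part of a standard logarithmic potential and one common definition of the shear parameter (the paper itself may use a different but equivalent convention):

```latex
% Axisymmetric part of a logarithmic potential with circular speed v_0:
\[
\Phi(R) = \frac{v_0^2}{2}\,\ln R^2 = v_0^2 \ln R .
\]
% The circular velocity follows from v_c^2 = R\, d\Phi/dR:
\[
v_c^2(R) = R\,\frac{d\Phi}{dR} = v_0^2
\quad\Longrightarrow\quad
v_c(R) = v_0 = \text{const},
\]
% i.e. the rotation curve is flat (scale-free). With the angular
% frequency \Omega = v_c/R = v_0/R, a common shear parameter is
\[
\Gamma \equiv -\frac{d\ln\Omega}{d\ln R} = 1 \quad \text{at every } R,
\]
% so the shear is the same at all radii: there is no preferred radius
% at which a shear minimum could select the location of a ring.
```

This is why a ring forming in such a potential is a clean counterexample: the shear-minimum theory has no radius to point to, yet the simulation produces a ring anyway.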