We consider a family of globally stationary (horizonless), asymptotically
flat soliton solutions of five-dimensional supergravity. We prove that the
uniform decay rate of massless linear scalar waves on these soliton spacetimes
can be no faster than inverse logarithmic in time. This slow decay can be
attributed to the stable trapping of null geodesics. Our proof rests on the
construction of quasimodes: time-periodic approximate solutions to the wave
equation.
It adapts previous work establishing an analogous result for Kerr-AdS
black holes \cite{holzegel:2013kna}. We remark that this slow decay is
suggestive of an instability at the nonlinear level.
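For orientation, here is a minimal sketch of the quasimode mechanism; the
notation ($u_m$, $\omega_m$, the constants $C$, $p$, and the energies $E$,
$E_k$) is illustrative and not taken from the body of the paper. A quasimode
is a high-frequency, separated approximate solution
\begin{equation*}
\psi_m(t,x) = e^{-i\omega_m t}\, u_m(x), \qquad \omega_m \sim m, \qquad
\Box_g \psi_m = O\!\big(e^{-Cm}\big),
\end{equation*}
with $u_m$ localized near the stably trapped null geodesics. Since the error
is exponentially small in $m$, energy estimates show that the true solution
with the same data retains a fixed fraction of its energy for times
$t \lesssim e^{Cm}$. Along such a sequence of solutions, any uniform decay
estimate in terms of a higher-order initial energy $E_k[\psi](0)$ can
therefore hold at best with an inverse-logarithmic rate, schematically
\begin{equation*}
E[\psi](t) \gtrsim \big(\log(2+t)\big)^{-p}\, E_k[\psi](0)
\end{equation*}
for some $p > 0$ depending on the norms involved.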