Examining the molecular clock hypothesis for the contemporary evolution of the rabies virus.
Abstract
The molecular clock hypothesis assumes that mutations accumulate on an organism's genome at a constant rate over time, but this assumption does not always hold true. While modelling approaches exist to accommodate deviations from a strict molecular clock, assumptions about rate variation may not fully represent the underlying evolutionary processes. There is considerable variability in rabies virus (RABV) incubation periods, ranging from days to over a year, during which viral replication may be reduced. This prompts the question of whether modelling RABV on a per-infection-generation basis might be more appropriate. We investigate how variable incubation periods affect root-to-tip divergence under per-unit-time and per-generation models of mutation. Additionally, we assess how well these models represent root-to-tip divergence in time-stamped RABV sequences. We find that at low substitution rates (<1 substitution per genome per generation), divergence patterns between the two models are difficult to distinguish, while above this threshold differences become apparent across a range of sampling rates. Using a Tanzanian RABV dataset, we estimate the mean substitution rate to be 0.17 substitutions per genome per generation. At RABV's substitution rate, the per-generation substitution model is unlikely to represent rabies evolution substantially differently from the molecular clock model when examining contemporary outbreaks; over enough generations for any divergence to accumulate, extreme incubation periods average out. However, measuring substitution rates per generation holds potential for applications such as inferring transmission trees and predicting lineage emergence.
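The contrast between the two mutation models can be illustrated with a small simulation. The sketch below is not the authors' code: the lognormal incubation distribution, its 25-day mean, the chain length, and the chain count are illustrative assumptions; only the 0.17 substitutions per genome per generation figure comes from the abstract. It draws variable incubation periods for each infection generation, then accumulates mutations either in proportion to calendar time (strict clock) or to the number of generations.

```python
# Illustrative sketch (not the paper's code) comparing per-unit-time and
# per-generation mutation models under variable incubation periods.
# Assumed parameters: lognormal incubation with 25-day mean, 50 generations
# per chain, 1000 chains. The 0.17 subs/genome/generation rate is the
# Tanzanian RABV estimate quoted in the abstract.
import numpy as np

rng = np.random.default_rng(42)

MEAN_INCUBATION_DAYS = 25.0          # assumed mean; RABV incubation is highly variable
SIGMA = 1.0                          # assumed lognormal shape (heavy right tail)
MU = np.log(MEAN_INCUBATION_DAYS) - SIGMA**2 / 2  # so E[incubation] = 25 days
N_GENERATIONS = 50                   # infection generations per transmission chain
N_CHAINS = 1000                      # number of simulated chains
RATE_PER_GEN = 0.17                  # substitutions/genome/generation (from abstract)
RATE_PER_DAY = RATE_PER_GEN / MEAN_INCUBATION_DAYS  # matched per-unit-time rate

def simulate_chain(rng):
    """Return (elapsed days, per-time mutations, per-generation mutations)."""
    # Incubation periods range from days to months under this distribution.
    incubation = rng.lognormal(mean=MU, sigma=SIGMA, size=N_GENERATIONS)
    elapsed = incubation.sum()
    # Per-unit-time model: mutations proportional to calendar time elapsed.
    muts_time = rng.poisson(RATE_PER_DAY * elapsed)
    # Per-generation model: mutations proportional to the number of infections.
    muts_gen = rng.poisson(RATE_PER_GEN * N_GENERATIONS)
    return elapsed, muts_time, muts_gen

results = np.array([simulate_chain(rng) for _ in range(N_CHAINS)])
days, muts_time, muts_gen = results.T

print(f"mean elapsed time: {days.mean():.0f} days")
print(f"root-to-tip divergence, per-time model:       "
      f"{muts_time.mean():.2f} (sd {muts_time.std():.2f})")
print(f"root-to-tip divergence, per-generation model: "
      f"{muts_gen.mean():.2f} (sd {muts_gen.std():.2f})")
```

Because the per-day rate is matched to the per-generation rate, both models share the same expected divergence (0.17 × 50 = 8.5 substitutions here); they differ mainly in variance, since the per-time model inherits extra spread from the random incubation periods. At RABV's low per-generation rate this difference is modest, consistent with the abstract's conclusion that extreme incubation periods average out over many generations.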