Conference

Decoder Transformer for Temporally-Embedded Health Outcome Predictions

Abstract

Deep learning models are increasingly being used to predict patients’ diagnoses by analyzing electronic health records. Medical records represent observations of a patient’s health over time. A commonly used approach to analyzing health records is to encode them as a sequence of ordered diagnoses (diagnostic-level encoding). Transformer models then analyze the sequence of diagnoses to learn disease patterns. However, the elapsed time between medical visits is not considered when transformers are used to analyze health records. In this paper, we present DTTHRE: a Decoder Transformer for Temporally-Embedded Health Records Encoding that predicts patients’ diagnoses by analyzing their medical histories. In DTTHRE, instead of diagnostic-level encoding, we propose an encoding representation for health records called THRE: Temporally-Embedded Health Records Encoding. THRE encodes patient histories as a sequence of medical events such as age, sex, and diagnostic embeddings while incorporating the elapsed time between visits. We evaluate a proof-of-concept DTTHRE on a real-world medical dataset and compare our model’s performance to an existing diagnostic transformer model in the literature. DTTHRE successfully predicted patients’ final diagnoses on the medical dataset with improved predictive performance (78.54 ± 0.22%) compared to the existing model in the literature (40.51 ± 0.13%).
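
The sketch below illustrates the temporally-embedded encoding idea described in the abstract: each diagnostic code is paired with patient context (age, sex) and the elapsed time since the previous visit. The record layout, field names, and token structure here are assumptions made for illustration only, not the paper’s actual implementation.

```python
# Illustrative sketch only: a hypothetical encoding of a patient history in
# the spirit of THRE. Field names (date, age, sex, codes) and the token
# layout are assumptions, not the authors' implementation.
from datetime import date
from typing import Dict, List


def encode_history(visits: List[Dict]) -> List[Dict]:
    """Turn a chronologically ordered list of visits into a token sequence
    pairing each diagnostic code with patient context and the elapsed time
    (in days) since the previous visit."""
    tokens = []
    prev_date = None
    for visit in visits:
        elapsed = 0 if prev_date is None else (visit["date"] - prev_date).days
        for code in visit["codes"]:
            tokens.append({
                "age": visit["age"],          # patient age at this visit
                "sex": visit["sex"],          # patient sex
                "diagnosis": code,            # diagnostic code recorded at the visit
                "elapsed_days": elapsed,      # time since the previous visit
            })
        prev_date = visit["date"]
    return tokens


# Example: two visits, 120 days apart.
history = [
    {"date": date(2020, 1, 10), "age": 63, "sex": "F", "codes": ["I10"]},
    {"date": date(2020, 5, 9),  "age": 63, "sex": "F", "codes": ["E11", "I25"]},
]
print(encode_history(history))
```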

Authors

Boursalie O; Samavi R; Doyle TE

Volume

00

Pagination

pp. 1461-1467

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

December 16, 2021

DOI

10.1109/icmla52953.2021.00235

Name of conference

2021 20th IEEE International Conference on Machine Learning and Applications (ICMLA)