- ClinicalT5: A generative language model for clinical text. Q Lu, D Dou, T Nguyen. Findings of the Association for Computational Linguistics: EMNLP 2022, 5436-5443, 2022. Cited by 25.
- LNN-EL: A neuro-symbolic approach to short-text entity linking. H Jiang, S Gurajada, Q Lu, S Neelam, L Popa, P Sen, Y Li, A Gray. ACL, 2021. Cited by 24.
- Parameter-efficient domain knowledge integration from multiple sources for biomedical pre-trained language models. Q Lu, D Dou, TH Nguyen. Findings of the Association for Computational Linguistics: EMNLP 2021, 3855-3865, 2021. Cited by 23.
- Predicting patient readmission risk from medical text via knowledge graph enhanced multiview graph convolution. Q Lu, TH Nguyen, D Dou. Proceedings of the 44th International ACM SIGIR Conference on Research and …, 2021. Cited by 19.
- Learning Electronic Health Records through Hyperbolic Embedding of Medical Ontologies. Q Lu, N de Silva, S Kafle, J Cao, D Dou, TH Nguyen, P Sen, B Hailpern, et al. Proceedings of the 10th ACM International Conference on Bioinformatics …, 2019. Cited by 19*.
- Textual data augmentation for patient outcomes prediction. Q Lu, D Dou, TH Nguyen. 2021 IEEE International Conference on Bioinformatics and Biomedicine (BIBM) …, 2021. Cited by 12.
- Wikipedia-based entity semantifying in open information extraction. Q Lu, Y Du. 2017 14th IAPR International Conference on Document Analysis and Recognition …, 2017. Cited by 7.
- Exploiting node content for multiview graph convolutional network and adversarial regularization. Q Lu, N De Silva, D Dou, TH Nguyen, P Sen, B Reinwald, Y Li. Proceedings of the 28th International Conference on Computational …, 2020. Cited by 6*.
- Cross-lingual short-text entity linking: Generating features for neuro-symbolic methods. Q Lu, S Gurajada, P Sen, L Popa, D Dou, T Nguyen. Proceedings of the Fourth Workshop on Data Science with Human-in-the-Loop …, 2022. Cited by 3.
- Advancing Clinical Natural Language Processing through Knowledge-Infused Language Models. Q Lu. University of Oregon, 2023.
- Exploring Clinical NLP with Pre-trained Language Models. Q Lu.