Nikhil Kumar Lakumarapu
Samsung Research, Seoul
Verified email at samsung.com
Title · Cited by · Year
End-end speech-to-text translation with modality agnostic meta-learning
S Indurthi, H Han, NK Lakumarapu, B Lee, I Chung, S Kim, C Kim
ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and …, 2020
Cited by 55 · 2020
Task aware multi-task learning for speech to text tasks
S Indurthi, MA Zaidi, NK Lakumarapu, B Lee, H Han, S Ahn, S Kim, C Kim, ...
ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and …, 2021
Cited by 19 · 2021
End-to-End Simultaneous Translation System for IWSLT2020 Using Modality Agnostic Meta-Learning
HJ Han, MA Zaidi, SR Indurthi, NK Lakumarapu, B Lee, S Kim
Proceedings of the 17th International Conference on Spoken Language …, 2020
Cited by 15 · 2020
Data efficient direct speech-to-text translation with modality agnostic meta-learning
S Indurthi, H Han, NK Lakumarapu, B Lee, I Chung, S Kim, C Kim
arXiv preprint arXiv:1911.04283, 2019
Cited by 15 · 2019
Language Model Augmented Monotonic Attention for Simultaneous Translation
SR Indurthi, MA Zaidi, NK Lakumarapu, B Lee, S Kim
Proceedings of the 2022 Conference of the North American Chapter of the …, 2022
Cited by 6 · 2022
Faster re-translation using non-autoregressive model for simultaneous neural machine translation
H Han, S Indurthi, MA Zaidi, NK Lakumarapu, B Lee, S Kim, C Kim, ...
arXiv preprint arXiv:2012.14681, 2020
Cited by 5 · 2020
Decision attentive regularization to improve simultaneous speech translation systems
MA Zaidi, B Lee, S Kim, C Kim
arXiv preprint arXiv:2110.15729, 2021
Cited by 3 · 2021
End-to-end offline speech translation system for IWSLT 2020 using modality agnostic meta-learning
NK Lakumarapu, B Lee, SR Indurthi, HJ Han, MA Zaidi, S Kim
Proceedings of the 17th International Conference on Spoken Language …, 2020
Cited by 3 · 2020
Electronic device and method for controlling the electronic device thereof
SR Indurthi, HAN Hyojung, LEE Beomseok, I Chung, NK Lakumarapu
US Patent 11,551,675, 2023
Cited by 1 · 2023
Infusing future information into monotonic attention through language models
SR Indurthi, MA Zaidi, B Lee, NK Lakumarapu, S Kim
Cited by 1 · 2021
Infusing future information into monotonic attention through language models
MA Zaidi, S Indurthi, B Lee, NK Lakumarapu, S Kim
arXiv preprint arXiv:2109.03121, 2021
Cited by 1 · 2021
Learning without Forgetting: Task Aware Multitask Learning for Multi-Modality Tasks
SR Indurthi, MA Zaidi, NK Lakumarapu, B Lee, HJ Han, S Kim, I Hwang
2020