Hao Yu
Verified email at lamda.nju.edu.cn - Homepage
Title · Cited by · Year
A unified pruning framework for vision transformers
H Yu, J Wu
Science China Information Sciences 66 (7), 179101, 2023
Cited by 57 · 2023
Training vision transformers with only 2040 images
YH Cao, H Yu, J Wu
European Conference on Computer Vision, 220-237, 2022
Cited by 52 · 2022
Mixup without hesitation
H Yu, H Wang, J Wu
Image and Graphics: 11th International Conference, ICIG 2021, Haikou, China …, 2021
Cited by 32 · 2021
Compressing transformers: features are low-rank, but weights are not!
H Yu, J Wu
Proceedings of the AAAI Conference on Artificial Intelligence 37 (9), 11007 …, 2023
Cited by 22 · 2023
Fast k-means clustering with Anderson acceleration
J Zhang, Y Yao, Y Peng, H Yu, B Deng
arXiv preprint arXiv:1805.10638, 2018
Cited by 10 · 2018
Effectively Compress KV Heads for LLM
H Yu, Z Yang, S Li, Y Li, J Wu
arXiv preprint arXiv:2406.07056, 2024
Cited by 5 · 2024
Reviving Undersampling for Long-Tailed Learning
H Yu, Y Du, J Wu
arXiv preprint arXiv:2401.16811, 2024
Cited by 1 · 2024
Quantization without Tears
M Fu, H Yu, J Shao, J Zhou, K Zhu, J Wu
arXiv preprint arXiv:2411.13918, 2024
2024
Unified Low-rank Compression Framework for Click-through Rate Prediction
H Yu, M Fu, J Ding, Y Zhou, J Wu
Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and …, 2024
2024