Anirudh Ravula
Google Research
Big Bird: Transformers for Longer Sequences
M Zaheer, G Guruganesh, KA Dubey, J Ainslie, C Alberti, S Ontanon, ...
34th Conference on Neural Information Processing Systems (NeurIPS 2020), 2020
ETC: Encoding long and structured inputs in transformers
J Ainslie, S Ontanon, C Alberti, V Cvicek, Z Fisher, P Pham, A Ravula, ...
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020
RealFormer: Transformer Likes Residual Attention
R He, A Ravula, B Kanagal, J Ainslie
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021
WebFormer: The Web-page Transformer for Structure Information Extraction
Q Wang, Y Fang, A Ravula, F Feng, X Quan, D Liu
Proceedings of the 31st International Conference on World Wide Web (WWW), 2022
DOCENT: Learning Self-Supervised Entity Representations from Large Document Collections
Y Zemlyanskiy, S Gandhe, R He, B Kanagal, A Ravula, J Gottweis, F Sha, ...
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL), 2021
Deep partial multiplex network embedding
Q Wang, Y Fang, A Ravula, R He, B Shen, J Wang, X Quan, D Liu
Companion Proceedings of the Web Conference 2022, 1053-1062, 2022