
Haishuo
@haishuofang
NLPer, PhD candidate at UKP Lab
ID: 850350298715443201
07-04-2017 14:11:24
7 Tweets
79 Followers
235 Following

Activation functions reduce the topological complexity of data. The best activation function may differ across models and even across layers, yet most Transformer models default to GELU. What if the model learns optimized activation functions during training? Work led by Haishuo with Ji Ung Lee and Iryna Gurevych
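One simple way to make this idea concrete (a minimal sketch, not the paper's actual method): treat the activation as a trainable mixture of fixed basis functions, with per-layer mixing weights learned alongside the rest of the model. The basis choice and weight initialization below are assumptions for illustration.

```python
import math

def gelu(x):
    # Tanh approximation of GELU, the common Transformer default.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def relu(x):
    return max(0.0, x)

def learnable_activation(x, weights):
    """Weighted mix of basis activations.

    `weights` is a per-layer trainable parameter vector; during training it
    would be updated by gradient descent like any other model parameter,
    letting each layer settle on its own activation shape.
    """
    basis = [gelu(x), relu(x), math.tanh(x)]
    return sum(w * b for w, b in zip(weights, basis))

# Example: initialize a layer's weights to favor GELU, then let training adjust them.
w = [0.8, 0.1, 0.1]
y = learnable_activation(1.0, w)
```

With weights fixed at `[1, 0, 0]` this reduces exactly to GELU, so the learned activation can only improve on (or match) the standard choice per layer.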

Super excited to have a second paper at #ACL2023NLP!! 🎉 In this paper, we propose a new method to teach NLP in a more interactive way using the SQuARE project. Super happy to contribute to improving NLP teaching :)



»DARA: Decomposition-Alignment-Reasoning Autonomous Language Agent for Question Answering over Knowledge Graphs« by Haishuo (UKP Lab), Xiaodan Zhu (ECE Queens / Ingenuity Labs) & Iryna Gurevych (15/🧵) #ACL2024NLP 📑 arxiv.org/abs/2406.07080


Meet our fellow researchers representing UKP Lab at this year's @ACLmeeting: Iryna Gurevych, Qian Ruan, Justus-Jonas Erker 🇪🇺 @ ACL2024 🇹🇭, Indraneil Paul, Fengyu Cai, Sheng Lu, Haishuo, Furkan Şahinuç, Kexin Wang, Haau-Sing Li 李 效丞, Andreas Waldis @EMNLP2024, and a very special guest from BathNLP, @Harish! #ACL2024NLP


1/12 📢 New paper alert: “DOCE: Finding the Sweet Spot for Execution-Based Code Generation”. The work is a group effort with Patrick Fernandes, Iryna Gurevych, and Andre Martins. UKP Lab ELLIS Unit Lisbon DeepSPIN 📰: arxiv.org/pdf/2408.13745
