Babak Ehteshami Bejnordi
@babakeht
Research Scientist@Qualcomm AI Research: Deep learning, Conditional computation, Model Efficiency, LLM/Vision
ID: 887514666
17-10-2012 20:38:15
107 Tweets
347 Followers
278 Following
Today we’re joined by Babak Ehteshami Bejnordi, a Research Scientist at Qualcomm (@QCOMResearch), to discuss a few papers, including 'Conditional Channel Gated Networks for Task-Aware Continual Learning,' from last week's CVPR conference. twimlai.com/twiml-talk-385…
I was interviewed by The TWIML AI Podcast and we discussed our recent work at Qualcomm #AI Research on conditional computation using gated neural nets. Thank you, Sam Charrington! Here is the link: twimlai.com/twiml-talk-385…
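For readers unfamiliar with channel gating, below is a minimal PyTorch sketch of the general idea behind conditional computation with gated layers: a lightweight head predicts per-example, per-channel binary gates (relaxed with a Gumbel-sigmoid during training) that switch convolution channels on or off. Class and parameter names are illustrative, not taken from the paper.

```python
# A minimal sketch of conditional channel gating (illustrative names,
# not the paper's exact implementation): a lightweight head predicts,
# per example, which output channels of a conv layer to keep, relaxed
# with a Gumbel-sigmoid during training so selection stays differentiable.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelGate(nn.Module):
    """Predicts per-example, per-channel binary gates from the input features."""

    def __init__(self, in_channels: int, out_channels: int, tau: float = 2.0 / 3.0):
        super().__init__()
        self.tau = tau
        # Tiny gating head: global average pooling followed by one linear layer
        # producing a single logit per gated output channel.
        self.logits = nn.Linear(in_channels, out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        pooled = x.mean(dim=(2, 3))                      # (batch, in_channels)
        logits = self.logits(pooled)                     # (batch, out_channels)
        if self.training:
            # Relaxed Bernoulli sample per channel (Gumbel-sigmoid trick).
            u = torch.rand_like(logits).clamp(1e-6, 1 - 1e-6)
            noise = torch.log(u) - torch.log1p(-u)       # Logistic(0, 1) noise
            return torch.sigmoid((logits + noise) / self.tau)
        # Hard decisions at inference: gated-off channels need not be computed.
        return (logits > 0).float()


class GatedConvBlock(nn.Module):
    """Conv block whose output channels are switched on or off per input."""

    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1)
        self.gate = ChannelGate(in_channels, out_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = self.gate(x)                                  # (batch, out_channels)
        y = F.relu(self.conv(x))
        return y * g[:, :, None, None]                    # zero out gated-off channels
```

At inference time the gates are hard, so channels that are switched off for a given input need not be computed at all; the relaxed gates exist only to keep training differentiable. This per-input selection is where the efficiency gain of conditional computation comes from.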
Very excited to present "Natural Graph Networks" at NeurIPS next week. We use naturality - a generalisation of equivariance from category theory - to build an equivariant CNN that works on any graph. arxiv.org/abs/2007.08349 With Taco Cohen and Max Welling.
We propose a dynamic tokenizer for ViTs, where the scale at which an image is processed varies based on the complexity of the image area. This means less compute for simple areas and more for complex, cluttered areas. Thanks to Amelie Royer, Jakob Havtorn, and Tijmen Blankevoort.
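As a rough illustration of the idea only (not the paper's actual tokenizer), the sketch below uses pixel variance as a stand-in for the complexity of an image area and splits only high-variance coarse patches into finer tokens, so simple regions contribute few tokens and cluttered regions contribute many. Function and parameter names are invented for the example.

```python
# Illustrative two-scale tokenizer sketch (not the paper's actual method):
# pixel variance of each coarse patch serves as a stand-in for "complexity",
# and only high-variance patches are split into finer tokens.
import torch


def dynamic_tokenize(image: torch.Tensor, coarse: int = 32, fine: int = 16,
                     threshold: float = 0.05):
    """Return a variable-length list of (patch, scale) tokens for one image."""
    _, h, w = image.shape
    tokens = []
    for top in range(0, h, coarse):
        for left in range(0, w, coarse):
            patch = image[:, top:top + coarse, left:left + coarse]
            if patch.var() < threshold:
                tokens.append((patch, coarse))            # simple area: one coarse token
            else:
                for dt in range(0, coarse, fine):         # cluttered area: finer tokens
                    for dl in range(0, coarse, fine):
                        tokens.append((patch[:, dt:dt + fine, dl:dl + fine], fine))
    return tokens


# A 3x224x224 image yields between 49 (all coarse) and 196 (all fine) tokens,
# so simple images consume far fewer tokens than cluttered ones.
tokens = dynamic_tokenize(torch.rand(3, 224, 224))
print(len(tokens))
```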
QIF Europe is an excellence award through which Qualcomm rewards and mentors the most innovative PhD students in Europe working on breakthrough #AI and #cybersecurity solutions. Congratulations to Tycho van der Ouderaa, Karsten Roth, Siwei Zhang, and Attri Bhattacharyya! qualcomm.com/news/releases/…
Our paper got a prize :) Cheers to lead author Johann Brehmer and fellow co-authors Sönke Behrends and Taco Cohen. Our results hint that yes, even at large scales of data and compute, if your data has symmetries, you might be better off building them into your network.