Charlie S. Burlingham (@csburlingham)'s Twitter Profile
Charlie S. Burlingham

@csburlingham

Vision Scientist, Meta Reality Labs

ID: 1385671341852999686

Website: https://csb0.github.io/ | Joined: 23-04-2021 19:06:47

130 Tweets

184 Followers

250 Following

Yasasi (@yasasi_abey)'s Twitter Profile Photo

PETMEI Workshop at ETRA 2024 kicked off with the keynote speech by Michael J. Proulx from Reality Labs at Meta. Insightful speech on pervasive eye tracking challenges for interactions in #ExtendedReality. petmei.org/2024/ #ETRA2024

Yasasi (@yasasi_abey)'s Twitter Profile Photo

Charlie S. Burlingham from Reality Labs at Meta is now presenting their paper titled "Real-World Scanpaths Exhibit Long-Term Temporal Dependencies: Considerations for Contextual AI for AR Applications" at the PETMEI Workshop at #ETRA2024. Paper: doi.org/10.1145/364990…

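For intuition about the title's claim: a signal with long-term temporal dependencies has a Hurst exponent above 0.5. Below is a minimal sketch (illustrative only, not the paper's method) that estimates the Hurst exponent of a 1-D gaze-like trace via detrended fluctuation analysis, assuming only NumPy:

import numpy as np

def dfa_hurst(x, scales):
    # Detrended fluctuation analysis: integrate the signal, linearly
    # detrend it within windows of each size, then fit the log-log
    # slope of fluctuation vs. window size. Slope ~0.5 = memoryless
    # noise; slope > 0.5 = long-range temporal dependence.
    y = np.cumsum(x - np.mean(x))
    flucts = []
    for s in scales:
        n_win = len(y) // s
        resid = []
        for i in range(n_win):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, seg, 1)
            resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(resid)))
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.standard_normal(20_000)        # memoryless control
ar1 = np.zeros(20_000)                     # temporally correlated signal
for t in range(1, 20_000):
    ar1[t] = 0.95 * ar1[t - 1] + white[t]
scales = np.array([16, 32, 64, 128, 256])
print(round(dfa_hurst(white, scales), 2))  # ~0.5
print(round(dfa_hurst(ar1, scales), 2))    # > 0.5 at these scales
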
Yasasi (@yasasi_abey)'s Twitter Profile Photo

Paper Session 1: Visual Attention at ETRA 2024 just started. Oleg Komogortsev from Reality Labs at Meta and Texas State University is now presenting their paper on "Per-Subject Oculomotor Plant Mathematical Models and the Reliability of Their Parameters" at #ETRA2024. Paper: doi.org/10.1145/3654701

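Background on the term "oculomotor plant": the eye globe plus extraocular muscles is classically approximated as a linear system with one slow and one fast time constant, driven by a pulse-step innervation signal. A toy simulation with textbook-style assumed values (not the per-subject models from the paper):

import numpy as np

TAU1, TAU2 = 0.224, 0.013   # seconds; assumed classic time constants
DT, T = 0.0005, 0.3
t = np.arange(0, T, DT)

# Pulse-step innervation for a ~10 deg saccade: a brief high "pulse"
# drives the eye quickly, then a sustained "step" holds it in place.
u = np.where((t > 0.05) & (t < 0.08), 60.0,
             np.where(t >= 0.08, 10.0, 0.0))

# Two cascaded first-order lags (Euler integration).
x1 = x2 = 0.0
pos = np.empty_like(t)
for i, ui in enumerate(u):
    x1 += DT * (ui - x1) / TAU1   # slow stage
    x2 += DT * (x1 - x2) / TAU2   # fast stage
    pos[i] = x2

print(f"final eye position: {pos[-1]:.2f} deg")  # settles near 10 deg
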
Leah Banellis (@leahbanellis)'s Twitter Profile Photo

Got Butterflies in your Stomach? 😵‍💫 I am super excited to share the first major study of my postdoc, The ECG! We report a multidimensional mental health signature of stomach-brain coupling in the largest sample to date 🧵👇 biorxiv.org/content/10.110…
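
Stomach-brain coupling is commonly quantified with a phase-coupling metric between the gastric rhythm (EGG, roughly 0.05 Hz) and a brain signal. A minimal phase-locking-value sketch on synthetic signals (illustrative only; a real pipeline would bandpass recorded data around the gastric frequency first):

import numpy as np
from scipy.signal import hilbert

fs, dur = 10.0, 600.0                  # 10 Hz sampling, 10 minutes
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(1)

gastric = np.sin(2 * np.pi * 0.05 * t)                        # ~3 cycles/min
brain = np.sin(2 * np.pi * 0.05 * t + 0.8) + 0.5 * rng.standard_normal(t.size)

# Instantaneous phase of each signal via the Hilbert transform.
phase_g = np.angle(hilbert(gastric))
phase_b = np.angle(hilbert(brain))

# PLV: magnitude of the mean phase-difference vector; 1 = perfect locking.
plv = np.abs(np.mean(np.exp(1j * (phase_g - phase_b))))
print(f"PLV = {plv:.2f}")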

Michael J. Proulx (@michaelproulx)'s Twitter Profile Photo

All-day AR would benefit from AI models that understand a person's context & eye tracking could be key for task recognition. Yet past work - including our own research.facebook.com/publications/c… - hasn't found much added value from gaze in addition to computer vision & egocentric video 2/

Airi Yoshimoto (@airisyoshimoto)'s Twitter Profile Photo

I'm very excited to share that my graduate work is now online in Science Magazine today! With generous help from my mentor Yuji Ikegaya and my amazing teammates, we investigated a top-down pathway for volitional heart rate regulation! science.org/doi/10.1126/sc…

Eiko Fried (@eikofried)'s Twitter Profile Photo

So in 2007, physicists wrote a paper that made the headlines: according to their calculations, human coin flips aren't 50/50 - more like 51/49. Why is that, and did students in Amsterdam really flip 350,000 coins to find out? 🧵
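
Why so many flips? The standard error of a sample proportion shrinks only as 1/sqrt(n), so separating 0.51 from 0.50 takes a very large n. A quick simulated check (a generic binomial setup, not the study's actual analysis):

import numpy as np

rng = np.random.default_rng(42)
p_true, n = 0.51, 350_000

flips = rng.random(n) < p_true             # simulate slightly biased flips
p_hat = flips.mean()
se = np.sqrt(p_hat * (1 - p_hat) / n)      # standard error of the proportion
print(f"estimate: {p_hat:.4f} +/- {1.96 * se:.4f} (95% CI)")

# With n = 350,000 the CI half-width is ~0.0017, comfortably resolving
# a 0.01 effect; with n = 1,000 it would be ~0.031 and hopeless.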

Charlie S. Burlingham (@csburlingham)'s Twitter Profile Photo

🎉 New paper out! We show training improves motion categorization but doesn't reduce (or even worsens) misperceptions, explained via a model combining efficient coding + implicit categorization + increased encoding precision journals.plos.org/ploscompbiol/a…
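
One ingredient of such models, sketched below: when the observer's estimate is conditioned on their own implicit category decision, estimates are repelled from the category boundary, so accurate categorization can coexist with systematic misperception. A toy self-consistent observer with hypothetical parameters (not the paper's full efficient-coding model):

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
theta, n = 2.0, 200_000   # true direction: 2 deg from a category boundary at 0

def simulate(sigma):
    m = theta + sigma * rng.standard_normal(n)   # noisy sensory measurement
    chose_pos = m > 0                            # implicit categorization
    # Self-consistent estimate: posterior mean of the direction given the
    # measurement AND the chosen category (a truncated-normal mean under a
    # flat prior), which pushes estimates away from the boundary.
    z = m / sigma
    est = np.where(chose_pos,
                   m + sigma * norm.pdf(z) / norm.cdf(z),
                   m - sigma * norm.pdf(z) / norm.cdf(-z))
    return chose_pos.mean(), est.mean() - theta  # accuracy, estimation bias

for sigma in (8.0, 4.0):   # higher encoding precision stands in for training
    acc, bias = simulate(sigma)
    print(f"sigma={sigma:.0f}: accuracy={acc:.2f}, bias={bias:+.1f} deg")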