Irmak Guzey (@irmakkguzey)'s Twitter Profile
Irmak Guzey

@irmakkguzey

PhD student at @CILVRatNYU, advised by @LerrelPinto. On a mission to make robotic hands as dexterous as human ones! 🤖✋. (she/her)

ID: 1522957351971606534

Joined: 07-05-2022 15:12:18

44 Tweets

541 Followers

194 Following

Anya Zorin (@anyazorin)

Super excited to present our open-source robot hand RUKA! I had a lot of fun working on this with Irmak Guzey and all our amazing collaborators: Billy Yan, Aadhithya, Lisa Kondrich, Nikhil Bhattasali, and Lerrel Pinto. Check out our website at ruka-hand.github.io

Soumith Chintala (@soumithchintala)

Tendon-driven 3D-printed hand from Irmak Guzey and team at the Lerrel Pinto lab. * Costs $1300 to build, with a compact, human-sized profile. * The tendons are off-the-shelf fishing line, super strong, and never break; the plastic parts break before the tendons ever do. * Mountable on

The Humanoid Hub (@thehumanoidhub)

NYU researchers have introduced RUKA, an open-source, tendon-driven robotic hand with 15 DOF that costs only $1.3k and can operate for 20 straight hours without any performance loss. It learns joint-to-actuator and fingertip-to-actuator models from motion-capture data.

Lerrel Pinto (@lerrelpinto)

So excited for this!!! The key technical breakthrough here is that we can control joints and fingertips of the robot **without joint encoders**. Learning from self-supervised data collection is all you need for training the humanoid hand control you see below.
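A minimal sketch of how such encoder-free control could work (my own illustration, not the RUKA codebase; the motor count, network sizes, and data format are assumptions): during self-supervised play the hand replays motor commands while motion capture records the resulting fingertip positions, and a small network then regresses the inverse fingertip-to-actuator mapping.

```python
# Illustrative sketch only -- not the authors' code. The 5-fingertip /
# 11-motor dimensions and the architecture are placeholder assumptions.
import torch
import torch.nn as nn

class FingertipToActuator(nn.Module):
    """Map desired fingertip positions (5 fingers x 3D) to tendon-motor commands."""
    def __init__(self, n_fingertips: int = 5, n_motors: int = 11):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_fingertips * 3, 256), nn.ReLU(),
            nn.Linear(256, 256), nn.ReLU(),
            nn.Linear(256, n_motors),
        )

    def forward(self, fingertips: torch.Tensor) -> torch.Tensor:
        # fingertips: (batch, 15) flattened mocap positions -> (batch, n_motors) commands
        return self.net(fingertips)

def train(model, loader, epochs: int = 50, lr: float = 1e-3):
    """Supervised regression on (fingertip pose, motor command) pairs logged
    while the hand replays commands during self-supervised data collection."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for fingertips, motor_cmd in loader:
            loss = nn.functional.mse_loss(model(fingertips), motor_cmd)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

At control time, a desired fingertip pose is pushed through the trained network to get motor targets directly, which is what makes per-joint encoders unnecessary.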

Irmak Guzey (@irmakkguzey)

Last week, we introduced RUKA, a fully open-source, 3D-printable humanoid hand. Today, we're excited to release the software stack as well: github.com/ruka-hand/RUKA! It comes with detailed instructions for calibration, control, teleoperation, and more! Check it out ✌️

Mahi Shafiullah 🏠🤖 (@notmahi)

Morning, #ICRA2025! Bring something small 🍋🍑 and have our Robot Utility Model pick it up at our EXPO demo today from 1-5 PM, between halls A2 and A3! Talk and poster are right before, 11:15-12:15 in room 411. Also, DM me if you want to chat about 🤖 for the messy, real world!

Irmak Guzey (@irmakkguzey)

RUKA is warming up for our EXPO demo today at ICRA with the help of our first-time teleoperators, Venkatesh and Peiqi Liu @ ICRA 2025 🫰 Come try teleoperating RUKA yourself from 1–5 PM at the exhibit hall! 🧤 For more info before coming -> ruka-hand.github.io :) #ICRA2025 Anya Zorin

Irmak Guzey (@irmakkguzey)

Using grounded keypoints in the environment (a) enables human-to-robot transfer of two-fingered gripper policies collected with **only in-the-wild human data**, and (b) yields policies that generalize both spatially and to different objects! Check out this new work by my colleagues to learn more!
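A rough sketch of the keypoint-grounding idea (my own illustration under stated assumptions, not the actual pipeline): demonstrations are stored relative to a detected object keypoint, and at execution time the same keypoint is re-detected in the robot's frame to re-ground the trajectory.

```python
# Illustrative sketch only -- keypoint detection and coordinate frames are assumptions.
import numpy as np

def relative_trajectory(hand_positions: np.ndarray, keypoint: np.ndarray) -> np.ndarray:
    """Express a demonstrated hand trajectory (T, 3) relative to an object keypoint (3,)."""
    return hand_positions - keypoint

def retarget_to_robot(rel_traj: np.ndarray, keypoint_in_robot_frame: np.ndarray) -> np.ndarray:
    """Re-ground the keypoint-relative trajectory on the keypoint the robot's camera sees."""
    return rel_traj + keypoint_in_robot_frame

# Because actions are defined relative to the grounded keypoint, moving the
# object (spatial generalization) or substituting an object with a comparable
# keypoint only changes the grounding step, not the demonstrated behavior.
```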

Irmak Guzey (@irmakkguzey)

Learning task-agnostic tactile representations is very valuable for dexterity! Check out this cool work by Akash Sharma that explores this while integrating the history of tactile information. This enables highly dexterous tasks—like plug insertion with a giant hand! 😁

Irmak Guzey (@irmakkguzey)

e-Flesh is an under-$5, 3D-printable tactile sensor that (a) can be any shape you want, (b) is robust against magnetic interference, and (c) is deformable! Led by Venkatesh and Raunaq Bhirangi. I can't wait to use it myself and see others using it. Check it out!

Yixuan Wang (@yxwangbot)

🤖 Do VLA models really listen to language instructions? Maybe not 👀 🚀 Introducing our RSS paper: CodeDiffuser -- using VLM-generated code to bridge the gap between **high-level language** and **low-level visuomotor policy** 🎮 Try the live demo: robopil.github.io/code-diffuser/ (1/9)

Raunaq Bhirangi (@raunaqmb)

Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL. 🧶⬇️