Fred Zhangzhi Peng (@pengzhangzhi1) 's Twitter Profile
Fred Zhangzhi Peng

@pengzhangzhi1

#ML & #ProteinDesign. PhD student @DukeU.

ID: 1505051724796481536

Link: https://pengzhangzhi.github.io/home/ · Joined: 19-03-2022 05:21:19

214 Tweets

455 Followers

581 Following

Pranam Chatterjee (@pranamanam) 's Twitter Profile Photo

It's tough times for science. 🥺 But we have to keep innovating to fight another day, and today I'm so proud to share <a href="/pengzhangzhi1/">Fred Zhangzhi Peng</a>'s new, groundbreaking sampling algorithm for generative language models, Path Planning (P2). 🌟

📜: arxiv.org/abs/2502.03540
💻: In the appendix!!
Fred Zhangzhi Peng (@pengzhangzhi1) 's Twitter Profile Photo

Concurrent work that studies the decoding order of MDMs, with cool results on logic puzzles. Feels amazing to see the same conclusion drawn from two works.

Biology+AI Daily (@biologyaidaily) 's Twitter Profile Photo

PTM-Mamba: a PTM-aware protein language model with bidirectional gated Mamba blocks <a href="/naturemethods/">Nature Methods</a> 

1. PTM-Mamba is the first protein language model explicitly designed to encode post-translational modifications (PTMs), using a novel bidirectional gated Mamba architecture fused
Nature Methods (@naturemethods) 's Twitter Profile Photo

PTM-Mamba: a post-translational modification-aware protein language model, for protein modeling and design. nature.com/articles/s4159…

DailyHealthcareAI (@aipulserx) 's Twitter Profile Photo

How can protein language models incorporate post-translational modifications (PTMs) to better represent the functional diversity of the proteome? <a href="/naturemethods/">Nature Methods</a> <a href="/DukeU/">Duke University</a>

"PTM-Mamba: a PTM-aware protein language model with bidirectional gated Mamba blocks"

Authors: <a href="/pengzhangzhi1/">Fred Zhangzhi Peng</a>
Jiaxin Shi (@thjashin) 's Twitter Profile Photo

We are hiring a student researcher at Google DeepMind to work on fundamental problems in discrete generative modeling! Examples of our recent work: masked diffusion: arxiv.org/abs/2406.04329 learning-order AR: arxiv.org/abs/2503.05979 If you find this interesting, please send an

Fred Zhangzhi Peng (@pengzhangzhi1) 's Twitter Profile Photo

Super excited about Singapore and ICLR 2025. Will present my work on masked diffusion models at the ICLR Nucleic Acids Workshop, #DeLTa, #FPI, and the GEMBio Workshop. Please stop by the posters and chat about MDMs and protein design :)

Fred Zhangzhi Peng (@pengzhangzhi1) 's Twitter Profile Photo

FlashAttention-accelerated ESM2 protein language models now support Hugging Face. A one-line change gives up to 70% faster inference and 60% less memory! 🧬⚡

Huggingface: huggingface.co/fredzzp/esm2_t…
Github: github.com/pengzhangzhi/f…