TL;DR: With CodeChain, a pretrained large language model (LLM) can solve challenging coding problems by integrating modularity into its generated samples and can self-improve by employing a chain of self-revisions on representative sub-modules. CodeChain can…
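To make the idea concrete, here is a minimal sketch of a CodeChain-style loop: sample modular solutions, extract their sub-modules, select representatives, and condition the next round of revisions on them. The helpers `generate_solutions`, `extract_submodules`, and `select_representatives` are hypothetical placeholders, not the paper's implementation.

```python
# Hypothetical sketch of a CodeChain-style self-revision loop.

def generate_solutions(problem: str, hints: list[str]) -> list[str]:
    """Sample candidate programs from an LLM, optionally conditioned on
    representative sub-modules from the previous round (placeholder)."""
    return [f"# candidate solution for: {problem} (reused sub-modules: {len(hints)})"]

def extract_submodules(solutions: list[str]) -> list[str]:
    """Split each sampled program into its sub-modules (placeholder)."""
    return [line for sol in solutions for line in sol.splitlines()]

def select_representatives(submodules: list[str], k: int = 3) -> list[str]:
    """Cluster sub-modules and keep one representative per cluster (placeholder)."""
    return submodules[:k]

def codechain(problem: str, rounds: int = 3) -> list[str]:
    hints: list[str] = []
    solutions: list[str] = []
    for _ in range(rounds):
        # 1. Ask the model for modular solutions, reusing the selected sub-modules.
        solutions = generate_solutions(problem, hints)
        # 2. Extract sub-modules and pick representatives to guide the next revision.
        hints = select_representatives(extract_submodules(solutions))
    return solutions

if __name__ == "__main__":
    print(codechain("Reverse a singly linked list."))
```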
TL;DR: We adapted our protein language model ProGen to optimize antibodies that bind to a protein called “CD40L”, a critical target for autoimmune disorders. We tested our AI-designed antibodies in the laboratory…
Other authors include Can Qin, Stefano Ermon, and Yun Fu. GlueGen was accepted at ICCV. In the rapidly advancing field of text-to-image synthesis, the remarkable progress in generating lifelike images from textual prompts has…
TL;DR: Generative AI methods for image generation have a wide variety of potential applications in marketing, sales, and e-commerce. With these applications in mind, the Salesforce Research team has developed several techniques based…
TL;DR: PyRCA is an open-source machine learning library specifically designed for conducting Root Cause Analysis (RCA) in IT operations. It offers a comprehensive framework that allows users to easily identify the complicated metric…
Equal contribution between Erik Nijkamp and Hiroaki Hayashi. The family of Salesforce CodeGen models is growing with CodeGen2.5 – a small, but mighty model! While there has been a recent trend…
TL;DR: We trained a series of 7B LLMs named XGen-7B with standard dense attention on sequence lengths of up to 8K for up to 1.5T tokens. We also fine-tuned the models on public-domain…
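A minimal usage sketch with the Hugging Face transformers library, assuming the public 8K-context base checkpoint is hosted under an ID like `Salesforce/xgen-7b-8k-base` (the exact repository name is an assumption):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "Salesforce/xgen-7b-8k-base"  # assumed repository name

# XGen ships a custom tokenizer, so trust_remote_code is needed to load it.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)

inputs = tokenizer("Salesforce AI Research is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```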
When you combine the linguistic fluency of an LLM with the ability to accomplish tasks and make decisions independently, generative AI is elevated to an active partner in getting work done.
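As a toy illustration of that combination, the sketch below pairs a stubbed LLM call with a small tool registry and lets the model's replies decide which action runs next. `call_llm`, the `ACTION:`/`FINAL:` protocol, and the tools are hypothetical stand-ins, not a specific product or framework.

```python
def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real system would query a hosted model."""
    return 'ACTION: search("quarterly revenue")'

# Tools the agent is allowed to invoke on its own (placeholder implementations).
TOOLS = {
    "search": lambda query: f"Top result for {query!r}",
}

def run_agent(task: str, max_steps: int = 5) -> str:
    history = f"Task: {task}"
    for _ in range(max_steps):
        reply = call_llm(history)
        if reply.startswith("FINAL:"):
            # The model decided it is done and answers in natural language.
            return reply.removeprefix("FINAL:").strip()
        if reply.startswith("ACTION:"):
            # The model chose a tool; execute it and feed the observation back.
            name, arg = reply.removeprefix("ACTION:").strip().split("(", 1)
            result = TOOLS[name](arg.rstrip(")").strip('"'))
            history += f"\nObservation: {result}"
    return "Stopped after reaching the step limit."

print(run_agent("Summarize last quarter's revenue."))
```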