Showing 1–11 of 11 results
Author filter: Anima Anandkumar
  • Enzymes are highly selective and sustainable catalysts for chemical synthesis, but their optimization is often limited by the difficulty of identifying functional starting points. This study shows that using the GenSLM protein language model to design TrpB variants can yield stable, active enzymes with broad substrate promiscuity, outperforming natural and evolved counterparts and demonstrating the potential of generative models to accelerate biocatalyst discovery (a minimal protein-language-model scoring sketch follows the results list).

    • Théophile Lambert
    • Amin Tavakoli
    • Frances H. Arnold
    Research (Open Access)
    Nature Communications
    P: 1-12
  • Shengchao Liu et al. present ProteinDT, a deep learning approach that incorporates domain knowledge from textual descriptions into protein representations at scale.

    • Shengchao Liu
    • Yanjing Li
    • Anima Anandkumar
    Research
    Nature Machine Intelligence
    Volume: 7, P: 580-591
  • Combinatorial optimization problems might be solved more efficiently with quantum computing, but near-term quantum optimization solvers are limited by the number of available qubits. Here, the authors propose and demonstrate a variational quantum algorithm that polynomially reduces the qubit overhead for applications such as MaxCut (the classical MaxCut objective is sketched after the results list).

    • Marco Sciorilli
    • Lucas Borges
    • Leandro Aolita
    Research (Open Access)
    Nature Communications
    Volume: 16, P: 1-9
  • Machine learning methods in cheminformatics have made great progress in exploiting the chemical structures of molecules, but the textual information attached to those molecules remains largely unexplored. Liu and colleagues trained MoleculeSTM, a foundation model that aligns the structure and text modalities through contrastive learning, and show its utility on the downstream tasks of structure–text retrieval, text-guided editing and molecular property prediction (a minimal contrastive-alignment sketch follows the results list).

    • Shengchao Liu
    • Weili Nie
    • Animashree Anandkumar
    Research
    Nature Machine Intelligence
    Volume: 5, P: 1447-1457
  • Recent deep learning-based methods have greatly advanced protein structure prediction, but proteins interact with their environment and can change shape drastically when binding to ligand molecules. To predict the 3D structure of these combined protein–ligand complexes, Qiao et al. developed a generative diffusion model with biophysical constraints and geometric deep learning (a generic denoising-step sketch follows the results list).

    • Zhuoran Qiao
    • Weili Nie
    • Animashree Anandkumar
    Research
    Nature Machine Intelligence
    Volume: 6, P: 195-208
  • Neural operators learn mappings between functions on continuous domains, such as spatiotemporal processes and partial differential equations, offering fast, data-driven surrogates for otherwise intractable numerical simulations of complex real-world problems (a minimal Fourier-layer sketch follows the results list).

    • Kamyar Azizzadenesheli
    • Nikola Kovachki
    • Anima Anandkumar
    Reviews
    Nature Reviews Physics
    Volume: 6, P: 320-328
  • This Review examines the advances in artificial intelligence over the past decade, discussing how AI systems can aid the scientific process and the central issues that remain open.

    • Hanchen Wang
    • Tianfan Fu
    • Marinka Zitnik
    Reviews
    Nature
    Volume: 620, P: 47-60
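
The TrpB entry above describes designing enzyme variants with a protein language model. Purely as an illustration of one common ingredient of such pipelines (not the authors' GenSLM-specific workflow), the sketch below scores a candidate variant by its log-likelihood under an autoregressive protein language model; the `model` interface, tokenization and shapes are assumptions.

```python
import torch
import torch.nn.functional as F

def variant_log_likelihood(model, tokens):
    """Sum of per-residue log-probabilities under an autoregressive protein
    language model (assumption: model(tokens) returns logits of shape
    (1, seq_len, vocab_size)). Higher scores suggest more natural-looking
    variants and can be used to rank candidates before lab testing."""
    with torch.no_grad():
        logits = model(tokens)                              # (1, L, V)
        log_probs = F.log_softmax(logits[:, :-1], dim=-1)   # predict residue i+1 from its prefix
        targets = tokens[:, 1:].unsqueeze(-1)               # next-residue targets
        token_ll = log_probs.gather(-1, targets).squeeze(-1)
        return token_ll.sum().item()
```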
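The variational quantum algorithm by Sciorilli et al. targets MaxCut; their qubit-reduction encoding is specific to the paper. As a reminder of the classical objective such solvers optimize, here is the cut value of a candidate ±1 vertex assignment.

```python
import numpy as np

def maxcut_value(adjacency, assignment):
    """Total weight of edges whose endpoints lie on opposite sides of the cut.

    adjacency:  (n, n) symmetric weight matrix (0 where there is no edge)
    assignment: length-n vector of +1/-1 vertex labels
    """
    s = np.asarray(assignment, dtype=float)
    # (1 - s_i * s_j) / 2 equals 1 exactly when edge (i, j) is cut;
    # the extra factor 1/2 compensates for summing over both (i, j) and (j, i).
    return 0.25 * float(np.sum(adjacency * (1.0 - np.outer(s, s))))
```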
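MoleculeSTM (and, in a similar spirit, ProteinDT) aligns a structure encoder with a text encoder through contrastive learning. The snippet below is a minimal, generic InfoNCE-style alignment loss, not the published training code; the encoders, projection dimensions and temperature are placeholders.

```python
import torch
import torch.nn.functional as F

def contrastive_alignment_loss(struct_emb, text_emb, temperature=0.1):
    """InfoNCE-style loss that pulls paired (structure, text) embeddings
    together and pushes mismatched pairs apart.

    struct_emb, text_emb: (batch, dim) tensors from a structure encoder
    (e.g. a graph network) and a text encoder (e.g. a language model).
    """
    s = F.normalize(struct_emb, dim=-1)   # L2-normalize so similarity is cosine
    t = F.normalize(text_emb, dim=-1)

    # (batch, batch) similarity matrix; diagonal entries are the true pairs
    logits = s @ t.T / temperature
    targets = torch.arange(s.size(0), device=s.device)

    # Symmetric cross-entropy: structure-to-text and text-to-structure
    loss_s2t = F.cross_entropy(logits, targets)
    loss_t2s = F.cross_entropy(logits.T, targets)
    return 0.5 * (loss_s2t + loss_t2s)
```

Once trained, retrieval and text-guided editing operate in this shared embedding space by comparing cosine similarities between structures and prompts.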
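The protein–ligand entry by Qiao et al. uses a generative diffusion model with biophysical constraints and geometric deep learning; that architecture is specific to the paper. The sketch below only shows a generic DDPM-style reverse (denoising) step on atomic coordinates, assuming the noise schedule and the noise-prediction network are given.

```python
import torch

def ddpm_reverse_step(x_t, t, eps_pred, alphas, alphas_cumprod):
    """One generic DDPM-style denoising step on 3D coordinates.

    x_t:      (n_atoms, 3) noisy coordinates at step t
    eps_pred: the network's noise prediction for x_t (same shape)
    alphas, alphas_cumprod: standard DDPM noise-schedule tensors
    """
    alpha_t = alphas[t]
    abar_t = alphas_cumprod[t]
    # Posterior mean of x_{t-1} given x_t and the predicted noise
    mean = (x_t - (1 - alpha_t) / torch.sqrt(1 - abar_t) * eps_pred) / torch.sqrt(alpha_t)
    if t > 0:
        noise = torch.randn_like(x_t)
        sigma_t = torch.sqrt(1 - alpha_t)   # simple choice of posterior scale
        return mean + sigma_t * noise
    return mean
```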
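The neural-operator review covers a family of architectures; the Fourier neural operator is one of the best-known instances. The layer below is a minimal 1D spectral-convolution sketch of that idea (learned channel mixing on the lowest Fourier modes), not code from the review; the shapes and the number of retained modes are illustrative.

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Minimal 1D Fourier layer: transform to Fourier space, mix channels on
    the lowest `modes` frequencies with learned complex weights, transform back."""

    def __init__(self, in_channels, out_channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / (in_channels * out_channels)
        self.weights = torch.nn.Parameter(
            scale * torch.randn(in_channels, out_channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x):                        # x: (batch, channels, n_grid)
        x_ft = torch.fft.rfft(x)                 # to Fourier space
        out_ft = torch.zeros(
            x.size(0), self.weights.size(1), x_ft.size(-1),
            dtype=torch.cfloat, device=x.device,
        )
        # learned linear mix of channels, applied mode-by-mode
        out_ft[:, :, : self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, : self.modes], self.weights
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to the grid
```

Because the weights act on frequencies rather than grid points, the same trained layer can be evaluated on finer or coarser discretizations, which is what makes such operators attractive as simulation surrogates.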