Showing 1–7 of 7 results
Advanced filters: Author: Maosong Sun
  • Xiao et al. introduce ‘capability density’, defined as capability per parameter, as a metric for evaluating large language models. They report an empirical trend, the ‘densing law’: capability density doubles approximately every 3.5 months, so equivalent model performance can be achieved with exponentially fewer parameters over time (a toy illustration of this arithmetic follows the results list).

    • Chaojun Xiao
    • Jie Cai
    • Maosong Sun
    Research | Open Access
    Nature Machine Intelligence
    Volume 7, pages 1823–1833
  • Primary biliary cholangitis is an autoimmune liver disease. Here, the authors show that variants in interleukin genes that potentially deregulate their expression are associated with the condition, and suggest that the IL21 signalling pathway may play a role in disease aetiology.

    • Fang Qiu
    • Ruqi Tang
    • Xiong Ma
    Research | Open Access
    Nature Communications
    Volume 8, pages 1–8
  • To accelerate the biomedical research process, deep-learning systems have been developed that automatically acquire knowledge about molecular entities by reading large-scale biomedical data. Inspired by humans, who learn deep molecular knowledge from both molecular structure and biomedical text, the authors propose a machine reading system that bridges the two types of information (a generic sketch of such modality bridging follows the results list).

    • Zheni Zeng
    • Yuan Yao
    • Maosong Sun
    Research | Open Access
    Nature Communications
    Volume 13, pages 1–11
  • Training a deep neural network can be costly, but training time is reduced when a pre-trained network can be adapted to different use cases. Ideally, only a small number of parameters need to be changed during this fine-tuning, and the resulting update can then be distributed more easily. In this Analysis, methods that fine-tune only a small number of parameters are compared on a large set of natural language processing tasks (a minimal sketch of one such method follows the results list).

    • Ning Ding
    • Yujia Qin
    • Maosong Sun
    Research | Open Access
    Nature Machine Intelligence
    Volume 5, pages 220–235
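A toy illustration of the densing-law arithmetic reported by Xiao et al.: if capability density (capability per parameter) doubles every 3.5 months, the parameter count needed to match a fixed capability halves on the same schedule. The 3.5-month period comes from the summary above; the baseline model size is hypothetical.

```python
# Toy illustration of the reported "densing law": capability density doubles
# roughly every 3.5 months, so the parameters needed for equivalent
# capability halve on the same schedule. The 70B baseline is hypothetical.

DOUBLING_PERIOD_MONTHS = 3.5  # doubling period reported for capability density

def equivalent_params(baseline_params: float, months_elapsed: float) -> float:
    """Parameters needed to match a baseline model's capability after a delay."""
    return baseline_params / 2 ** (months_elapsed / DOUBLING_PERIOD_MONTHS)

if __name__ == "__main__":
    baseline = 70e9  # hypothetical 70-billion-parameter baseline
    for months in (0.0, 3.5, 7.0, 14.0):
        params = equivalent_params(baseline, months)
        print(f"after {months:>4.1f} months: ~{params / 1e9:.1f}B parameters")
```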
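A generic sketch of bridging molecular-structure and biomedical-text information, in the spirit of the Zeng et al. entry above. This is not the paper's actual architecture: the precomputed encoder outputs, layer sizes, and fusion-by-concatenation design are all illustrative assumptions.

```python
# Generic modality-bridging sketch (not the paper's architecture): project
# precomputed structure and text embeddings into a shared space, fuse them,
# and predict from the fused vector. All dimensions here are illustrative.
import torch
import torch.nn as nn

class DualModalityReader(nn.Module):
    def __init__(self, struct_dim=128, text_dim=768, shared_dim=256, n_labels=2):
        super().__init__()
        self.struct_proj = nn.Linear(struct_dim, shared_dim)  # structure -> shared space
        self.text_proj = nn.Linear(text_dim, shared_dim)      # text -> shared space
        self.head = nn.Linear(2 * shared_dim, n_labels)       # classify fused vector

    def forward(self, struct_emb, text_emb):
        fused = torch.cat([torch.relu(self.struct_proj(struct_emb)),
                           torch.relu(self.text_proj(text_emb))], dim=-1)
        return self.head(fused)

model = DualModalityReader()
logits = model(torch.randn(4, 128), torch.randn(4, 768))  # batch of 4 molecules
print(logits.shape)  # torch.Size([4, 2])
```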
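A minimal sketch of parameter-efficient fine-tuning in the spirit of the methods the Ding et al. Analysis compares, shown here as a LoRA-style low-rank update (one of several families of such methods; the rank and layer sizes are illustrative). The pretrained weights stay frozen, and only the small low-rank factors are trained, so only those factors need to be distributed.

```python
# Minimal parameter-efficient fine-tuning sketch: a LoRA-style low-rank
# update on a frozen pretrained linear layer. Rank and sizes are illustrative.
import torch
import torch.nn as nn

class LowRankAdapter(nn.Module):
    def __init__(self, linear: nn.Linear, rank: int = 8):
        super().__init__()
        self.linear = linear
        for p in self.linear.parameters():
            p.requires_grad_(False)  # freeze the pretrained weight and bias
        self.A = nn.Parameter(torch.randn(rank, linear.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(linear.out_features, rank))  # zero init: no change at start

    def forward(self, x):
        # frozen pretrained path plus the trainable low-rank delta
        return self.linear(x) + x @ self.A.T @ self.B.T

pretrained = nn.Linear(768, 768)  # stands in for one layer of a pretrained model
adapted = LowRankAdapter(pretrained, rank=8)
trainable = sum(p.numel() for p in adapted.parameters() if p.requires_grad)
total = sum(p.numel() for p in adapted.parameters())
print(f"training {trainable:,} of {total:,} parameters")  # ~12k of ~603k
```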