Fig. 1: Overview of the ManGO framework for offline optimization.
From: Learning design-score manifold to guide diffusion models for offline optimization

a Illustration of offline optimization: optimal designs are identified for an unknown black-box function using only an offline dataset (no environment interaction), where designs are function inputs and scores are the corresponding outputs. b Training a diffusion model on score-augmented data to learn the joint design-score manifold. c Fidelity estimation via unconditional samples generated by the trained ManGO model: the fidelity metric determines whether to activate inference-time scaling during conditional generation. d Bidirectional conditional generation: preferred-score or preferred-design conditions are used to generate the corresponding designs or scores, illustrated via the self-supervised importance sampling (self-IS) method at denoising timestep t for sample i. e Conceptual illustration of ManGO: learning on the design-score manifold enhances out-of-distribution generation (OOG) capability, in contrast to design-space learning, which struggles with OOG under unseen conditions [13]. f Case study on superconductor temperature optimization [37]: ManGO achieves superior OOG performance compared with the design-space approach (i.e., DDOM) across varying ratios of top-data removal.