machine learning

DiDi-Instruct

Inference Speed Soars 60×: DiDi-Instruct Lets Diffusion LLMs Beat Thousand-Step GPT in Just 16 Steps

2025-10-27 13:21 (Beijing). DiDi-Instruct introduces a breakthrough approach to extreme acceleration, alignment, and reinforcement learning for large-scale language models. A collaboration between Purdue University, the University of Texas, the National University of Singapore, Morgan Stanley ML Research, and Xiaohongshu Hi-Lab has yielded a post-training method for Discrete Diffusion Large Language Models…

By Honghao Wang

Multimodal AI

Multimodal AI Models Learn Reflection and Review — SJTU & Shanghai AI Lab Tackle Complex Multimodal Reasoning

Multimodal AI Breakthrough: MM-HELIX Enables Long-Chain Reflective Reasoning. Multimodal large models are becoming increasingly impressive, yet many users feel frustrated by their overly direct approach. Whether generating code, interpreting charts, or answering complex questions, many multimodal large language models (MLLMs) jump straight to a final answer without reconsideration. Like an…

By Honghao Wang

AI scaling

Drink Some VC | YC Talks with Anthropic’s Head of Pretraining: Pretraining Teams Must Also Consider Inference, Balancing Pretraining and Post‑Training Still in Early Exploration

Y Combinator Conversation with Nick Joseph: Scaling Laws, Compute, and the Future of AI. 2025-10-16 11:01 (Beijing). "Scaling laws show that as you put in more compute, data, and parameters, model loss decreases in a predictable way; this is the core engine driving AI progress."
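To make "decreases in a predictable way" concrete: scaling laws of this kind are usually fit as a smooth function of parameter count N and training tokens D. The particular form below is an assumption on my part (the widely cited Chinchilla-style parameterization), not something stated in the excerpt:

L(N, D) \approx E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}, \qquad C \approx 6\,N\,D

Here E is the irreducible loss, A, B, \alpha, \beta are constants fitted to training runs, and C is the approximate training compute in FLOPs; spending more compute by growing N and D moves the predicted loss down this curve, which is the "core engine" the quote describes.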

By Honghao Wang

Open Source

Terminal-Based Intelligent Coding Assistant: Fully Open Source and Service-Independent | Open Source Daily No.759

OpenCode, a Terminal-Based AI Coding Agent. Repository: sst/opencode. Stars: 25.0k. License: MIT. OpenCode is a fully open-source terminal AI coding agent built for speed, flexibility, and collaboration. Key features: multiple backend support (Anthropic, OpenAI, Google, and local models) and a terminal user interface (TUI) focus, created by Neovim users and terminal…

By Honghao Wang

Instagram

Users So Annoyed They Turn Off Instagram Notifications? Meta: We've Reflected, and We're Using AI to Limit Ourselves

Instagram Introduces “Diversity Algorithms” in a New Machine Learning Framework to Boost Engagement. Meta has rolled out a new machine learning ranking framework for Instagram, aiming to reduce repetitive notifications through diversity algorithms. This upgrade seeks to combat notification fatigue, ensure content variety, and maintain high engagement levels. The system addresses…
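The teaser describes the “diversity algorithms” only at a high level. As a minimal sketch of the general idea, and assuming nothing about Meta's actual implementation, the Python snippet below re-ranks candidate notifications with a maximal-marginal-relevance (MMR) style penalty so near-duplicate notifications get pushed down; every class name, topic label, and score here is invented for illustration.

```python
# Illustrative sketch only (not Meta's system): greedy MMR re-ranking that trades a
# candidate notification's engagement score against its similarity to items already chosen,
# so repetitive notifications of the same type are demoted.

from dataclasses import dataclass


@dataclass
class Notification:
    text: str
    topic: str    # e.g. "reel_like", "friend_post", "suggested_follow" (made-up labels)
    score: float  # predicted engagement score from the upstream ranking model


def similarity(a: Notification, b: Notification) -> float:
    # Toy similarity: 1.0 if two notifications share a topic, else 0.0.
    return 1.0 if a.topic == b.topic else 0.0


def diversify(candidates: list[Notification], k: int, lam: float = 0.7) -> list[Notification]:
    """Pick k notifications greedily, penalizing near-duplicates (MMR-style)."""
    selected: list[Notification] = []
    pool = sorted(candidates, key=lambda n: n.score, reverse=True)
    while pool and len(selected) < k:
        def mmr(n: Notification) -> float:
            max_sim = max((similarity(n, s) for s in selected), default=0.0)
            return lam * n.score - (1 - lam) * max_sim
        best = max(pool, key=mmr)
        selected.append(best)
        pool.remove(best)
    return selected


if __name__ == "__main__":
    cands = [
        Notification("A liked your reel", "reel_like", 0.91),
        Notification("B liked your reel", "reel_like", 0.90),
        Notification("C posted for the first time in a while", "friend_post", 0.62),
        Notification("Suggested account to follow", "suggested_follow", 0.40),
    ]
    for n in diversify(cands, k=3):
        print(f"{n.score:.2f}  {n.topic:16s}  {n.text}")
```

The lam parameter sets the trade-off: lam = 1.0 reduces to pure engagement ranking, while smaller values penalize repetition more aggressively.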

By Honghao Wang