Dion Optimizer: Transforming Distributed AI Training Efficiency
Optimizers such as Adam and AdamW have been essential to training large-scale neural networks. However, as model sizes soar into the trillions of parameters, the need for more efficient training methods...
Tags: AI optimization, deep learning, distributed training, large language models, open source, orthonormal updates, PyTorch, scalability
AlphaEvolve: AI-Powered Mathematical Discovery at Scale
Google DeepMind's AlphaEvolve is raising the bar yet again in how artificial intelligence tackles mathematical discovery. Published as a preprint in November 2025 by researchers Bogdan Georgiev et al. at...
Tags: AI, AlphaProof, automated reasoning, combinatorics, DeepMind, evolutionary computation, FunSearch, large language models, machine learning, mathematical discovery, mathematics, optimization, Terence Tao
When AI Agents Misremember: How Fake Memories Put Smart Assistants at Risk
What if you entrust your AI assistant with your credit card to book a flight, only to wake up and discover it has spent your money on bizarre purchases? What would you do? Panic? This unsettling possibility...
Tags: AI assistants, AI security, autonomous agents, large language models, memory manipulation, prompt injection, Web3
Sakana's AB-MCTS Unlocks AI Collective Intelligence at Inference Time
Sakana AI introduces AB-MCTS (Adaptive Branching Monte Carlo Tree Search), a cutting-edge algorithm that enables multiple frontier AI models to collaborate during inference. Rather than relying solely...
Tags: AB-MCTS, AI collaboration, ARC-AGI-2, collective intelligence, inference-time scaling, large language models, Monte Carlo Tree Search, TreeQuest
MIT's CodeSteer Helps Language Models Outsmart Complex Problems
Large language models (LLMs) have dramatically changed our relationship with AI, offering impressive fluency in language understanding and generation. Yet, when these models confront tasks that demand...
Tags: AI coaching, algorithmic tasks, artificial intelligence, code generation, large language models, machine learning, MIT research, symbolic reasoning
AMD Ryzen AI Max+ Upgrade: Powering 128B-Parameter LLMs Locally on Windows PCs
With AMD's latest update, deploying massive language models of up to 128 billion parameters directly on your Windows laptop is now possible. AMD's Ryzen AI Max+ is a breakthrough that brings state-of-...
Tags: AMD, context window, large language models, LLM deployment, local AI, quantization, Ryzen AI, Windows AI
Feedback-Driven Methods Are Transforming Prompt Engineering
Prompt engineering is crucial for maximizing the capabilities of large language models (LLMs), but it has traditionally required significant manual effort and specialized know-how. As new tasks and mo...
Tags: AI research, automation, efficiency, feedback loops, large language models, machine learning, prompt optimization