AI Unfiltered

Chinese AI • Open Source • Security • Incidents. Signal, not noise.

[AINews] Why OpenAI Should Build Slack

A quiet day lets us answer a Sam Altman question.

[AINews] new Gemini 3 Deep Think, Anthropic $30B @ $380B, GPT-5.3-Codex Spark, MiniMax M2.5

There's too much going on!

Custom Kernels for All from Codex and Claude

Owning the AI Pareto Frontier — Jeff Dean

From rewriting Google’s search stack in the early 2000s to reviving sparse trillion-parameter models and co-designing TPUs with frontier ML research, Jeff Dean has quietly shaped nearly every layer of the modern AI stack.

Agentic Test-Time Scaling for WebAgents

Test-time scaling has become a standard way to improve performance and boost reliability of neural network models. However, its behavior on agentic, multi-step tasks remains less well understood: small per-step errors can compound over long horizons, and we find that naive policies that uniformly...
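The compounding effect the abstract mentions is easy to see with a back-of-the-envelope model (an illustration of ours, not the paper's method): if each step succeeds independently with probability p, the chance of completing a T-step trajectory is roughly p ** T, which decays fast over long horizons.

```python
def horizon_success(p_step: float, steps: int) -> float:
    """Probability an agent completes every step, assuming each
    step succeeds independently with probability p_step."""
    return p_step ** steps

# A 99%-reliable step still fails often over long horizons.
for p in (0.99, 0.95, 0.90):
    print(f"p_step={p}: "
          f"10 steps -> {horizon_success(p, 10):.2f}, "
          f"50 steps -> {horizon_success(p, 50):.2f}")
```

Even at p = 0.99, fifty steps leave barely a 60% chance of end-to-end success, which is why uniform per-step compute allocation can be wasteful.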

Self-Supervised Learning via Flow-Guided Neural Operator on Time-Series Data

Self-supervised learning (SSL) is a powerful paradigm for learning from unlabeled time-series data. However, popular methods such as masked autoencoders (MAEs) rely on reconstructing inputs from a fixed, predetermined masking ratio. Instead of this static design, we propose treating the corruption...

Think like a Scientist: Physics-guided LLM Agent for Equation Discovery

Explaining observed phenomena through symbolic, interpretable formulas is a fundamental goal of science. Recently, large language models (LLMs) have emerged as promising tools for symbolic equation discovery, owing to their broad domain knowledge and strong reasoning capabilities. However, most...

The Observer Effect in World Models: Invasive Adaptation Corrupts Latent Physics

Determining whether neural models internalize physical laws as world models, rather than exploiting statistical shortcuts, remains challenging, especially under out-of-distribution (OOD) shifts. Standard evaluations often test latent capability via downstream adaptation (e.g., fine-tuning or...

WaveFormer: Wavelet Embedding Transformer for Biomedical Signals

Biomedical signal classification presents unique challenges due to long sequences, complex temporal dynamics, and multi-scale frequency patterns that are poorly captured by standard transformer architectures. We propose WaveFormer, a transformer architecture that integrates wavelet decomposition at...
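The multi-scale frequency structure the abstract refers to is what wavelet decomposition exposes. As a minimal illustration (not WaveFormer's actual embedding), one level of the Haar wavelet transform splits a signal into a coarse approximation and a fine detail band:

```python
def haar_step(x):
    """One level of the Haar wavelet transform: pairwise averages
    (coarse approximation) and pairwise differences (fine detail).
    Assumes len(x) is even."""
    approx = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return approx, detail
```

Applying this recursively to the approximation band yields the multi-resolution pyramid that wavelet-based architectures feed alongside (or instead of) raw samples.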

[AINews] Z.ai GLM-5: New SOTA Open Weights LLM

We have Opus 4.5 at home

🔬 Science at the speed of inference — Gabriele Corso & Jeremy Wohlwend, Boltz

Inside Boltz, AlphaFold’s Legacy, and the Tools Powering Next-Gen Molecular Discovery

OpenEnv in Practice: Evaluating Tool-Using Agents in Real-World Environments

Opus 4.6, Codex 5.3, and the post-benchmark era

On comparing models in 2026.

Transformers.js v4 Preview: Now Available on NPM!

Introducing SyGra Studio

Why Nvidia builds open models with Bryan Catanzaro

Interconnects interview #17 on the past, present, and future of the Nemotron project.

Nemotron ColEmbed V2: Raising the Bar for Multimodal Retrieval with ViDoRe V3’s Top Model

Latest open artifacts (#18): Arcee's 400B MoE, LiquidAI's underrated 1B model, new Kimi, and anticipation of a busy month

Tons of useful "niche" models and anticipation of big releases coming soon.