Hacker News posts about MoE
- Slicing an 80B MoE LLM into 40B domain specialists (github.com)
- Tracing a Full MoE Training Step Through the XLA Compiler (patricktoulme.substack.com)
- Unlocking LoRA MoE RL for Qwen3.5 (osmosis.ai)
- Mixture of Experts (MoE), Visually Explained (www.youtube.com)
- Better MoE model inference with warp decode (cursor.com)
- Nemotron 3 Super: An Open Hybrid Mamba-Transformer MoE for Agentic Reasoning (developer.nvidia.com)
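
Several of these posts assume familiarity with the basic MoE mechanism: a router scores each token against every expert, and only the top-k highest-scoring experts' feed-forward networks actually run on that token. Below is a minimal, self-contained NumPy sketch of that idea; the shapes, expert count, ReLU experts, and top_k=2 are illustrative assumptions, not details taken from any of the linked posts.

```python
import numpy as np

def moe_forward(x, gate_w, experts, top_k=2):
    """Minimal top-k MoE layer: route each token to its top_k experts.

    x       : (tokens, d_model) input activations
    gate_w  : (d_model, n_experts) router weights
    experts : list of (w1, w2) FFN weight pairs, one per expert
    """
    logits = x @ gate_w                             # (tokens, n_experts) router scores
    top = np.argsort(logits, axis=-1)[:, -top_k:]   # indices of top_k experts per token
    # Softmax over only the selected experts' logits (renormalized gating weights).
    sel = np.take_along_axis(logits, top, axis=-1)
    weights = np.exp(sel - sel.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)

    out = np.zeros_like(x)
    for k in range(top_k):
        for e in range(len(experts)):
            mask = top[:, k] == e                   # tokens routed to expert e in slot k
            if not mask.any():
                continue
            w1, w2 = experts[e]
            h = np.maximum(x[mask] @ w1, 0.0)       # expert FFN with ReLU
            out[mask] += weights[mask, k:k+1] * (h @ w2)
    return out

# Toy usage (hypothetical sizes): 8 tokens, d_model=16, 4 experts, top-2 routing.
rng = np.random.default_rng(0)
d, n_exp, d_ff = 16, 4, 32
x = rng.normal(size=(8, d))
gate_w = rng.normal(size=(d, n_exp))
experts = [(rng.normal(size=(d, d_ff)), rng.normal(size=(d_ff, d)))
           for _ in range(n_exp)]
print(moe_forward(x, gate_w, experts).shape)        # (8, 16)
```

The point of the structure is that parameter count scales with the number of experts while per-token compute scales only with top_k, which is the property the slicing, training, and inference posts above are all working around.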