Hacker News posts about MoE (Mixture of Experts)
- Show HN: MoebiusXBIN – ASCII and text-mode art editor with custom font support (blog.glyphdrawing.club)
- From DeepSeek-MoE to R1: how expert routing and RL made the leap (www.chrishayduk.com)
- How to build a router for MoE models (www.cerebras.ai)
- Intern-S1: A 241B parameter open-source MoE multimodal model (huggingface.co)
- Show HN: WanVideo – Wan 2.2 AI Video Generation (www.wanvideo.tv)
- Pruned expert GPT-OSS 6.6B (huggingface.co)
- Pruning GPT-OSS 20B to 4.8B (232 models) (github.com)
- Building Query Compilers [pdf] (pi3.informatik.uni-mannheim.de)
- Airi: Digital Companion (github.com)
- Out-of-bound indexing behaviors in Python ecosystem (gist.github.com)
- It seems like the AI crawlers learned how to solve the Anubis challenges (social.anoxinon.de)
- PHP compile time generics: yay or nay? (thephp.foundation)
- Open models by OpenAI (openai.com)
- Genie 3: A new frontier for world models (deepmind.google)
- Study mode (openai.com)
- Modern Node.js Patterns (kashw1n.com)
- Gemma 3 270M: Compact model for hyper-efficient AI (developers.googleblog.com)
- It's time for modern CSS to kill the SPA (www.jonoalderson.com)
- Monitor your security cameras with locally processed AI (frigate.video)
- OpenMower – An open source lawn mower (github.com)
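Several of the posts above concern MoE routing and expert pruning. For context, the core mechanism they refer to can be sketched as top-k gating: a small gate network scores each expert per token, and only the k highest-scoring experts run, with their outputs mixed by renormalized softmax weights. This is a minimal illustrative sketch, not code from any of the linked posts; the function name, shapes, and toy experts are assumptions.

```python
import numpy as np

def topk_moe_forward(x, gate_w, experts, k=2):
    """Route one token vector x to its top-k experts.

    x: (d,) token representation
    gate_w: (d, n_experts) gating weights (hypothetical toy gate)
    experts: list of n_experts callables, each mapping (d,) -> (d,)
    """
    logits = x @ gate_w                          # one gate score per expert
    topk = np.argsort(logits)[-k:]               # indices of the k largest scores
    w = np.exp(logits[topk] - logits[topk].max())
    w /= w.sum()                                 # softmax renormalized over the top-k only
    # Mix only the selected experts' outputs; the rest are never evaluated.
    return sum(wi * experts[i](x) for wi, i in zip(w, topk))

# Toy usage: two identity experts, so the mixture returns x unchanged.
x = np.array([1.0, 2.0, 3.0])
gate_w = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
experts = [lambda v: v, lambda v: v]
out = topk_moe_forward(x, gate_w, experts, k=2)
```

Because only k experts execute per token, total parameter count (e.g. 241B for Intern-S1) can far exceed the compute actually spent per token, which is also why pruning rarely-routed experts can shrink such models sharply.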