Hacker News posts about MoE
- Trinity Large: An open 400B sparse MoE model (www.arcee.ai)
- 8th Gen i3 Hits 10 TPS on DeepSeek-Coder-V2-Lite 16B MoE (old.reddit.com)
- Moebius: Modern ANSI and ASCII Art Editor (github.com)
- Fine-Tune MoE Models 12x Faster with Unsloth (unsloth.ai)
- Field Notes on Scaling MoE Expert Parallelism with DeepEP (nousresearch.com)
- Show HN: Trained an LLM to predict "What will Trump do?" (huggingface.co)
- Where to Sleep in LAX (cadence.moe)
- If I hear "design pattern" one more time, I'll go mad (purplesyringa.moe)
- Disappointing Phones (cadence.moe)
- Why is YouTube's embed cache so large? (nostr.moe)
- I ran an LLM on iOS to build another privacy-focused notes app (drive.google.com)
- Let's Encrypt DNS-Persist-01; Domain Control Validation (scotthelme.co.uk)
- The Origin of Laravel – a look at v1 Beta 1 (laravelnepal.com)
- How to Favicon in 2026: Three files that fit most needs (evilmartians.com)
- Web-Git-sum – Git is not GitHub (mitxela.com)
- An AI agent published a hit piece on me (theshamblog.com)
- Discord will require a face scan or ID for full access next month (www.theverge.com)
- Moltbook (www.moltbook.com)
- ICE using Palantir tool that feeds on Medicaid data (www.eff.org)
- The Waymo World Model (waymo.com)
- 15 years later, Microsoft merged my diagram (nvie.com)