Hacker News posts about MoE
- Sparsely-Gated Mixture of Experts (MoE) (eli.thegreenplace.net)
- NanoMoE: Mixture-of-Experts (MoE) LLMs from Scratch in PyTorch (cameronrwolfe.substack.com)
- Weird Circle on monitor fact from ChatGPT (files.catbox.moe)
- Experimental release of GrapheneOS for Pixel 9a (grapheneos.social)
- Federal Government's letter to Harvard demanding changes [pdf] (www.harvard.edu)
- PHP Core Security Audit Results (thephp.foundation)
- Nintendo unveils Switch 2 ahead of June 5 launch (arstechnica.com)
- No-JavaScript Fingerprinting (noscriptfingerprint.com)
- Leaked data reveals Israeli govt campaign to remove pro-Palestine posts on Meta (www.dropsitenews.com)
- Tracing the thoughts of a large language model (www.anthropic.com)
- Google is illegally monopolizing online advertising tech, judge rules (www.nytimes.com)
- $70M in 60 Seconds: How Insider Info Helped Someone 28x Their Money (data-and-politics.ghost.io)
- Trump temporarily drops tariffs to 10% for most countries (www.cnbc.com)
- A Reddit bot drove me insane (posthuman.blog)
- Recent AI model progress feels mostly like bullshit (www.lesswrong.com)
- Show HN: I built a word game. My mom thinks it's great. What do you think? (www.whatsit.today)
- Gemma 3 QAT Models: Bringing AI to Consumer GPUs (developers.googleblog.com)
- Mozilla launching “Thundermail” email service to take on Gmail, Microsoft 365 (www.techradar.com)
- Meta antitrust trial kicks off in federal court (www.axios.com)
- AI agents: Less capability, more reliability, please (www.sergey.fyi)
- Reasoning models don't always say what they think (www.anthropic.com)