Hacker News posts about the M2 Ultra
The M2 Ultra is a system-on-chip (SoC) designed by Apple that delivers strong performance and efficiency for artificial intelligence (AI) workloads, such as running large language models (LLMs) and high-speed video processing.
- Running a 180B parameter LLM on a single Apple M2 Ultra (twitter.com)
- M3 Max Chip Around as Fast as M2 Ultra in Early Benchmark Results (www.macrumors.com)
- Apple to Power AI Features with M2 Ultra Servers (www.macrumors.com)
- Apple claims M2 Ultra "can train ML workloads, like LLMs" (old.reddit.com)
- Full F16 precision 34B Code Llama at >20T/s on M2 Ultra (twitter.com)
- Why Intel and AMD don't make chips like the M2 Max and M2 Ultra (www.xda-developers.com)
- Apple Reportedly Building M2 Ultra and M4-Powered AI Servers (www.macrumors.com)
- M4 Max Chip Up to 25% Faster Than M2 Ultra in First Benchmark Results (www.macrumors.com)
- M3 Ultra Is Slower Than M2 Ultra for Language Models (old.reddit.com)
- M2 Ultra runs codellama 34B F16 at 150 token per second (twitter.com)
- Apple developed a chip for the Apple car that is similar to four M2 Ultra (www.bloomberg.com)
- Llama on Mac M2 Ultra (Literally) (github.com)
- The Mac Pro and Studio won't get the M4 nod until mid-2025 (www.theverge.com)
- M2 Macs cannot power more than 3 USB/Thunderbolt devices (old.reddit.com)