Hackernews posts about GPT
GPT is a family of artificial intelligence language models developed by OpenAI that generate human-like text based on input prompts.
- DeepSeek v2.5 – open-source LLM comparable to GPT-4, but 95% less expensive (www.deepseek.com)
- gptel: a simple LLM client for Emacs (github.com)
- OpenAI's new "Orion" model reportedly shows small gains over GPT-4 (the-decoder.com)
- GPTs Are Maxed Out (www.thealgorithmicbridge.com)
- Qwen2.5-Coder Beats GPT-4o and Claude 3.5 Sonnet in Coding (qwenlm.github.io)
- Chinese company trained GPT-4 rival with just 2k GPUs and $3M (www.tomshardware.com)
- Ultravox: An open-weight alternative to GPT-4o Realtime (www.ultravox.ai)
- New Gemini model beats GPT-4o and Claude 3.5 Sonnet (lmarena.ai)
- Company trained GPT-4 rival with 2k GPUs – spent $3M compared to OpenAI's $80M (www.tomshardware.com)
- Training Baby GPTs in Browser (trekhleb.dev)
- GPT-4o Got an Update (twitter.com)
- Why didn't we get GPT-2 in 2005? (dynomight.net)
- AI progress has plateaued at GPT-4 level (www.theintrinsicperspective.com)
- GPT Based Football Prediction Tool (betrobot.ai)
- gptel: Mindblowing integration between Emacs and ChatGPT (www.blogbyben.com)
- I made this simple day planner app with GPTEngineer (daysimpl.com)
- GPT-4o cannot do math in base64 (nihaljn.github.io)
- GPT-4o web agent demo (theaidigest.org)
- Predicted outputs: GPT-4o inference speed-up for editing tasks (platform.openai.com)
- The Moats are in the GPT-wrappers (interjectedfuture.com)