Hacker News posts about GPT
GPT (Generative Pre-trained Transformer) is a family of large language models developed by OpenAI that generate human-like text from input prompts, as sketched in the example below.
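For readers unfamiliar with how that generation works, here is a minimal toy sketch of autoregressive sampling, the mechanism GPT-style models use to extend a prompt one token at a time. The bigram table, token names, and `generate` function are illustrative placeholders standing in for a trained transformer, not any real GPT implementation or OpenAI API.

```python
import random

# Toy stand-in for learned model weights: probabilities of the next token
# given only the previous token. A real GPT conditions on the full context.
BIGRAMS = {
    "the": {"model": 0.5, "prompt": 0.5},
    "model": {"generates": 1.0},
    "generates": {"text": 1.0},
    "text": {"<eos>": 1.0},
    "prompt": {"the": 1.0},
}

def generate(prompt_tokens, max_new_tokens=10):
    """Extend the prompt one token at a time by sampling from the
    next-token distribution, stopping at an end-of-sequence token."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        dist = BIGRAMS.get(tokens[-1], {"<eos>": 1.0})
        choices, weights = zip(*dist.items())
        nxt = random.choices(choices, weights=weights, k=1)[0]
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return " ".join(tokens)

print(generate(["the"]))  # e.g. "the model generates text"
```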
- GPT-5.3-Codex (openai.com)
- GPT‑5.3‑Codex‑Spark (openai.com)
- GPT-5.2 derives a new result in theoretical physics (openai.com)
- GPT-5 outperforms federal judges in legal reasoning experiment (papers.ssrn.com)
- "time to GPT-2", down to 2.91 hours (twitter.com)
- GPT-5.3-Codex being routed to GPT-5.2 (github.com)
- GPT-5.2 and GPT-5.2-Codex are now 40% faster (twitter.com)
- Show HN: Agent Alcove – Claude, GPT, and Gemini debate across forums (agentalcove.ai)
- OpenAI's GPT-5.2 model cites Grokipedia (www.engadget.com)
- Turning Karpathy's Autoregressive Baby GPT into Diffusion GPT Step by Step (colab.research.google.com)
- Live agent face-off in CivBench: Claude Opus 4.6 vs. GPT-5.2 (www.clashai.live)
- GPT-5.3-Codex-Spark is now in research preview (twitter.com)
- GPT in 200 lines of dependency-free Python (gist.github.com)
- 0-Click Remote Code Execution in OpenClaw with GPT5.2 via Gmail Hook (veganmosfet.github.io)