Hacker News posts about GPT-3
GPT-3 is a large language model from OpenAI that generates human-like text from input prompts, enabling applications such as language translation, content creation, and other natural language processing tasks.
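For context, using a GPT-family model typically amounts to a single API call: send a text prompt, receive generated text back. A minimal sketch in Python, assuming the `openai` client library and an `OPENAI_API_KEY` set in the environment (the model name and prompt are illustrative):

```python
# Minimal sketch, not an official example: prompt a GPT-family model and print the reply.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative choice; any GPT-family chat model works
    messages=[{"role": "user", "content": "Translate to French: Hello, world."}],
)

print(response.choices[0].message.content)
```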
- Show HN: Spress – A vibe coded programming language (github.com)
- Simon Willison's first blog on LLMs (2022) (simonwillison.net)
- Sycophancy in GPT-4o (openai.com)
- Show HN: GPT-2 implemented using graphics shaders (github.com)
- The end of an AI that shocked the world: OpenAI retires GPT-4 (arstechnica.com)
- Could GPT help with dating anxiety? (scottaaronson.blog)
- Simple GPT in pure Go, trained on Jules Verne books (github.com)
- A comparison of ChatGPT/GPT-4o's previous and current system prompts (simonwillison.net)
- GPT Destroyed College Camaraderie (medium.com)
- Show HN: I built an AI at 16 that writes full ebooks in minutes (GPT-4) (www.quicktome-ai.xyz)
- Why Aren't We Talking More About GPT-4.1? (blog.kilocode.ai)
- GPT 4.1 Prompting Guide (cookbook.openai.com)
- Show HN: I Built Remind Me AI. It's Like Unlimited GPT Tasks. Try the Demo (app.arcade.software)
- The conversational persuasiveness of GPT-4 (www.nature.com)