Hacker News posts about GPT-3
GPT-3 is a powerful language model developed by OpenAI that uses machine learning to generate human-like text based on the input it receives.
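To make "generate text based on the input it receives" concrete, here is a minimal sketch of the kind of request the GPT-3-family models accept over OpenAI's HTTP chat completions API. Sending it requires a real API key and a network call, so this sketch only builds the request body; the endpoint URL, field names, and the `gpt-3.5-turbo` model name follow OpenAI's public API, while the prompt and parameter values are illustrative choices.

```python
import json

# Endpoint for OpenAI's chat completions API (not called here).
API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    """Return the JSON body for a single-turn text-generation request."""
    payload = {
        "model": model,
        # The input the model conditions its generated text on:
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,    # cap the length of the generated reply
        "temperature": 0.7,  # higher values produce more varied text
    }
    return json.dumps(payload)

body = build_request("Summarize the plot of Hamlet in one sentence.")
print(body)
```

In a real client this JSON string would be POSTed to `API_URL` with an `Authorization: Bearer <key>` header, and the generated text would come back in the response's `choices` field.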
- Scaling Laws for LLMs: From GPT-3 to o3 (cameronrwolfe.substack.com)
- Fuzzy API composition: querying NBA stats with GPT-3 and Statmuse and LangChain (www.geoffreylitt.com)
- What happened in this GPT-3 conversation? (chat.openai.com)
- GPT-3.5 Turbo fine-tuning and API updates (openai.com)
- Smarter summaries with finetuning GPT-3.5 and chain of density (jxnl.github.io)
- Experimental tree-based writing interface for GPT-3 (github.com)
- GigaGPT: GPT-3 sized models in 565 lines of code (www.cerebras.net)
- GPT4 is up to 6 times more expensive than GPT3.5 (openai.com)
- Pi.ai LLM Outperforms Palm/GPT3.5 (inflection.ai)
- 1960s chatbot ELIZA beat OpenAI's GPT-3.5 in a recent Turing test study (arstechnica.com)
- Show HN: Jailbreaking GPT3.5 Using GPT4 (github.com)
- Why is GPT-3 15.77x more expensive for certain languages? (denyslinkov.medium.com)
- Can GPT-4 and GPT-3.5 play Wordle? (twitter.com)
- Show HN: Get advice from a GPT3-based stoic philosopher (seneca.dylancastillo.co)
- OpenAI Releases Function Calling for GPT-3.5 & GPT-4 (platform.openai.com)
- GPT-3 Creative Fiction (2020) (gwern.net)
- On the left is GPT-3.5. On the right is GPT-4 (twitter.com)
- Shortwave: A GPT-3-powered front end for Gmail (www.shortwave.com)
- We gave GPT-3.5 tools to run, write, commit, and deploy code (old.reddit.com)
- Microsoft says GPT 3.5 has 20B parameters? (arxiv.org)