Hacker News posts about GPT-3
GPT-3 is a large language model developed by OpenAI that generates human-like text from the prompts it receives.
- Show HN: XtyleAI – Get Your Perfect Hairstyle by AI (www.xtyle.ai)
- What happened in this GPT-3 conversation? (chat.openai.com)
- GPT-3.5 Turbo fine-tuning and API updates (openai.com)
- Placing #1 in Advent of Code with GPT-3 (github.com)
- Show HN: GPT-3 Powered Shell (musings.yasyf.com)
- Use GPT-3 incorrectly: reduce costs 40x and increase speed by 5x (www.buildt.ai)
- Show HN: Turning books into chatbots with GPT-3 (www.konjer.xyz)
- New GPT-3 model: text-davinci-003 (beta.openai.com)
- Smarter summaries with finetuning GPT-3.5 and chain of density (jxnl.github.io)
- Experimental tree-based writing interface for GPT-3 (github.com)
- GigaGPT: GPT-3 sized models in 565 lines of code (www.cerebras.net)
- Microsoft Teams Premium: powered by OpenAI’s GPT-3.5 (www.microsoft.com)
- GPT-3 is the best journal I’ve used (every.to)
- Anki and GPT-3 (andrewjudson.com)
- GPT-4 is up to 6 times more expensive than GPT-3.5 (openai.com)
- Using GPT-3 to Interpret Dreams (www.nightcap.guru)
- How to implement Q&A against your docs with GPT-3 embeddings and Datasette (simonwillison.net)
- Call GPT-3 from Terminal (github.com)
- Pi.ai LLM Outperforms PaLM/GPT-3.5 (inflection.ai)