Hacker News posts about RWKV
- RWKV: Reinventing RNNs for the Transformer Era (arxiv.org)
- RWKV RNN: Better than ChatGPT? (github.com)
- The RWKV language model: An RNN with the advantages of a transformer (johanwind.github.io)
- How the RWKV language model works (johanwind.github.io)
- Hugging Face: integration of RWKV models in transformers (twitter.com)
- RWKV.F90: Large Language Model in Fortran (github.com)
- Windows 11 now ships with rwkv.cpp (twitter.com)
- RWKV: Reinventing RNNs for the Transformer Era (Paper Explained) (www.youtube.com)
- RWKV – An RNN with the Advantages of a Transformer (huggingface.co)
- RWKV Language Model (www.rwkv.com)
- RWKV: The Unreasonably Effective RNN Strikes Back (www.latent.space)
- RWKV.cpp ships with the latest Windows 11 system (twitter.com)
- RWKV is an RNN with GPT-level LLM performance (www.rwkv.com)
- RWKV Language Model: RNN with GPT-Level LLM Performance (www.rwkv.com)
- Show HN: I've just ported the RWKV LLM to Fortran (github.com)
- RWKV for Node.js (Using Zig and GGML) (github.com)
- RWKV (14B) vs. Transformer LLMs (Llama 2 13B) (huggingface.co)