Hackernews posts about Transformer
The Transformer is a neural network architecture, originally designed for natural language processing, that uses self-attention to process all positions of an input sequence in parallel rather than step by step.
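The self-attention step mentioned above can be sketched in a few lines of NumPy: each position builds query, key, and value vectors, scores itself against every other position, and returns a weighted mix of the values. This is a minimal illustration with made-up toy dimensions, not any particular library's implementation.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence X of shape (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise attention scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V                               # every position attends to all positions

# Toy example: a 3-token sequence with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
Wq, Wk, Wv = [rng.standard_normal((4, 4)) for _ in range(3)]
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # one output vector per input position
```

Because the score matrix is computed for all positions at once, the whole sequence is processed in a single batched matrix multiplication, which is what makes Transformers parallel-friendly compared to recurrent networks.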
Related:
- Electrical transformer manufacturing is throttling the electrified future (www.bloomberg.com)
- Transformers Are Bayesian Networks (arxiv.org)
- CSV to Transformer, via Bayesian Networks (github.com)
- Gemma 4 running locally in the browser with transformers.js (huggingface.co)
- Show HN: Javadecompiler.org – a unified Java decompiler and transformer API (javadecompiler.org)
- Nemotron 3 Super: An Open Hybrid Mamba-Transformer MoE for Agentic Reasoning (developer.nvidia.com)
- Transformer.js v4 (huggingface.co)
- Playing Doom on a SUBLEQ Transformer (xcancel.com)
- The Cursive Transformer (greydanus.github.io)
- Vision Transformers (www.vizuaranewsletter.com)
- The Transformer Architecture, Visualized (www.vizuaranewsletter.com)
- Transformers as Constrained Optimization (jiha-kim.github.io)
- Mamba 3 matches Transformer performance at reduced latency (venturebeat.com)
- Transformers from Scratch (www.brandonrohrer.org)