Hacker News posts about ComfyUI
- Show HN: AI Image Super-Resolution (InvSR) (github.com)
- Show HN: Comflowy – A ComfyUI Tutorial for Beginners (www.comflowy.com)
- ComfyUI V1: a seamless desktop experience for ComfyUI (blog.comfy.org)
- Show HN: Self-hosted gateway to access LLMs, Ollama, ComfyUI and FFmpeg servers (openai.servicestack.net)
- Show HN: Self Hosted AI Server Gateway for Ollama, ComfyUI and FFmpeg Servers (openai.servicestack.net)
- Hosting a ComfyUI Workflow via API (9elements.com); a minimal API sketch follows this list
- ComfyUI Custom Node for OpenAI (github.com)
- ComfyUI supports masking and scheduling of LoRA and model weights (blog.comfy.org)
- Show HN: Open-source tool to convert ComfyUI workflows into web apps and APIs (playground.viewcomfy.com)
- ComfyUI Now Supports Stable Diffusion 3.5 (blog.comfy.org)
- Running Mochi (SOTA video generation) in ComfyUI on one 4090 GPU (blog.comfy.org)
- Show HN: Self-Hosted Gateway for Ollama, LLM APIs, ComfyUI and FFmpeg servers (openai.servicestack.net)
- Show HN: RunComfy – ComfyUI cloud and guaranteed working AI video workflows (www.runcomfy.com)
- ComfyUI Segment Anything (github.com)
- ComfyUI-Fluxtapoz (github.com)
- Self Hosted AI Server Gateway for Ollama, ComfyUI and FFmpeg Servers (openai.servicestack.net)
- Trending Workflows for ComfyUI (github.com)
- ComfyUI statement on the Ultralytics crypto miner situation (blog.comfy.org)
- Show HN: Self-Hosted AI Server for LLM APIs, Ollama, ComfyUI and FFmpeg Servers (openai.servicestack.net)
- Show HN: Deploy ComfyUI workflows as a web app with no code (playground.viewcomfy.com)
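The "Hosting a ComfyUI Workflow via API" entry above covers driving ComfyUI programmatically rather than through the browser UI. The snippet below is a minimal sketch, not the article's own code: it assumes a local ComfyUI instance on its default port 8188 and a workflow exported from the editor with "Save (API Format)"; the file name workflow_api.json is a placeholder.

```python
import json
import urllib.request

# Assumed local ComfyUI server on its default port.
COMFY_URL = "http://127.0.0.1:8188"


def queue_workflow(workflow_path: str) -> str:
    """POST a workflow JSON (API format) to ComfyUI's /prompt endpoint and return its prompt_id."""
    with open(workflow_path) as f:
        workflow = json.load(f)

    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    req = urllib.request.Request(
        f"{COMFY_URL}/prompt",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["prompt_id"]


def fetch_history(prompt_id: str) -> dict:
    """Fetch /history/<prompt_id>; once the job has finished this includes the output metadata."""
    with urllib.request.urlopen(f"{COMFY_URL}/history/{prompt_id}") as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # workflow_api.json is a hypothetical file exported via "Save (API Format)".
    pid = queue_workflow("workflow_api.json")
    print("queued prompt:", pid)
```

After queuing, a client typically polls /history/{prompt_id} (or listens on ComfyUI's WebSocket endpoint) until the outputs appear; the gateway and "workflow as web app" projects in the list above build on this same HTTP interface.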