Hackernews posts about MosaicML
MosaicML is a generative artificial intelligence startup that has developed MPT-7B, a commercially-usable large language model (LLM) comparable in quality to the popular LLaMA model.
- MosaicML MPT-7B: A Commercially-Usable LLaMa-Quality Model (www.mosaicml.com)
- Training LLMs with AMD MI250 GPUs and MosaicML (www.mosaicml.com)
- Databricks Signs Definitive Agreement to Acquire MosaicML (www.databricks.com)
- Training Stable Diffusion from Scratch for <$50k with MosaicML (www.mosaicml.com)
- Cloudflare R2 and MosaicML enable training LLMs anywhere (blog.cloudflare.com)
- Databricks acquires OpenAI competitor MosaicML for $1.3B (www.mosaicml.com)
- Databricks picks up MosaicML, an OpenAI competitor, for $1.3B (techcrunch.com)
- MosaicML Agrees to Join Databricks to Power Generative AI for All (www.mosaicml.com)
- MosaicML: MPT-30B-Chat (huggingface.co)
- Llama2-70B with MosaicML Inference (www.mosaicml.com)
- Databricks Agrees to Acquire MosaicML, a Leading Generative AI Platform (www.databricks.com)
- Databricks buys AI darling MosaicML for $1.3B (blocksandfiles.com)
- MosaicML Joins Databricks (twitter.com)
- MosaicML launches new service in bid to challenge OpenAI on price (www.reuters.com)
- MPT-30B: Raising the bar for open-source foundation models (www.mosaicml.com)
- Training LLMs with AMD MI250 GPUs (www.mosaicml.com)
- Llama 1.3B Trained on 200B Tokens for Commercial Use (huggingface.co)
- Training Stable Diffusion from Scratch for <$50k (www.mosaicml.com)
- MosaicBERT: Pretraining BERT from Scratch for $20 (www.mosaicml.com)
- MPT-7B-StoryWriter-65k+: LLM for super long contexts (Apache 2.0) (huggingface.co)
- MPT-30B – Apache 2.0 licensed LLM (huggingface.co)
- MPT-7B-8K: 8K Context Length for Document Understanding (www.mosaicml.com)
- Training Stable Diffusion from Scratch Costs <$160k (www.mosaicml.com)
- Benchmarking Large Language Models on Nvidia H100 GPUs (www.mosaicml.com)
- Revolutionize ML Training – MosaicML (www.mosaicml.com)