🤖 AI MODELS
Mistral 3 Model Family Release
6x SOURCES
2025-12-02
⚡ Score: 9.0
+++ Mistral shipped a full model family from 3B to 675B parameters under Apache 2.0, a sign that competitive open-weight models now cover hardware tiers from browsers to data centers. +++
Mistral 3 family of models released
🔺 572 pts
⚡ Score: 8.4
💬 HackerNews Buzz: 177 comments
BUZZING
🎯 EU tech scene support • Language model performance • Multilingual model capabilities
💬 "Basically if you find yourself in this situation you're actually better off deleting the account and re-signing up again under a different email."
• "If the claims on multilingual and pretraining performance are accurate, this is huge!"
Mistral just released Mistral 3, a full open-weight model family from 3B all the way up to 675B parameters.
⬆️ 742 ups
⚡ Score: 7.9
"All models are Apache 2.0 and fully usable for research + commercial work.
Quick breakdown:
• Ministral 3 (3B / 8B / 14B): compact, multimodal, and available in base, instruct, and reasoning variants. Surprisingly strong for their size.
• Mistral Large 3 (675B MoE): their new flagship. Strong m..."
💬 Reddit Discussion: 76 comments
LOWKEY SLAPS
🎯 Model size range • Model performance • Model accessibility
💬 "Leaving nothing between 14B and 675B is a really funny gap, just a giant chasm LOL."
• "A dense 80B–150B or a smaller-expert MoE in the 200B range would've hit the perfect balance between quality and feasibility."
mistralai/Mistral-Large-3-675B-Instruct-2512 · Hugging Face
⬆️ 155 ups
⚡ Score: 7.8
"Mistral just released their biggest model!!!
From our family of large models, **Mistral Large 3** is a state-of-the-art general-purpose **Multimodal granular Mixture-of-Experts** model with **41B active parameters** and **675B total parameters** trained from the ground up with 3000 H200s.
This m..."
💬 Reddit Discussion: 49 comments
LOWKEY SLAPS
🎯 Cutting-edge AI models • Hardware performance • Benchmark comparisons
💬 "solid release: vision, nice context window, agentic, great license"
• "can run 4-bit DeepSeek at 350 t/s pp and 11 t/s tg with 60,000 token context size"
Ministral WebGPU: Run Mistral's new multimodal models 100% locally in your browser.
⬆️ 182 ups
⚡ Score: 7.5
"Today, Mistral released **Mistral 3**, a family of multimodal models, including three state-of-the-art dense models (3B, 8B, and 14B) and Mistral Large 3 (675B, 41B active). All Apache 2.0! 🤗 Surprisingly, the 3B is small enough to run 100% locally in your browser with WebGPU acceleration, powered b..."
💬 Reddit Discussion: 10 comments
BUZZING
🎯 Aging and Mortality • Technological Advancements • Skepticism and Goalpost Moving
💬 "From the age of 25, one dies until one is dead."
• "According to him, 'reality is too complex and would need a completely different form of architecture"
New Mistral Large 3 just dropped on AWS Bedrock! Hope it will be open source...
⬆️ 64 ups
⚡ Score: 6.3
"External link discussion - see full content at original source."
💬 Reddit Discussion: 18 comments
GOATED ENERGY
🎯 Large language model • Model performance • Multimodal models
💬 "673 billion parameters."
• "It's great that it has a vision encoder tho"