Mistral has introduced Medium 3.5, its new flagship dense AI model, which consolidates chat, reasoning, coding, and agentic functions in a single system. Departing from its previous approach of shipping separate specialized models, Mistral now offers unified reasoning capabilities, replacing Medium 3.1, Magistral, and Devstral 2 in core products such as Le Chat and the Vibe CLI.

Medium 3.5 uses a dense architecture with 128 billion parameters and a 256,000-token context window, letting it handle extended documents, large codebases, and long multi-step workflows. While the industry is increasingly moving to Mixture of Experts models for inference efficiency, Mistral's dense design keeps deployment straightforward but demands more hardware: four GPUs for self-hosting, an option mostly suited to teams and data centers.

The model also adds a reasoning_effort parameter, letting users trade speed for rigor on a per-query basis.
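The four-GPU requirement follows from simple arithmetic on the parameter count. A rough sketch of the weights-only memory footprint (ignoring KV cache and activation overhead, and assuming 80 GB accelerators):

```python
# Back-of-envelope VRAM estimate for serving a 128B-parameter dense model.
# Figures are rough: weights only, ignoring KV cache and activations.

def weight_memory_gb(params_billion: float, bytes_per_param: int) -> float:
    """Memory needed just to hold the weights, in GB."""
    return params_billion * 1e9 * bytes_per_param / 1e9

fp16_gb = weight_memory_gb(128, 2)  # 16-bit weights -> 256 GB
int8_gb = weight_memory_gb(128, 1)  # 8-bit quantized -> 128 GB

# Four 80 GB accelerators provide 320 GB, enough for FP16 weights
# with headroom left for the KV cache of a long context window.
print(f"FP16 weights: {fp16_gb:.0f} GB, INT8 weights: {int8_gb:.0f} GB")
```

At FP16, the weights alone exceed what any single current GPU holds, which is why a multi-GPU setup is the practical floor for self-hosting.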
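A per-query effort toggle would presumably be passed alongside the usual chat-completion fields. The sketch below is an assumption based on common chat-completion request shapes; the model identifier, field names, and accepted values are illustrative, not confirmed from Mistral's documentation:

```python
import json

# Hypothetical request body for a query with the reasoning_effort
# parameter dialed up. All names below are assumptions for illustration.
payload = {
    "model": "mistral-medium-3.5",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Prove that sqrt(2) is irrational."}
    ],
    # Higher effort trades latency for more rigorous multi-step output;
    # a lower setting would favor faster replies.
    "reasoning_effort": "high",
}

body = json.dumps(payload)
```

The same request with `"reasoning_effort"` lowered would favor speed, which is the per-query trade-off the parameter is meant to expose.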
Google is shrinking free storage from 15GB to 5GB unless you add a verified phone number
Google has reportedly introduced a new free-storage policy for newly created accounts: some users report that new accounts now start with only 5GB instead of the 15GB older accounts received.