Ministral 3B
Ministral 3B is Mistral AI's ultra-compact 3-billion parameter model, delivering state-of-the-art performance in knowledge retrieval, commonsense reasoning, function calling, and multilingual tasks on edge devices like smartphones. With a 128K context window and native multimodal capabilities, it powers efficient on-device AI for agentic workflows, automation, and low-latency applications—all under an open Apache 2.0 license.
Available for Chat, Vision, and File Uploads.
Performance Benchmarks
How do you want to interact?
Start a Conversation
Ask anything.
Have a natural conversation, brainstorm ideas, draft emails, or ask for advice.
Use a Persona
Specialized Experts.
Instruct the AI to act as a Coding Tutor, Marketing Expert, or Travel Guide.
Why use Ministral 3B?
Multimodal Processing
Processes text and images for tasks like captioning, visual question answering (VQA), and OCR
Function Calling
Supports native function calling and structured JSON for agentic workflows
Long Context Handling
Supports up to a 128K token context for extended reasoning and conversations
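As a concrete illustration of the function-calling feature above, here is a minimal sketch of routing a model-emitted tool call to local code. The OpenAI-style `tools` schema and the simulated assistant message are assumptions for illustration; the exact wire format depends on the endpoint serving the model.

```python
import json

# Hypothetical tool schema in the common OpenAI-style "tools" format;
# the exact format expected by your serving endpoint may differ.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Local handlers that the model's structured tool calls get routed to.
def get_weather(city: str) -> str:
    return f"Weather lookup for {city} (stub)"

HANDLERS = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    """Parse a model-emitted tool call and invoke the matching handler."""
    name = tool_call["function"]["name"]
    args = json.loads(tool_call["function"]["arguments"])
    return HANDLERS[name](**args)

# Simulated assistant message containing a structured tool call,
# standing in for a real model response.
sample_call = {
    "id": "call_0",
    "type": "function",
    "function": {"name": "get_weather", "arguments": '{"city": "Paris"}'},
}
print(dispatch(sample_call))  # → Weather lookup for Paris (stub)
```

In an agentic loop, the handler's return value would be sent back to the model as a tool message so it can compose a final answer.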
Capability Examples
Multimodal Image Captioning
Function Calling for Task Routing
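The image-captioning example above can be sketched as building a multimodal chat request. The model name and the data-URL message shape below follow the common OpenAI-compatible vision format and are assumptions for illustration, not a documented AI4Chat or Mistral API.

```python
import base64
import json

def build_caption_request(image_bytes: bytes,
                          prompt: str = "Caption this image.") -> dict:
    """Build an OpenAI-style multimodal chat payload (illustrative only).

    The image is inlined as a base64 data URL alongside the text prompt;
    the "ministral-3b" model name is a placeholder.
    """
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "model": "ministral-3b",
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    }

# Usage: POST this payload to your chat-completions endpoint.
payload = build_caption_request(b"\x89PNG...")  # stand-in image bytes
print(json.dumps(payload)[:60])
```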
How to use
Go to Chat
Navigate to the "AI Chat" page.
Select Model
Ensure Ministral 3B is selected.
Type Prompt
Ask a question or paste code.
Interact
Refine the answer by replying to the AI.
Compare LLMs Side-by-Side
Is Ministral 3B better than Claude 3.5 or Gemini? Test the same prompts on multiple models side by side in the Chat Playground.
Open Chat Playground

Made with ❤ by AI4Chat