Moondream
The Open Source VLM That Runs Everywhere.
Over 6 million downloads!
Explore the lineup.
Moondream 2B
Powerful and fast.
Parameters: 1.9B
Quantization: fp16, int8, int4
Memory: 2GiB
Training: Quantization-Aware Training
Target Devices: Servers, PC, Mobile
Inference: GPU, CPU-Optimized
License: Apache 2.0
New
Moondream 0.5B
Tiny and speedy.
Parameters: 0.5B
Quantization: int8, int4
Memory: 1GiB
Training: Quantization-Aware Training
Target Devices: Mobile, Edge
Inference: GPU, CPU-Optimized
License: Apache 2.0
Discover the capabilities.
Get started in 5 minutes.
Our client libraries are optimized for both CPU and GPU inference, and they're a snap to learn.
pip install moondream
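As an illustration, a minimal local quickstart with the Python client might look like the sketch below. The weights path is a placeholder (download the quantized build that fits your memory budget), and exact entry points can shift between releases, so treat this as a sketch and check the docs for the current API.

import moondream as md
from PIL import Image

# Load a local quantized Moondream build (placeholder path; download the
# int8/int4 weights that fit your memory budget first).
model = md.vl(model="./moondream-2b-int8.mf")

image = Image.open("./example.jpg")
encoded = model.encode_image(image)  # encode once, reuse across prompts

print(model.caption(encoded)["caption"])                        # describe the image
print(model.query(encoded, "What's in this image?")["answer"])  # ask a question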
What our fans say
More testing of the amazing Moondream open source multimodal LLM today! It is massively small: 1.6B parameter model built using SigLIP, Phi-1.5 and the LLaVA training dataset. I am really impressed. More soon.
Brian Roemmele
@BrianRoemmele
Moondream: a 1.6 Billion parameter model that is quite effective and possibly able to go toe to toe with the bigger models in the future.
MasteringMachines AI
@MstrMachines
MoonDream - A tiny vision language model that performs on par w/ models twice its size by @vikhyatk. Its so fast, you might not even catch it streaming output!
Luis C
@lucataco93
moondream is *wicked* fast.
CJ
@cj_pais
First small language model I've seen that has proper vision capabilities
Tom Dörr
@tom_doerr