**Meta Llama 4: open multimodal MoE family + upcoming open releases**
Key Questions
What are the main models in the Llama 4 family?
Llama 4 Scout and Maverick are natively multimodal MoE models with 17B active parameters each; Scout routes across 16 experts and Maverick across 128. Both accept interleaved image-text input and are available on Hugging Face under the meta-llama organization (a minimal loading sketch follows).
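A minimal sketch of loading Scout from the Hub, assuming a transformers release with Llama 4 support; the Hub id and the example image URL are assumptions, so check the meta-llama org for exact names.

```python
# Minimal sketch, assuming a transformers version with Llama 4 support.
from transformers import AutoProcessor, Llama4ForConditionalGeneration

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"  # assumed Hub id

processor = AutoProcessor.from_pretrained(model_id)
model = Llama4ForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype="auto",   # let transformers pick the checkpoint dtype
    device_map="auto",    # shard across available GPUs
)

# Interleaved image-text input via the processor's chat template.
messages = [
    {"role": "user", "content": [
        {"type": "image", "url": "https://example.com/cat.png"},  # placeholder URL
        {"type": "text", "text": "Describe this image."},
    ]},
]
inputs = processor.apply_chat_template(
    messages, add_generation_prompt=True, tokenize=True,
    return_dict=True, return_tensors="pt",
).to(model.device)

out = model.generate(**inputs, max_new_tokens=128)
print(processor.decode(out[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```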
What fine-tuning options exist for Llama 4?
NVIDIA NeMo provides fine-tuning support for Llama 4, including NeMo AutoModel, which handles interleaved multimodal inputs; this enables task-specific customization of the released checkpoints (a rough stand-in sketch follows).
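NeMo's exact AutoModel API isn't shown in the source, so the sketch below is a stand-in for the same idea, parameter-efficient fine-tuning of a Llama 4 checkpoint, using Hugging Face peft + transformers instead; the Hub id and the local `train.jsonl` dataset are assumptions.

```python
# Not NeMo's API: a rough LoRA fine-tuning stand-in with peft + transformers.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# LoRA keeps the base weights frozen; only small adapter matrices train.
lora = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
                  task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

ds = load_dataset("json", data_files="train.jsonl")["train"]  # hypothetical local data
ds = ds.map(lambda b: tokenizer(b["text"], truncation=True, max_length=2048),
            batched=True, remove_columns=ds.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama4-lora", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1, bf16=True),
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```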
What upcoming open-source plans does Meta have?
Reports indicate Meta plans to open-source its Avocado LLM and Mango multimodal models, though the released versions may be limited for safety reasons. Reporting points to open-source versions of its next-generation AI models.
What is the architecture of Llama 4 Scout/Maverick?
Both use a Mixture of Experts (MoE) transformer: 17B parameters are active per token, with Scout routing across 16 experts and Maverick across 128. Llama 4 is Meta's first natively multimodal model family (a toy routing sketch follows).
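To make "17B active parameters" concrete, here is a toy illustration of top-1 MoE routing, not Meta's implementation: a router sends each token's hidden state to one expert, so only that expert's weights (a fraction of the total) run per token. All dimensions below are made up for illustration.

```python
# Toy MoE layer (illustrative only): top-1 routing per token.
import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    def __init__(self, d_model: int, d_ff: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.SiLU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model); flatten batch/sequence before calling.
        scores = self.router(x)                   # (tokens, n_experts)
        weight, idx = scores.softmax(-1).max(-1)  # top-1 gate weight + expert index
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts): # loop for clarity, not speed
            mask = idx == e
            if mask.any():
                out[mask] = weight[mask, None] * expert(x[mask])
        return out

layer = ToyMoELayer(d_model=64, d_ff=256, n_experts=16)  # 16 experts, Scout-like count
tokens = torch.randn(8, 64)
print(layer(tokens).shape)  # torch.Size([8, 64])
```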
What is the development status of Llama 4?
The Llama 4 family is under active development. Axios scoops, reposted by commentators such as Miles Brundage, highlight Meta's continued commitment to open-sourcing.
Summary: Llama 4 Scout/Maverick (17B active MoE; 16/128 experts) are natively multimodal image-text models on HF under meta-llama; NeMo fine-tunes available; Meta plans to open-source Avocado LLM / Mango multimodal models (likely safety-limited).