OLMOE

Jun 30, 2025 · 1 min read

  • ai/models
  • ai/moe
  • ai/llm


OLMoE is a mixture-of-experts (MoE) language model from AllenAI.

GitHub Repo · Paper

A 7B-parameter model with only 1.3B active parameters per token. It performs on par with Gemma2 3B while training on open/free data and using roughly a third of the active parameters.
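The total-vs-active parameter gap comes from sparse routing: each token is sent to only a few experts, so most of the model's weights sit idle on any given forward pass. A minimal sketch of top-k expert routing in plain Python (the toy experts, router weights, and `k=2` here are illustrative assumptions, not OLMoE's actual configuration):

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_layer(x, experts, router_weights, k=2):
    """Route input x to the top-k experts by router score and mix their
    outputs, weighted by renormalised router probabilities. Only k experts
    run, which is why total parameters can far exceed *active* parameters."""
    # Router: one score per expert (dot product of x with each router row).
    scores = [sum(w * xi for w, xi in zip(row, x)) for row in router_weights]
    probs = softmax(scores)
    # Pick the top-k experts and renormalise their probabilities.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    norm = sum(probs[i] for i in top)
    # Weighted sum of only the selected experts' outputs.
    out = [0.0] * len(x)
    for i in top:
        y = experts[i](x)
        for j in range(len(x)):
            out[j] += (probs[i] / norm) * y[j]
    return out, top

# Hypothetical toy setup: 4 "experts", each just scales the input.
experts = [lambda x, s=s: [s * v for v in x] for s in (1.0, 2.0, 3.0, 4.0)]
router = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [-1.0, -1.0]]
y, chosen = moe_layer([0.2, 0.8], experts, router, k=2)
```

In a real transformer MoE the experts are feed-forward blocks and the router is learned; the principle is the same, so a 7B-parameter model can run with only the 1.3B parameters its selected experts contain.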

