Announcement_012

OLMoE is out! Our first mixture-of-experts model in the OLMo family 🎉 OLMoE has only 1B active params but matches the performance of larger dense models 🫨 and is released with: ✅ weights ✅ data ✅ code ✅ checkpoints ✅ logs ✅ a detailed paper! Download the weights here and read the paper here!
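Why can a mixture-of-experts model have only 1B *active* params while its total param count is much larger? Because a gating network picks just a top-k subset of expert networks to run for each input. The sketch below is a toy illustration of top-k routing, not OLMoE's actual architecture; all names, shapes, and the use of plain linear "experts" are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts by gate score.

    Only k expert networks run per input, so the "active"
    parameter count is far below the total across all experts.
    """
    scores = x @ gate_w                       # one score per expert
    top = np.argsort(scores)[-k:]             # indices of the k best experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                  # softmax over the selected experts
    # Weighted sum of the chosen experts' outputs; unchosen experts never run.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

d, num_experts = 8, 4
gate_w = rng.normal(size=(d, num_experts))
# Each "expert" here is just a fixed linear map for illustration.
expert_mats = [rng.normal(size=(d, d)) for _ in range(num_experts)]
experts = [lambda x, W=W: x @ W for W in expert_mats]

x = rng.normal(size=d)
y = moe_forward(x, gate_w, experts, k=2)
print(y.shape)
```

With `k=2` of 4 experts, only half the expert parameters are active per input, which is the same idea that lets OLMoE keep its active param count at 1B.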



