Mixture of Experts Models - BKJackson/BKJackson_Wiki GitHub Wiki

NanoMoE GitHub - an extension of the nanoGPT repository for training small MoE models, by Cameron Wolfe
NanoMoE Blog Post - the related Substack post
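
The core idea behind MoE layers like those in nanoMoE is top-k routing: a small router scores every expert per token, and only the k highest-scoring experts run, with their outputs combined by the renormalized router weights. Below is a minimal NumPy sketch of that routing step; the class name, linear experts, and shapes are illustrative assumptions, not nanoMoE's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class TopKMoE:
    """Illustrative top-k mixture-of-experts layer (not the nanoMoE code)."""

    def __init__(self, d_model, n_experts, k, seed=0):
        rng = np.random.default_rng(seed)
        self.k = k
        # Router: one logit per expert for each token.
        self.router = rng.standard_normal((d_model, n_experts)) * 0.02
        # Experts: plain linear maps here; real MoE layers use small MLPs.
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.02
                        for _ in range(n_experts)]

    def __call__(self, x):
        # x: (tokens, d_model)
        probs = softmax(x @ self.router)                 # (tokens, n_experts)
        topk = np.argsort(-probs, axis=-1)[:, :self.k]   # chosen expert ids
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            gates = probs[t, topk[t]]
            gates = gates / gates.sum()                  # renormalize over top-k
            for g, e in zip(gates, topk[t]):
                out[t] += g * (x[t] @ self.experts[e])   # weighted expert output
        return out

moe = TopKMoE(d_model=8, n_experts=4, k=2)
y = moe(np.ones((3, 8)))
print(y.shape)  # → (3, 8)
```

Because only k of the n_experts run per token, compute cost scales with k while parameter count scales with n_experts, which is the efficiency argument for MoE models.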