Mixture of Experts
About
- Mixture of Experts (MoE) - a machine learning architecture
- Combines the predictions of multiple "expert" models to produce a final output (see the formula after this list)
- Each expert specializes in a particular aspect of the data or problem domain
- A gating network dynamically selects and weights the experts based on the input data
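In the common dense formulation (a standard description, not taken from this wiki), the final output is the gate-weighted sum of the expert predictions, with the weights produced by a softmax over the gating network's scores:

$$
y(x) = \sum_{i=1}^{N} g_i(x)\,E_i(x), \qquad g(x) = \operatorname{softmax}(W_g\,x)
$$

where $E_i(x)$ is the prediction of expert $i$ and $g_i(x)$ is its gate weight for input $x$. Keeping only the top-scoring expert(s) per input (a sparse MoE) is a common variant of this weighting.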
Architecture
```
           +------------+
           | Input Data |
           +------------+
                 |
                 v
           +------------+
           |   Gating   |
           |   Network  |
           +------------+
              /      \
             /        \
            v          v
 +---------------+  +---------------+
 |   Expert 1    |  |   Expert 2    |
 | (Specialized) |  | (Specialized) |
 +---------------+  +---------------+
             \        /
              \      /
               v    v
             +--------+
             | Output |
             +--------+
```
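A minimal sketch of the flow in the diagram above, assuming a PyTorch implementation with two dense (always-active) experts; the class and parameter names are illustrative and not from this wiki:

```python
# Minimal MoE sketch (assumed PyTorch implementation): a gating network
# produces softmax weights over two expert MLPs, and the final output is
# the gate-weighted sum of the expert outputs.
import torch
import torch.nn as nn


class MixtureOfExperts(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim, num_experts=2):
        super().__init__()
        # Each expert is a small feed-forward network that specializes during training.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(input_dim, hidden_dim),
                nn.ReLU(),
                nn.Linear(hidden_dim, output_dim),
            )
            for _ in range(num_experts)
        )
        # The gating network maps the input to one score per expert.
        self.gate = nn.Linear(input_dim, num_experts)

    def forward(self, x):
        gate_weights = torch.softmax(self.gate(x), dim=-1)        # (batch, num_experts)
        expert_outputs = torch.stack(
            [expert(x) for expert in self.experts], dim=1
        )                                                          # (batch, num_experts, output_dim)
        # Gate-weighted combination of the expert predictions.
        return (gate_weights.unsqueeze(-1) * expert_outputs).sum(dim=1)


if __name__ == "__main__":
    moe = MixtureOfExperts(input_dim=8, hidden_dim=16, output_dim=4)
    y = moe(torch.randn(3, 8))
    print(y.shape)  # torch.Size([3, 4])
```

The gating network and the experts are trained jointly, so the gate learns which expert to favor for which region of the input space.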
See also