Dagmawi Babi
Mixtral 8x7B was released a few days ago. It's an open-weight mixture-of-experts model. In a mixture of experts, different parts of the network specialize in different tasks, and a router combines their outputs to make predictions. Mixtral matches or outperforms Llama 2 70B…
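To make the "specialists plus router" idea concrete, here's a minimal toy sketch of sparse mixture-of-experts routing. It assumes a Mixtral-style setup (8 experts, top-2 routing) but everything here — the `Expert` class, `moe_layer`, the random linear maps — is hypothetical illustration, not Mixtral's actual architecture.

```python
import math
import random

random.seed(0)

DIM, N_EXPERTS, TOP_K = 4, 8, 2  # Mixtral-style: 8 experts, route to top 2

class Expert:
    """A tiny 'specialist': here just a fixed random linear map."""
    def __init__(self):
        self.w = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(DIM)]
    def __call__(self, x):
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in self.w]

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def moe_layer(x, experts, router):
    # The router scores every expert for this input...
    scores = [sum(r * xi for r, xi in zip(row, x)) for row in router]
    # ...but only the top-k experts actually run (sparse activation).
    top = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:TOP_K]
    weights = softmax([scores[i] for i in top])
    # Combine the chosen experts' outputs, weighted by the router scores.
    out = [0.0] * DIM
    for w, i in zip(weights, top):
        y = experts[i](x)
        out = [o + w * yi for o, yi in zip(out, y)]
    return out, top

experts = [Expert() for _ in range(N_EXPERTS)]
router = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(N_EXPERTS)]
y, chosen = moe_layer([1.0, 0.5, -0.3, 2.0], experts, router)
print(len(y), len(chosen))
```

The payoff of this design is that only 2 of the 8 experts compute per token, so the model keeps the capacity of all experts while paying roughly the inference cost of a much smaller dense model.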
Mixtral now powers Leo, Brave browser's AI assistant. 🔥
#Mixtral #Brave #Leo #AI #Browsers
@Dagmawi_Babi
January 27, 2024