Google just introduced Gemini 1.5 Pro, their new model that uses a Mixture-of-Experts (MoE) approach for more efficient training & higher-quality responses.
Gemini 1.5 Pro, a mid-size model, will soon come standard with a 128K-token context window, but starting today, developers and customers can sign up for the limited Private Preview to try out 1.5 Pro with a groundbreaking, experimental 1-million-token context window!
The 1M-token context window unlocks huge possibilities for devs - upload hundreds of pages of text, entire code repos, or long videos and let Gemini reason across all of it.
1M tokens is 🤯
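For a rough idea of what that looks like from the developer side, here's a minimal sketch using the google-generativeai Python SDK. The model name "gemini-1.5-pro-latest" and the local file "repo_dump.txt" are assumptions for illustration, not details from the announcement; actual access depends on the Private Preview.

```python
# Minimal sketch, assuming Private Preview access to Gemini 1.5 Pro
# via the google-generativeai Python SDK.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# Hypothetical preview model name - check what your preview access exposes.
model = genai.GenerativeModel("gemini-1.5-pro-latest")

# Load a long document (e.g. a whole code repo dumped to text);
# a 1M-token window can hold hundreds of pages in a single prompt.
with open("repo_dump.txt", "r", encoding="utf-8") as f:
    long_context = f.read()

# Pass the long context and the question together as prompt parts.
response = model.generate_content(
    [long_context, "Summarize the architecture and list the main modules."]
)
print(response.text)
```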
#Google #Gemini #AI
@Dagmawi_Babi