xAI's Grok LLM has been open-sourced

They released the base model weights and the network architecture. The release shows the model has 314 billion parameters and uses a mixture-of-experts architecture. It's the raw base model, not fine-tuned much, so chatting with it takes some configuration and fine-tuning of your own.
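To make the mixture-of-experts part concrete, here's a minimal toy sketch of top-2 routing in plain Python/NumPy. The 8 experts / 2 active per token figures are from the model card; the shapes and weights here are made up for illustration, not Grok-1's actual config.

```python
import numpy as np

def top2_moe_layer(x, gate_w, expert_ws):
    """Toy top-2 mixture-of-experts feed-forward layer.

    x:         (d_model,) token representation
    gate_w:    (d_model, n_experts) router weights
    expert_ws: list of n_experts (d_model, d_model) expert weight matrices
    """
    logits = x @ gate_w                      # router score for each expert
    top2 = np.argsort(logits)[-2:]           # pick the two highest-scoring experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()                 # softmax over just the selected experts
    # Only the chosen experts run, which is why a 314B-parameter MoE
    # activates far fewer parameters per token than a dense model that size.
    return sum(w * (x @ expert_ws[i]) for w, i in zip(weights, top2))

# Example with random weights (toy sizes, not Grok-1's)
rng = np.random.default_rng(0)
d, n = 16, 8                                 # Grok-1 reportedly routes each token to 2 of 8 experts
x = rng.normal(size=d)
out = top2_moe_layer(x, rng.normal(size=(d, n)), [rng.normal(size=(d, d)) for _ in range(n)])
print(out.shape)                             # (16,)
```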

For a 314B-parameter model it looks very undertrained. The benchmarks put it at roughly GPT-3.5 level, and GPT-3.5 is estimated to be around a 20B-parameter model.

Repo
https://github.com/xai-org/grok-1

Weights
https://huggingface.co/xai-org/grok-1/tree/main/ckpt
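If you want to grab the checkpoint programmatically, here's a rough sketch using huggingface_hub's snapshot_download. The ckpt/* pattern just mirrors the directory in the link above; adjust it if the repo layout differs.

```python
# pip install huggingface_hub
from huggingface_hub import snapshot_download

# Pull only the checkpoint directory from the xai-org/grok-1 model repo.
# "ckpt/*" mirrors the path in the weights link above; change it if the
# repo layout is different. The full checkpoint is a few hundred GB,
# so make sure you have the disk space.
snapshot_download(
    repo_id="xai-org/grok-1",
    repo_type="model",
    allow_patterns=["ckpt/*"],
    local_dir="checkpoints",
)
```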

#Grok #LLM #AI #ML
@Dagmawi_Babi