xAI's Grok LLM has been open-sourced

They released the base model weights and the network architecture. The release shows that the model has 314 billion parameters and is a mixture-of-experts model. It is the raw pre-trained checkpoint, not fine-tuned for dialogue, so chatting with it takes some setup and further fine-tuning.
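
For context, "mixture of experts" means a small router picks a couple of expert networks per token, so only a fraction of the 314B weights does work on any given token. Here's a minimal, illustrative top-2 routing sketch in JAX. This is not Grok's actual code; shapes are tiny and a single matrix stands in for each expert MLP:

```python
import jax
import jax.numpy as jnp

def moe_layer(x, gate_w, expert_ws, top_k=2):
    """x: (tokens, d_model); gate_w: (d_model, n_experts); expert_ws: (n_experts, d_model, d_model)."""
    scores = jax.nn.softmax(x @ gate_w, axis=-1)        # router probabilities per token
    top_p, top_idx = jax.lax.top_k(scores, top_k)       # keep only the top-k experts per token
    top_p = top_p / top_p.sum(axis=-1, keepdims=True)   # renormalise the selected weights

    out = jnp.zeros_like(x)
    for k in range(top_k):                              # run only the chosen experts
        w_k = expert_ws[top_idx[:, k]]                  # (tokens, d_model, d_model)
        expert_out = jnp.einsum("td,tdo->to", x, w_k)
        out = out + top_p[:, k:k + 1] * expert_out
    return out

# Toy usage: 4 tokens, 8 tiny "experts" (Grok-1 reportedly uses 8 experts, 2 active per token).
k1, k2, k3 = jax.random.split(jax.random.PRNGKey(0), 3)
x = jax.random.normal(k1, (4, 16))
gate_w = jax.random.normal(k2, (16, 8))
expert_ws = jax.random.normal(k3, (8, 16, 16))
print(moe_layer(x, gate_w, expert_ws).shape)  # (4, 16)
```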

For a 314B-parameter model, it looks very undertrained. Benchmarks put it at roughly GPT-3.5 level, and GPT-3.5 is estimated to be only around a 20B-parameter model.

Repo
https://github.com/xai-org/grok-1

Weights
https://huggingface.co/xai-org/grok-1/tree/main/ckpt
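
If you'd rather fetch the checkpoint with a script than through the browser, here's a rough sketch using the huggingface_hub client. The "ckpt" folder name is assumed from the link above, and the download is several hundred gigabytes:

```python
# Sketch: download the released Grok-1 checkpoint from Hugging Face.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="xai-org/grok-1",
    allow_patterns=["ckpt/*"],   # checkpoint shards only; several hundred GB
    local_dir="grok-1-weights",
)

# After that, the GitHub repo's README covers installing its requirements and
# running its JAX example inference script. You'll need a lot of GPU memory
# to hold 314B parameters.
```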

#Grok #LLM #AI #ML
@Dagmawi_Babi