Dagmawi Babi
Believer of Christ | Creative Developer.

Files Channel: https://t.me/+OZ9Ul_rSBAQ0MjNk

Community: @DagmawiBabiChat
[ Video ]
I've been spending the past few days learning Machine Learning, Neural Nets and AI stuff.

And this is the first properly working project I did. It's basically equivalent to GPT-2, but horribly optimized and trained specifically on a single text.

So I wrote this neural net, line by line, understanding it all, and finally trained it on all of Shakespeare's works to generate infinitely more Shakespeare lines.
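
For the curious, the data side of a character-level model like this is tiny. Roughly what it looks like in PyTorch (not my exact script, and the file name is just a placeholder):

```python
import torch

# Load the corpus: all of Shakespeare in one text file.
with open("shakespeare.txt", encoding="utf-8") as f:
    text = f.read()

# Character-level "tokenizer": every unique character gets an integer id.
chars = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(chars)}
itos = {i: ch for ch, i in stoi.items()}

encode = lambda s: [stoi[c] for c in s]             # string -> list of ids
decode = lambda ids: "".join(itos[i] for i in ids)  # ids -> string

# The whole corpus becomes one long tensor of character ids.
data = torch.tensor(encode(text), dtype=torch.long)
```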

#AI #NeuralNets #MyProjects #GPT
@Dagmawi_Babi
[ Video ]
And finally, based on what I've learned so far, I changed up some parameters and trained the neural network on the King James Version of the Bible.

Unlike the Shakespeare model, which uses a block of 32 characters to predict the next character in the sequence, this one uses 100 characters, and it runs 50 parallel training batches at a time instead of 16.

It is currently training like this for 10,000 rounds, to hopefully generate a 500-character-long "bible verse".
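
In code, those knobs look roughly like this (a sketch of the setup, not the exact script):

```python
import torch

block_size = 100   # 100 characters of context predict the next character
batch_size = 50    # 50 sequences training in parallel per round
max_iters  = 10_000

def get_batch(data):
    # Sample random windows: x is the context, y is x shifted by one character.
    ix = torch.randint(len(data) - block_size - 1, (batch_size,))
    x = torch.stack([data[i : i + block_size] for i in ix])
    y = torch.stack([data[i + 1 : i + block_size + 1] for i in ix])
    return x, y
```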

I tried running this model on my PC but it took forever. So I'm running it in the cloud, and it's still taking so long; almost an hour in and it's only at 3,000 training rounds.

Will update you on what it generates.

#AI #NeuralNets
@Dagmawi_Babi
So, to put things in perspective:

[ Pic 1 ]
What my model generated after just 100 rounds of training, using a block of 32 characters each time to predict the next character, generating the 500-character-long Shakespeare quote.

[ Pic 2 & 3 ]
What my neural net produced after training for 10,000 rounds, with the same 32-character block used to predict each next character.

[ Summary ]
So you can see how longer training affects the output of the neural net. It basically went from struggling to form normal sentences to actually being legible, making sense, and even mimicking the style of Shakespeare.

#AI #NeuralNets
@Dagmawi_Babi
Update on training my neural net on the King James Version of The Bible.

Training is at the 4,000th round:
100 characters at once,
50 sets of 100 characters in parallel.

Now I'm just worried that I didn't give it enough training rounds, since there's so much more content than in Shakespeare's works. To compensate for that, I did make the context 100 characters long. So we'll see.
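
A quick back-of-envelope on that worry (the KJV size here is my rough estimate):

```python
batch_size = 50       # sequences per round
block_size = 100      # characters per sequence
rounds     = 10_000

chars_seen = batch_size * block_size * rounds  # 50,000,000
kjv_chars  = 4_500_000                         # assuming the KJV is ~4-5M characters

print(f"~{chars_seen / kjv_chars:.0f} passes over the whole text")  # ~11
```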

So I'm just scared that it might not generate the really specific, mind-blowing results I'm hoping for. But I could be wrong.

#AI #NeuralNets
@Dagmawi_Babi
[ Video ]
This is the result of my neural net training on The King James Version of The Bible. πŸ˜…πŸ€―

#AI #NeuralNets
@Dagmawi_Babi
With no further ado

Here are some original AI-generated bible verses and chapters.

#AI #NeuralNets
@Dagmawi_Babi
Here are some more legible and slightly more coherent AI-generated bible verses.

I tried making the output a bit more legible, and I made the model generate entire chapters at once, with each verse given a random length of 100 to 250 characters and each chapter having around 10 verses.

So this is prolly the best that 10,000 training rounds and a 100-character context can get.
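
The chapter trick itself is just a loop around sampling. Roughly (model.generate here is a stand-in for however your net samples text):

```python
import random

def generate_chapter(model, n_verses=10):
    # Each verse gets a random target length between 100 and 250 characters,
    # and verses get numbered like a real chapter.
    verses = []
    for v in range(1, n_verses + 1):
        length = random.randint(100, 250)
        text = model.generate(max_new_tokens=length)  # assumed interface
        verses.append(f"{v} {text.strip()}")
    return "\n".join(verses)
```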

#AI #NeuralNets
@Dagmawi_Babi
I've now started training the models for 100,000 rounds, and the ShakespeareAI will use 500 characters to predict new characters while the KJVBibleAI will use 1000 characters 🤯

It's gonna take so long; attention compares every character in the context with every other one, so longer contexts get expensive fast. Hopefully it finishes by tomorrow and shows us much more coherent and sensible verses. 😊

Will show you the results tomorrow. ❤️
Good night πŸŒ‰

#AI #NeuralNets
@Dagmawi_Babi
Also, about the 100,000 rounds of training I set for the KJVBibleAI: I woke up to find out that Google Colab stopped the training.

So redoing it again.

#AI #NeuralNets
@Dagmawi_Babi
So Google Colab has been running really slow training my neural net model. It's been taking forever, and it's also been shutting down and whatnot.

So I just left it alone and spent the day learning about other ML concepts.

And it was just now, while I was browsing inside Google Colab, that I found out I could've been using a GPU or a TPU. 🤦‍♂

So now I'm watching it train 10k rounds in like 10 minutes, which took about 40 minutes on the CPU.

Well, I guess I can finally train my model at 100k rounds with 1000-character blocks.
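
For anyone else training in Colab: it's Runtime → Change runtime type → GPU, and then one line of PyTorch decides where everything runs:

```python
import torch

# Use the GPU if the Colab runtime has one, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
print("training on", device)

# Then the model and every batch just need a .to(device):
# model = model.to(device)
# x, y = x.to(device), y.to(device)
```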

#AI #NeuralNets
@Dagmawi_Babi
trainlog.log
94 KB
So you can browse through the training log and see it form sentences at each interval, then start to sound more biblical, and more specifically KJV.
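
Producing a log like this is just sampling every few hundred rounds and appending to a file. Roughly (model.generate again stands in for the actual sampling call):

```python
def log_sample(model, step, path="trainlog.log", sample_len=500):
    # Append a sample to the log, so the log becomes a timeline
    # of the model slowly learning to write.
    sample = model.generate(max_new_tokens=sample_len)  # assumed interface
    with open(path, "a", encoding="utf-8") as f:
        f.write(f"--- step {step} ---\n{sample}\n\n")

# inside the training loop:
# if step % 500 == 0:
#     log_sample(model, step)
```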

#AI #NeuralNets
@Dagmawi_Babi
Someone from our community asked me to try it on Amharic content, so here it goes.

I found an Amharic news headlines dataset and formatted it a bit to make it trainable for my neural net. I got lazy, so I didn't remove some symbols and stuff, but this'll do.
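
The formatting was nothing fancy, roughly this kind of cleanup (file names are placeholders):

```python
import re

# One headline per line, whitespace collapsed; symbols are left in.
with open("amharic_headlines_raw.txt", encoding="utf-8") as f:
    lines = f.readlines()

cleaned = [re.sub(r"\s+", " ", line).strip() for line in lines]
cleaned = [line for line in cleaned if line]  # drop empty lines

with open("amharic_train.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(cleaned))
```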

So I've started training on a 175MB text file of Amharic newspaper headlines.

I'm keeping a log file so we can see how it's improving even if it stops. I'm running 100k rounds, using a 600-character block to predict each coming character, with 16 parallel batches on the GPU to make it a bit faster.

Will update you on the result.

#AI #NeuralNets
@Dagmawi_Babi
amharictrain.log
36.4 KB
So you can browse through the training log and see it form sentences at each interval, then sound more sensible, and then become more news-like.

#AI #NeuralNets
@Dagmawi_Babi
I was using Andrej Karpathy's lessons when I made my GPT a couple of months back. It's such a simple idea, but so effective. Even having written and understood it all from scratch, it's still wild to think that this was possible.
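
The whole thing hinges on one trick, self-attention, and the core of it from the lesson fits in a few lines (simplified to a single head; key/query/value are small linear layers):

```python
import torch
import torch.nn.functional as F

def attention_head(x, key, query, value):
    # x: (batch, time, channels); key/query/value: nn.Linear layers.
    B, T, C = x.shape
    k, q, v = key(x), query(x), value(x)                 # (B, T, head_size)
    wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5  # scaled dot products
    tril = torch.tril(torch.ones(T, T, device=x.device))
    wei = wei.masked_fill(tril == 0, float("-inf"))      # no peeking at the future
    wei = F.softmax(wei, dim=-1)
    return wei @ v  # each position: a weighted sum of past values
```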

My GPT trained on Shakespeare's works
β€’ https://t.me/Dagmawi_Babi/9879

My GPT trained on KJV Bible
β€’ https://t.me/Dagmawi_Babi/9879

My GPT spitting Original KJV Verses
β€’ https://t.me/Dagmawi_Babi/9909

My GPT trained on Amharic News
β€’ https://t.me/Dagmawi_Babi/9954

πŸ”₯πŸ”₯πŸ”₯

#ML #AI #GPT #NeuralNets
@Dagmawi_Babi