Just finished reading "Attention Is All You Need", the paper that introduced the transformer architecture behind all the LLMs & gen AI we know today.
• https://t.me/c/1156511084/838
Astounding how clever some people are. Took me a while to really understand, but in its most basic form: older networks like RNNs processed a sequence token by token, so training was inherently sequential. Thanks to the transformer's self-attention, the whole sequence can be processed at once and training can run in parallel. 🤯
It's basically like multi-threading but for neural nets.
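To see the parallelism concretely, here's a minimal sketch of scaled dot-product self-attention in NumPy (my own illustration, not code from the paper): the whole sequence enters as one matrix, and every position attends to every other position in a single set of matrix multiplies, with no token-by-token loop.

```python
# Minimal self-attention sketch (NumPy). All positions are processed
# at once via matrix multiplies -- unlike an RNN, which must step
# through the sequence one token at a time.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model) -- the entire sequence as one matrix
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])         # (seq_len, seq_len)
    # softmax over the key dimension, numerically stabilized
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                              # (seq_len, d_v)

rng = np.random.default_rng(0)
seq_len, d = 5, 8
X = rng.normal(size=(seq_len, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because there's no recurrence, all rows of `scores` are computed simultaneously, which is exactly what lets GPUs train transformers in parallel across the sequence.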
#Papers #AttentionIsAllYouNeed
@Dagmawi_Babi