A well-simplified, well-explained, and complete video playlist for anyone interested in better understanding the mathematical fundamentals powering many of the recent AI models. It starts with the basics of a Neural Network and finishes with Transformer models & Attention mechanisms. https://lnkd.in/gVBzk6Wt
Abie Cohen’s Post
-
If you want a good, high-level introduction to neural networks, transformers and how AI actually works, I highly recommend these videos from 3Blue1Brown: https://lnkd.in/d6C2Qjc4
Neural networks
youtube.com
-
It seems a lot of people want a primer on AI. So here's a really good series of videos by a strong maths fellow explaining things in a very simple, very clear way. https://lnkd.in/ezHh4dct 3Blue1Brown, in case you're not familiar, does a lot of really good deep dives on all kinds of math problems; transformers, neural networks, deep learning, etc. are only one small part of the channel's content. It will also give you a clue as to why I say machines are thinking now. A lot of people have simplified their understanding of LLMs as "next token predictors," which masks what's going on there. Markov chains are next token predictors. Transformers do, yes, end up giving a list of tokens and the likelihood of each, but getting to that point requires a lot of sophisticated thought. To grok that, watch this video series and get a sense of the complexity that's hidden underneath transformers. This does not go into diffusion models, which are thinking machines in a different sense. If I find a good video series on those, I'll post it.
Neural networks
youtube.com
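The post's contrast is easy to make concrete: a Markov chain really is just a lookup table of next-token counts, with no deeper computation. A minimal sketch (the toy corpus is made up for illustration):

```python
from collections import Counter, defaultdict

# Toy bigram Markov model: the "next token predictor" in its simplest form.
corpus = "the cat sat on the mat the cat ran".split()

# Count which token follows which.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def predict_next(token):
    """Return the most likely next token given only the current token."""
    counts = transitions[token]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # 'cat' follows 'the' twice, 'mat' only once
```

Everything this model "knows" is one table of pair counts; a transformer reaches its final token probabilities through many layers of learned computation, which is the complexity the videos unpack.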
-
PyTorch Introduction: Training a Computer Vision Algorithm. In this post of the PyTorch Introduction, we'll learn how to train a computer vision model using a Convolutional Neural Network with… Continue reading on Towards Data Science. https://lnkd.in/euvcdtYw #AI #ML #Automation
PyTorch Introduction: Training a Computer Vision Algorithm
openexo.com
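As a rough sketch of what a tutorial like this typically builds (the architecture below is an illustrative assumption, not the article's exact model), a minimal PyTorch CNN for 28x28 grayscale images might look like:

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """A tiny example CNN; layer sizes are arbitrary choices for illustration."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),  # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                            # 28x28 -> 14x14
        )
        self.classifier = nn.Linear(8 * 14 * 14, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))  # flatten all but the batch dim

model = SmallCNN()
logits = model(torch.randn(4, 1, 28, 28))  # a batch of 4 fake images
print(logits.shape)  # torch.Size([4, 10])
```

Training then follows the usual loop of loss (e.g. `nn.CrossEntropyLoss`), `backward()`, and an optimizer step, which is what the linked article walks through in detail.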
-
What Are 3 Examples Of Neural Network? #NeuralNetworks #MachineLearning #ArtificialIntelligence #AI #DeepLearning #TechInnovation #DataScience #AIApplications #TechTrends #FutureTech #Computing #TechExplained #AIResearch #SmartTech #Innovation https://lnkd.in/ebXqe4eD
What Are 3 Examples Of Neural Network?
https://www.omniraza.com
-
One of the most important things that will help us "wrap our heads" around generative AI - and ALL forms of signal-based (rather than symbolic) AI - is to get clear on the fundamental notions that support each of the major AI & machine learning areas. Three big concepts or underlying disciplines stand out for each area where we've had major breakthroughs and advances:
- Latent variables,
- Bayesian (conditional) probabilities, and
- Statistical mechanics.
This YouTube video shows how they all contribute - and how each of the major neural network-type AIs evolved from insights building on these fundamentals. #ai #artificialintelligence #agi #artificialgeneralintelligence #aieducation #machinelearning #generativeai https://lnkd.in/gfKkczhE
Key Concepts for a New Class of Neural Networks
https://www.youtube.com/
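Of the three fundamentals the post names, Bayesian (conditional) probability is the easiest to make concrete. A tiny worked example of Bayes' rule, with made-up numbers for illustration:

```python
# P(D|+) = P(+|D) * P(D) / P(+), with P(+) from the law of total probability.
p_disease = 0.01            # prior P(D)
p_pos_given_disease = 0.95  # likelihood P(+|D)
p_pos_given_healthy = 0.05  # false-positive rate P(+|not D)

# Total probability of a positive test.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Posterior: probability of disease given a positive test.
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # 0.161
```

Despite a 95%-accurate test, the posterior is only about 16%, because the prior is so low; this style of belief-updating under uncertainty is the Bayesian thread running through the models the video discusses.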
-
A great series if you want to learn more about Neural Networks, incl. GPT and transformers. It's not too detailed and reasonably "easy" to understand - https://lnkd.in/d7Xx4Z9R
Neural networks
youtube.com
-
The next generation of AI - artificial GENERAL intelligence (AGI) - will use more complex and interwoven methods, whereas the current methods largely rely on replicating simple elements. This is explored in the YouTube video "Key Concepts for a New Class of Neural Networks." #ai #agi #artificialgeneralintelligence #artificialintelligence #machinelearning #generativeai #aieducation
Key Concepts for a New Class of Neural Networks
https://www.youtube.com/
-
Our soc med guy steps in front of the camera for the first time! For a primer on AI types. There's a lot of alphabet soup going around. LLMs, CNNs, GANs: if you're confused about what it all means, this is for you. Stay tuned for Part 2 :) Nerdy note: a lot was simplified to fit in a 1-minute video. An oversimplification is that image generation AIs like Midjourney can be called Convolutional Neural Networks. They use CNN models for part of their construction, but they are more commonly referred to as diffusion models, since diffusion is the final step of the process. Diffusion means gradually refining an image starting from random noise. ◻️ ➡️ 🌇 We might do a deeper dive on diffusion later. Let us know if you want more formats like this.
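The noise-to-image idea in the nerdy note can be caricatured in a few lines: start from random noise and repeatedly refine toward a signal. This toy loop is purely illustrative; real diffusion models learn the denoising step from data rather than being handed the target:

```python
import random

random.seed(0)
target = [0.2, 0.8, 0.5]                  # stand-in for a tiny "image"
x = [random.gauss(0, 1) for _ in target]  # start from pure random noise

for step in range(50):
    # Nudge each value a small fraction of the way toward the target.
    x = [xi + 0.2 * (ti - xi) for xi, ti in zip(x, target)]

print([round(v, 3) for v in x])  # essentially equal to target after 50 steps
```

The repeated small refinements from noise toward structure are the ◻️ ➡️ 🌇 arrow in the post; the hard part a real model learns is *which* direction to nudge at each step.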
-
What if I told you Artificial Intelligence was just math? Well, a Neural Network, a fundamental component of AI, is somewhat like a giant mathematical model. Now you're probably expecting a giant, incomprehensible expression with more Greek symbols and letters than numbers. Which is true, to an extent, but with a little work, it can be a lot simpler than that. If we look closer, and deeper, into multi-layer Perceptrons, made of layers, made of neurons, made of simple mathematical transformations of a single input, we find that on an atomic level, the simple operations involving inputs, weights, biases and activations are humanly understandable. Of course, in the modern world of computing, abstraction can be both a blessing and a curse. One may turn to PyTorch for its tensors and nth-dimensional transformation methods, but if one truly wishes to break down the "black boxes" of Neural Nets, they must build them themselves. Step by step. From scratch. That being said, a good way to do this is with Andrej Karpathy's atomic-to-multicellular introduction to Neural Networks and Backpropagation: https://lnkd.in/ezAMbXXW
The spelled-out intro to neural networks and backpropagation: building micrograd
https://www.youtube.com/
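In the spirit of the linked video (a simplified sketch, not Karpathy's exact micrograd code), the atomic operations the post describes, with gradients computed by the chain rule, fit in a few dozen lines of plain Python:

```python
class Value:
    """A scalar that records its computation graph for backpropagation."""
    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._children = children
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():           # d(a+b)/da = d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():           # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# One neuron, one input: y = w*x + b
x, w, b = Value(2.0), Value(3.0), Value(1.0)
y = w * x + b
y.backward()
print(y.data, w.grad, x.grad)  # 7.0 2.0 3.0
```

Those two gradient rules, composed over thousands of nodes, are all that backpropagation in a full framework amounts to; the video builds this up operation by operation.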
-
Unlock the Power of Convolutional Neural Networks! 🚀 As a data enthusiast, I'm always eager to dive deeper into the mechanics of cutting-edge technologies. Recently, I stumbled upon an exceptional resource that demystifies Convolutional Neural Networks (CNNs) in a visually stunning way. If you're a fellow visual learner, you'll appreciate the clarity and precision of this explanation. Take your understanding of CNNs to the next level and discover the secrets behind their remarkable image recognition capabilities! A huge thank you to the creator of this resource for making complex concepts accessible and engaging! Link: https://lnkd.in/gbFsF5X7 #AI #MachineLearning #DataScienceCommunity
CNN Explainer
poloclub.github.io
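The sliding-window operation that CNN Explainer animates can be written in plain Python. The image and kernel below are made-up examples (a vertical-edge pattern and a Sobel-style filter):

```python
def conv2d(image, kernel):
    """2D convolution (cross-correlation), valid padding, stride 1."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # Slide the kernel over the image and sum elementwise products.
            out[i][j] = sum(
                image[i + a][j + b] * kernel[a][b]
                for a in range(kh) for b in range(kw)
            )
    return out

# A tiny "image" with a vertical edge down the middle...
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
# ...and a filter that responds strongly to vertical edges.
sobel_x = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]
print(conv2d(img, sobel_x))  # strong response where the edge sits
```

A trained CNN learns kernels like this one from data instead of hand-designing them; stacking many such filter banks is what gives CNNs their image-recognition power.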