Aniket Mishra’s Post


AI Enthusiast | Low-Level Programming | Passionate About Building from Scratch

This week, I focused on studying neural networks from the ground up. Building a neural network from scratch, without using any libraries, was an amazing experience. It made me realize how simple units can combine into a complex structure capable of solving significant problems. Here's a summary of the learning process:

1. Initialize the neural network by assigning random values to the weights and biases.
2. Forward pass: given an input, compute the predicted value and the loss.
3. Backpropagation: calculate the derivative of the loss with respect to every weight and bias.
4. Update weights and biases: use the derivatives from the previous step to adjust the weights and biases so the loss decreases.
5. Repeat steps 2 to 4 until the loss is minimized.

Special thanks to Andrej Karpathy for his incredible video on neural networks.

Source Code: https://lnkd.in/dEptQmZv
Video Link: https://lnkd.in/dPN_dshw

#ml #AI #datascience #neuralnetwork #deeplearning
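The steps above can be sketched in a few lines of plain Python. This is a minimal illustration (not the linked source code): a single linear neuron fitted to y = 2x + 1 with hand-derived gradients of the mean-squared-error loss.

```python
import random

# 1. Initialize "the network" (one linear neuron) with random weight and bias
random.seed(0)
w = random.uniform(-1, 1)
b = random.uniform(-1, 1)

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # targets generated by y = 2x + 1
lr = 0.05                  # learning rate
n = len(xs)

for step in range(500):
    # 2. Forward pass: predictions and mean-squared-error loss
    preds = [w * x + b for x in xs]
    loss = sum((p - y) ** 2 for p, y in zip(preds, ys)) / n

    # 3. Backpropagation: dLoss/dw and dLoss/db via the chain rule
    dw = sum(2 * (p - y) * x for p, y, x in zip(preds, ys, xs)) / n
    db = sum(2 * (p - y) for p, y in zip(preds, ys)) / n

    # 4. Update the weight and bias in the direction that reduces the loss
    w -= lr * dw
    b -= lr * db
    # 5. The loop repeats steps 2-4 until the loss is minimized

print(round(w, 2), round(b, 2))  # converges near w = 2, b = 1
```

A real network stacks many such units with nonlinear activations, and backpropagation automates the chain rule across all of them, which is exactly what micrograd implements.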

The spelled-out intro to neural networks and backpropagation: building micrograd

