Marcel Marais’ Post

PyTorch can be extremely intimidating. It simultaneously allows for both low-level and high-level operations.

The learning curve for PyTorch is steep primarily because you need to know AI fundamentals like:
- Neural network architectures
- Learning algorithms
- Linear algebra and matrix operations

While also knowing:
- PyTorch-specific concepts, like its implementation of dynamic computation graphs and tensor operations
- GPU acceleration and parallel computing
- Memory management (for large datasets)

For someone just starting out, it can be pretty intense. That's why it's nice to see frameworks like TinyGrad that implement the basics really cleanly, so you can focus on learning. All you really need is an autograd/tensor library and an optimizer.
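To make that last point concrete, here is a minimal sketch of that recipe in PyTorch: just tensors, autograd, and an optimizer fitting a toy linear model. The data and hyperparameters are illustrative, not from the post.

```python
import torch

# Toy data: y = 3x + 1 with a little noise (illustrative values)
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 3 * x + 1 + 0.05 * torch.randn_like(x)

# Parameters tracked by autograd
w = torch.zeros(1, 1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

opt = torch.optim.SGD([w, b], lr=0.1)

for step in range(200):
    pred = x @ w + b                 # forward pass: plain tensor ops
    loss = ((pred - y) ** 2).mean()  # explicit loss
    opt.zero_grad()
    loss.backward()                  # autograd computes gradients
    opt.step()                       # optimizer updates the parameters
```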

PyTorch does NOT do GPU acceleration...

Robert F. Dickerson

Machine Learning Engineer

5mo

I love the level of abstraction of PyTorch. I love being able to imperatively define the forward pass myself, while sort of declaratively (even functionally) defining the architecture. I also like to build the training loop myself and carefully define my loss function explicitly. Keras did too much for me. Raw TensorFlow, on the other hand, was too low-level and had so many quirks, especially TensorFlow 1.
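For illustration, a rough PyTorch sketch of the style described above: the architecture is declared as modules, the forward pass is written imperatively, and the training loop and loss are spelled out by hand. The layer sizes, optimizer choice, and `dataloader` are placeholders.

```python
import torch
from torch import nn

# Architecture declared as modules; the forward pass is written imperatively.
class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 10)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = MLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()  # loss chosen and applied explicitly

# Hand-written training loop; dataloader is assumed to yield (inputs, labels) batches
def train_epoch(dataloader):
    for inputs, labels in dataloader:
        logits = model(inputs)
        loss = loss_fn(logits, labels)
        opt.zero_grad()
        loss.backward()
        opt.step()
```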

Luís Felipe Oliveira de Melo

M.Sc. AI Engineer | Machine Learning | Deep Learning | Data Science | Data | PyTorch | ONNX | Computer Vision | C++ | Python

5mo

Lightning AI also tries to simplify the overall training pipeline, even wrapping some fairly advanced concepts (like distributed learning using PyTorch's DDP). It's interesting for newcomers to have something solid up and running so they can understand the basics of an architecture without needing to understand every element of a deep learning environment.
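For context, a minimal sketch of what that wrapping can look like, assuming a recent pytorch_lightning version: the training loop, device placement, and DDP live in the `Trainer` rather than in user code. The model, sizes, and flags here are illustrative.

```python
import torch
from torch import nn
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self.net(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# The Trainer owns the loop; distributed training becomes a configuration flag.
trainer = pl.Trainer(max_epochs=5, strategy="ddp", devices=2)
# trainer.fit(LitClassifier(), train_dataloaders=...)  # dataloader omitted here
```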

Michael Schmidt Nissen

Software developer | Cluster admin

5mo

I suppose it is a matter of preference. Personally, I find it much more insightful to implement everything from scratch, and I would recommend that any beginner do the same. I would also recommend that beginners ditch back-propagation entirely and focus on neuroevolution instead, which is a much more intuitive and straightforward way to train NNs.
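As a toy illustration of that idea (not any particular algorithm), here is a gradient-free sketch that evolves the two parameters of a linear model by mutation and selection; the population size and mutation scale are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: fit y = 3x + 1 with a single linear "neuron", no gradients involved.
x = np.linspace(-1, 1, 64)
y = 3 * x + 1

def fitness(params):
    w, b = params
    return -np.mean((w * x + b - y) ** 2)  # higher is better

# Simple (1+lambda)-style evolution: mutate the best individual each generation.
best = rng.normal(size=2)
for gen in range(200):
    offspring = best + 0.1 * rng.normal(size=(32, 2))  # Gaussian mutations
    candidates = np.vstack([best, offspring])
    best = max(candidates, key=fitness)                # keep the fittest
```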

Varun Sakunia

Python || Machine Learning || Deep Learning || Computer Vision || AI

5mo
Aliyan Anwar

Open to Cofounders for travel ecom n AI x [email protected] x FinML x Successfully transformed Data into strategy in finance and social media ranking

5mo

I agree! I started neural networks with TensorFlow, activation functions, and partial derivatives to optimize the cost function, which is confusing for newbies. But for enthusiasts reaching an intermediate level, the main concern should be getting practical outcomes from ML models and denoising the seasonality to solve the underlying problem.
