Just completed the DeepLearning.AI Natural Language Processing Specialization. While I appreciate the review of foundational knowledge from exploring historical approaches like bag-of-words (BOW), HMMs, and LSTMs, I would rather have spent more time on cutting-edge advances in NLP, particularly delving deeper into the capabilities of Hugging Face Transformers. These certifications are fun, but I always come away a bit let down: they are either too focused on building everything from scratch, not practice-oriented enough, or not centered on the current state of the art. What are your thoughts? #LLM #NLP #deeplearning
You gotta dig into GitHub and arXiv to really stay on the edge...
AI/ML Researcher | Astrophysics
Congrats on your certification. At a time when there's a lot of hype about LLMs and AI in general, I have the feeling most people are in a hurry to learn the use cases, applications, and cutting-edge advancements without making any effort to understand what's happening under the hood. I find understanding the basics very important for a budding data scientist (which I think you do too), and I would rather spend some time on those fundamentals, which DeepLearning.AI and Coursera have covered well. Having said all that, the field is evolving so fast that by the time these certifications are out, they're almost obsolete. I couldn't agree more that some online courses promise heaven and don't live up to the hype. DeepLearning.AI offers some free short courses on the current state of the art; you should definitely check those out.