Want to be a Developer Advocate or Technical Evangelist? So you've decided to dive deep into the technical side. You're geared up to work closely with developers building cutting-edge apps around the world. How do you stay in touch with what they're trying to do, and how do you become the bridge between developers and customers? Joining communities where developers discuss their pain points, especially in the AI, LLM, and GenAI domains, is a great way to stay informed and collaborate more effectively. Here are some of the best forums and communities you can consider joining:
Stack Overflow: For technical Q&A and real-world problem-solving.
Reddit (r/MachineLearning, r/ArtificialIntelligence, r/LanguageTechnology): For discussions and industry updates.
GitHub: For project collaboration, code reviews, and issue tracking.
Kaggle: For data science competitions, discussions, and sharing solutions.
Hugging Face Forums: For specialized NLP and machine learning conversations.
AI Alignment Forum: For discussions on aligning AI systems with human values.
OpenAI Community: For insights and discussions related to OpenAI's tools and technologies.
LinkedIn Groups: For professional networking and industry-specific discussions.
Data Science Central: For articles, tutorials, and community discussions.
ResearchGate: For academic publications and research-focused discussions.
Ananda Gouda’s Post
More Relevant Posts
-
🔥 GraphRAG: a graph-based RAG framework introduced by Microsoft that aims to enhance traditional RAG by constructing a dynamic knowledge graph from a given dataset.
👉 The graph serves as a structured representation of the information, capturing not only the data itself but also the relationships and context between different pieces of information.
👉 It is built from raw text, capturing the essential entities, relationships, and key claims within the data.
👉 However, this added depth comes with higher computational cost: GraphRAG requires multiple calls to large language models (LLMs), leading to increased token usage.
Learn more about the use cases and limitations of this framework in the community write-up on GraphRAG by Akash Desai: https://lnkd.in/gSvCnsVB
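To make the idea concrete, here is a minimal conceptual sketch, assuming a hypothetical llm_extract_triples() helper that wraps an LLM call. This is not the Microsoft graphrag package's API, just an illustration of building a knowledge graph from text and pulling graph-grounded context back out for retrieval.

```python
# Illustrative sketch of the GraphRAG idea, NOT the Microsoft graphrag package API.
# llm_extract_triples() is a hypothetical helper standing in for an LLM call that
# returns (subject, relation, object) triples for a chunk of text.
import networkx as nx

def llm_extract_triples(chunk: str) -> list[tuple[str, str, str]]:
    """Placeholder: in practice this would prompt an LLM to return triples."""
    raise NotImplementedError("Wire up your LLM of choice here.")

def build_knowledge_graph(chunks: list[str]) -> nx.MultiDiGraph:
    graph = nx.MultiDiGraph()
    for chunk in chunks:
        for subj, rel, obj in llm_extract_triples(chunk):
            graph.add_edge(subj, obj, relation=rel, source=chunk)  # keep provenance
    return graph

def graph_context(graph: nx.MultiDiGraph, entity: str, hops: int = 1) -> list[str]:
    """Collect facts around an entity to feed back to an LLM as grounded context."""
    nearby = nx.ego_graph(graph.to_undirected(as_view=True), entity, radius=hops).nodes
    facts = []
    for u, v, data in graph.edges(data=True):
        if u in nearby and v in nearby:
            facts.append(f"{u} --{data['relation']}--> {v}")
    return facts
```

Note that every chunk needs its own extraction call, which is exactly where the extra token usage mentioned above comes from.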
-
Databricks announced its acquisition of the team behind Einblick, the natural language experts providing an AI-native data notebook that goes beyond writing and fixing code. Einblick’s CEO and co-founder Emanuel Zgraggen said, “Today marks a significant milestone as we join forces with Databricks to further our mission. I am thrilled to embark on this new journey with a company whose mission is to simplify and democratize data and AI." Click below to read more about the acquisition and what’s next for Databricks. Databricks is a holding in The Private Shares Fund as of 12/31/23. To view the full portfolio and to learn more about The Private Shares Fund, please visit https://lnkd.in/gTeAzvq8 #AI #naturallanguage #AInative
The Einblick team joins Databricks! Founded by researchers from MIT and Brown University, Einblick provides an AI-native collaboration platform that helps users solve data problems with just one sentence. Their innovations are a perfect complement to the Databricks Data Intelligence Platform. We’re excited to further our mission of using natural language to bring insights to users by incorporating the AI-native approach of Emanuel Zgraggen, Philipp Eichmann and the Einblick team. https://dbricks.co/48UydrW
Welcome to the Data Intelligence Platform: Databricks + Einblick
databricks.com
-
So excited about this partnership: Meta's Llama 3.1 series of models is released on Databricks and available to customers today. These models expand context length to 128K, add multilingual capabilities, and include Llama 3.1 405B — the largest and highest-quality openly available foundation model. With this release, customers can serve and fine-tune the Llama 3.1 models, connect them to our RAG and agentic systems, easily generate synthetic data to better customize their models, and leverage them for scalable evaluation. Databricks has integrated Llama 3.1 throughout our Data Intelligence Platform.
A New Standard in Open Source AI: Meta Llama 3.1 on Databricks
databricks.com
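For readers who want to try the models right away, here is a minimal sketch of querying a served Llama 3.1 model from Python. Databricks' Foundation Model APIs expose OpenAI-compatible endpoints, but the workspace URL, token, and exact endpoint name below are placeholder assumptions; check your workspace's Serving page for the real values.

```python
# Sketch: query a served Llama 3.1 model through an OpenAI-compatible endpoint.
# The base_url, token, and model name are placeholders; verify them in your
# Databricks workspace (Serving > Foundation Model APIs) before running.
from openai import OpenAI

client = OpenAI(
    api_key="<DATABRICKS_TOKEN>",                            # personal access token
    base_url="https://<workspace-host>/serving-endpoints",   # workspace serving URL
)

response = client.chat.completions.create(
    model="databricks-meta-llama-3-1-405b-instruct",         # assumed endpoint name
    messages=[
        {"role": "system", "content": "You are a concise data assistant."},
        {"role": "user", "content": "Summarize what a 128K context window enables."},
    ],
    max_tokens=256,
)
print(response.choices[0].message.content)
```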
-
🚀 Thrilled to share my latest Generative AI project! 🚀 In today’s fast-paced industry, many professionals face challenges when interacting with databases due to limited knowledge of SQL. To solve this, I’ve developed an AI-powered tool that transforms natural language into SQL queries, making it effortless for users to retrieve data from databases without needing any SQL expertise! ✨ Using the powerful Google Gemini model, this solution bridges the gap between technical complexity and everyday accessibility. Now, you can simply ask a question in plain English, and the AI does the heavy lifting, generating the SQL query and returning the data—no coding required! This project is a step towards making data access more intuitive, empowering non-technical users to work with databases like a pro. 🔥 Check out the code on GitHub: https://lnkd.in/g3wrT3Rx Excited to explore more ways AI can simplify complex processes and boost productivity across industries! 🌟
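The author's full implementation is in the linked repo; as a rough sketch of the general pattern (ask the model for SQL, then execute it), something like the following could work. The model name, table schema, prompt, and helper names here are illustrative assumptions, not the author's code.

```python
# Minimal natural-language-to-SQL sketch; not the code from the linked repo.
# Assumes GOOGLE_API_KEY is set and a local SQLite database with an `employees` table.
import os
import sqlite3
import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")  # assumed model name

SCHEMA = "employees(id INTEGER, name TEXT, department TEXT, salary REAL)"

def question_to_sql(question: str) -> str:
    prompt = (
        f"Given the table {SCHEMA}, write a single SQLite SELECT statement that "
        f"answers: {question}. Return only the SQL, with no explanation."
    )
    return model.generate_content(prompt).text.strip().strip("`")

def run_query(question: str, db_path: str = "company.db"):
    sql = question_to_sql(question)
    with sqlite3.connect(db_path) as conn:
        return sql, conn.execute(sql).fetchall()

if __name__ == "__main__":
    sql, rows = run_query("Who is the highest paid person in each department?")
    print(sql, rows, sep="\n")
```

In practice you would also validate or sandbox the generated SQL before running it against anything other than a read-only or toy database.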
-
✨ This Week from Artium’s Slack ✨
Introducing Llama 3.1: Our most capable models to date (https://lnkd.in/g-3nFYeF) “Llama 3.1 update is out. New 405 billion parameter model. 128K context window by default.” - Randy Lutcavich, Principal Software Engineer
OpenAI hits Google where it hurts with new SearchGPT prototype (https://lnkd.in/dexz5ZpD) “OpenAI enters the search game.” - cauri jaye, Director of AI
Open Source AI Is the Path Forward (https://lnkd.in/g9JHmiH6) “Great blog post from Zuckerberg on the importance of open source models.” - Ross Hale, CEO
AI Assistant Manager (https://lnkd.in/e2EsS8bd) “Moved some code to PyPI that I have been sharing between various OpenAI Assistants I have.” - Justin Beall, Software Engineer
Short Course - Federated Learning (https://lnkd.in/eh4wUNc5) "Great course on federated learning. Federated learning allows a single model to be trained across multiple devices, such as phones, or multiple organizations, such as hospitals, without the need to share data with a central server. For enterprises concerned with data and privacy, we can use federated learning to train a variety of models, ranging from speech and vision models to LLMs, across distributed data while offering data privacy options to users." - cauri jaye, Director of AI (a toy sketch of the idea follows below)
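To illustrate the federated-learning idea from that last item, here is a toy federated-averaging (FedAvg) sketch in NumPy, assuming a simple linear-regression model. It is a conceptual sketch, not code from the course: each client trains locally and only the weights, never the raw data, are sent back and averaged.

```python
# Toy FedAvg sketch: each client fits a linear model on its own data and only
# the learned weights (not the data) are shared and averaged. Conceptual only.
import numpy as np

def local_train(weights, X, y, lr=0.1, epochs=20):
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(global_w, client_data, rounds=5):
    for _ in range(rounds):
        # Each client trains locally; only weight vectors leave the client.
        local_ws = [local_train(global_w, X, y) for X, y in client_data]
        sizes = np.array([len(y) for _, y in client_data], dtype=float)
        global_w = np.average(local_ws, axis=0, weights=sizes)  # size-weighted mean
    return global_w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # three "hospitals", each holding private data
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

print(federated_average(np.zeros(2), clients))  # converges toward true_w
```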
-
Hey everyone! After some months of work, coffee, and a bit of AI magic, I wanted to introduce my newest project: the Data Science Assistant Chatbot. This isn't just any chatbot; it's your personal data science buddy, powered by OpenAI's GPT-3.5. 🤖
Why I built it: the idea sparked from my own late-night data science struggles. I wished for a friend who could instantly answer my burning questions without scrolling through endless forums. So I decided to build one!
What's cool about it:
Ask anything data science: from "What's a p-value?" to "How do I train a neural network?", it's got you covered.
Smart & conversational: it's like texting a friend who happens to be really good at data science.
Easy-peasy to use: a simple interface built with Gradio that feels welcoming, because let's face it, data science can be daunting enough.
Let's make it even better together: I'm proud of where it's at, but even more excited about where it could go with your input. Try it out, throw tricky questions at it, and let me know what you think! Curious? I'm leaving the code open-sourced in case you want to test it! #DataScience #AI #MachineLearning #OpenAI #TechCommunity https://lnkd.in/dVfqyB9H
GitHub - A1fred00-datascience/Data_Science_Chatbot: The Data Science Assistant Chatbot is an AI-driven interactive application designed to provide instant, relevant responses to various data science-related queries.
github.com
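For anyone curious about the general shape of such a project, a minimal sketch of the same pattern (a Gradio chat UI backed by OpenAI's chat completions API) might look like the following. The system prompt and model choice are assumptions rather than the repo's exact code.

```python
# Minimal data-science chat assistant sketch: Gradio UI + OpenAI chat completions.
# Not the linked project's code; assumes OPENAI_API_KEY is set in the environment.
import gradio as gr
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY

SYSTEM = "You are a friendly assistant who answers data science questions clearly."

def respond(message, history):
    # With the default settings, gr.ChatInterface passes history as (user, bot) pairs.
    messages = [{"role": "system", "content": SYSTEM}]
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    reply = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    return reply.choices[0].message.content

gr.ChatInterface(respond, title="Data Science Assistant").launch()
```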
-
Got it. Let's create an engaging tweet for the "AI Related" article using the provided data context.
Step-by-step process:
1. Understand the content: the article is about PgQueuer, which transforms PostgreSQL into a job queue.
2. Target audience: developers and tech enthusiasts who are interested in AI and database management.
3. Emotional appeal: highlight the excitement of transforming a regular database into something more powerful, and use motivational language to inspire action.
4. Engagement strategy: use an emoji to catch the eye, and ask a question or give a call to action to encourage interaction.
Tweet: "🚀 Ready to revolutionize your PostgreSQL? With PgQueuer, turn it into a powerful job queue and supercharge your workflows! 💪 Dive into the future of database management today! 🌟"
This tweet is 194 characters long, including spaces and emojis, and is designed to be shareable and engaging.
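For context on the article's subject, here is a hedged sketch of the general Postgres-as-a-job-queue pattern (SELECT ... FOR UPDATE SKIP LOCKED). This is the underlying idea, not PgQueuer's actual API; the table name and columns are illustrative assumptions.

```python
# Generic Postgres job-queue pattern, NOT PgQueuer's API. Table/columns are assumed.
# Requires: pip install psycopg2-binary, and a reachable PostgreSQL instance.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS jobs (
    id      BIGSERIAL PRIMARY KEY,
    payload TEXT NOT NULL,
    status  TEXT NOT NULL DEFAULT 'queued'
);
"""

def claim_next_job(conn):
    """Atomically claim one queued job; SKIP LOCKED lets many workers run safely."""
    with conn.cursor() as cur:
        cur.execute(
            """
            UPDATE jobs SET status = 'running'
            WHERE id = (
                SELECT id FROM jobs
                WHERE status = 'queued'
                ORDER BY id
                LIMIT 1
                FOR UPDATE SKIP LOCKED
            )
            RETURNING id, payload;
            """
        )
        row = cur.fetchone()
    conn.commit()
    return row  # None when the queue is empty

conn = psycopg2.connect("dbname=app user=app password=secret host=localhost")
with conn.cursor() as cur:
    cur.execute(DDL)
    cur.execute("INSERT INTO jobs (payload) VALUES (%s)", ("send_welcome_email",))
conn.commit()
print(claim_next_job(conn))
```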
-
🚀 Unlocking Insights with Azure Text Analytics! 🚀
I've been exploring how the Microsoft Azure Text Analytics API can help businesses analyze customer reviews and uncover key insights. Here's what it can do:
Detect language: automatically identifies the language of the text.
Extract key phrases: highlights the most important topics.
Recognize entities: identifies key people, places, or organizations mentioned.
Linked entities: connects entities to external resources.
By processing customer reviews, businesses can gain valuable insights into sentiment, trends, and opportunities for improvement. AI is transforming how we understand and act on unstructured data! 💡
Here's the code: https://lnkd.in/gdeKWhhy
#AI #Azure #TextAnalytics #MachineLearning #Python #Tech #Innovation
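The full code is in the linked repo; as a rough sketch of the four capabilities above using the azure-ai-textanalytics Python package, the calls might look like this (the endpoint, key, and sample review are placeholders).

```python
# Sketch of the four Azure Text Analytics calls mentioned above.
# Endpoint and key are placeholders; pip install azure-ai-textanalytics first.
from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

client = TextAnalyticsClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com/",
    credential=AzureKeyCredential("<your-key>"),
)

reviews = ["The staff at the Seattle store were great, but checkout was painfully slow."]

language = client.detect_language(reviews)[0]
print("Language:", language.primary_language.name)

phrases = client.extract_key_phrases(reviews)[0]
print("Key phrases:", phrases.key_phrases)

entities = client.recognize_entities(reviews)[0]
print("Entities:", [(e.text, e.category) for e in entities.entities])

linked = client.recognize_linked_entities(reviews)[0]
print("Linked entities:", [(e.name, e.url) for e in linked.entities])
```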
-
Hey everyone! 👋 Just wanted to share some exciting news. As someone who's been leveraging Databricks in my projects for a few years, I've seen firsthand how the right tools can skyrocket our productivity. 🚀 Now there's a new player on the block: the DBRX model. I've been digging into its test results, and they're impressive, especially in the programming section! Compared to the other tools we've been using, DBRX stands out. Yet here's the big question: how does it stack up against Microsoft Copilot? 🤔 With AI tech evolving daily, I'm curious to see how these two compare. It's all about finding that edge, right? Incorporating tools like DBRX could be a game-changer for companies looking to boost their productivity. And who doesn't want that? So let's keep an eye on this space. Who knows? We might just be witnessing the next big leap in tech that'll make our work lives a whole lot easier. #TechNews #DBRX #Databricks #ProductivityTools #MicrosoftCopilot
Announcing DBRX: A new standard for efficient open source LLMs
databricks.com
Co-Founder of Altrosyn and Director at CDTECH | Inventor | Manufacturer
Engaging with developer communities is pivotal for fostering collaboration and staying abreast of emerging trends, akin to historical gatherings where intellectuals exchanged ideas. However, with the vast array of forums available today, how do you discern which platforms offer the most valuable insights and discussions pertinent to your role as a Developer Advocate or Technical Evangelist, especially in the dynamic realms of AI, LLMs, and GenAI?