🎉 Exciting Update! 🎉 I'm thrilled to share that my latest code change has been merged into the main branch! 🚀 I fixed a critical issue in the _process_message function where the SQL type check wasn't implemented correctly; the function now handles SQL responses safely and queries are processed accurately.
Issue: The _process_message function checked for the SQL type in the response incorrectly, which raised errors during query processing and broke the handling of SQL responses.
Solution: I changed the type check to use the get method to safely access the type key in the response dictionary, so the function correctly identifies and processes SQL responses. With this change the function works reliably, improving the accuracy of query handling.
This is my second open source contribution, and I'm incredibly proud of the milestone. Every project brings new challenges and learning opportunities, and I'm excited to keep contributing to the community! #CodeUpdate #Programming #TechInnovations #Snowflake #ProudMoment #OpenSource #CodingLife #SoftwareDevelopment
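For anyone curious what the fix looks like in practice, here is a minimal sketch. The post doesn't show the real _process_message, so the response layout, keys, and surrounding logic below are hypothetical stand-ins that only illustrate the dict.get pattern described above.

```python
# Hypothetical sketch of the .get()-based type check; the actual
# _process_message signature and response structure may differ.
def _process_message(response: dict) -> str:
    # Before the fix, code along the lines of response["type"] == "sql"
    # would raise a KeyError whenever the "type" key was missing.
    if response.get("type") == "sql":
        # Read the statement safely as well, falling back to an empty string.
        sql_text = response.get("sql", "")
        return f"Running SQL: {sql_text}"
    # Non-SQL responses fall through to plain-text handling.
    return response.get("text", "")


print(_process_message({"type": "sql", "sql": "SELECT 1"}))  # Running SQL: SELECT 1
print(_process_message({"text": "hello"}))                   # hello
```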
Govind Hede’s Post
More Relevant Posts
🚀 Excited to announce that I have successfully completed the Snowflake Data Applications Hands-On Essentials Workshop! 💼 During this comprehensive workshop, I honed my skills in creating intuitive UI entry forms in Streamlit to efficiently collect and manage data, and I wrote Python scripts to insert the collected data into Snowflake tables while maintaining data integrity and accuracy. I also used GitHub to collaborate on, edit, and manage code, made REST API calls to gather data from diverse sources, and developed Python scripts to retrieve and analyze data stored in Snowflake tables, unlocking valuable insights for decision-making. Equipped with these essential skills, I'm ready to drive impactful data-driven solutions and contribute effectively to complex data application projects. 📊💡 Looking forward to applying these competencies to real-world challenges and driving innovation in data applications! 🌟 #Snowflake #DataApplications #Streamlit #Python #GitHub #Certificate
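To illustrate the Streamlit-to-Snowflake flow this workshop teaches, here is a minimal sketch (not the workshop's own solution). The table name, columns, and connection parameters are placeholders, and it assumes the streamlit and snowflake-connector-python packages are installed.

```python
# Sketch: a Streamlit entry form whose submissions are inserted into a
# Snowflake table. Table/column names and credentials are placeholders.
import streamlit as st
import snowflake.connector

with st.form("fruit_form"):
    name = st.text_input("Fruit name")
    source = st.text_input("Source of the fruit information")
    submitted = st.form_submit_button("Submit")

if submitted:
    conn = snowflake.connector.connect(
        account="MY_ACCOUNT", user="MY_USER", password="MY_PASSWORD",
        warehouse="COMPUTE_WH", database="MY_DB", schema="PUBLIC",
    )
    try:
        # Parameter binding keeps the insert safe from injection.
        conn.cursor().execute(
            "INSERT INTO fruit_load_list (fruit_name, source) VALUES (%s, %s)",
            (name, source),
        )
        st.success(f"Added {name} to the load list.")
    finally:
        conn.close()
```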
🚀 Exciting News from the Snowflake Summit! 🚀 Snowflake has just launched a new feature: the Snowpark pandas API, allowing you to run your pandas code directly on Snowflake! Enjoy the same pandas experience with the added scalability and security of Snowflake. Here’s why this is a game-changer:
1) Seamless Integration: Just change the import statement and a few lines of code to lift and shift your pandas code to Snowflake.
2) Scalability: Work with much larger datasets without porting to other big data frameworks or using expensive machines.
3) Performance: Runs workloads natively in Snowflake through transpilation to SQL, benefiting from parallelization.
4) Familiar Interface: Offers a pandas-compatible layer that feels natural to Python developers.
5) Security & Governance: Data stays within Snowflake’s secure platform, ensuring consistent access and easier auditing.
6) No Extra Infrastructure: Leverages the Snowflake engine, eliminating the need for additional compute setup and management.
Check out the docs here - https://lnkd.in/gPWxZvTP
Move from prototype to production seamlessly and harness the power of Snowflake with your favorite pandas workflows! 🐼✨ #DataScience #BigData #Python #Pandas #Snowflake #DataEngineering #DataSecurity #SnowflakeSummit #KipiAtSnowflakeSummit #KipiXSnowflake
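A rough sketch of the "change the import statement" idea, based on the Snowpark pandas docs linked above. Connection parameters and the ORDERS table are placeholders, and exact module paths may vary by package version.

```python
# Sketch of lifting existing pandas code onto Snowflake via the Snowpark
# pandas API. Connection parameters and table name are placeholders.
import modin.pandas as pd                  # drop-in pandas namespace
import snowflake.snowpark.modin.plugin     # registers the Snowflake backend
from snowflake.snowpark import Session

session = Session.builder.configs({
    "account": "MY_ACCOUNT", "user": "MY_USER", "password": "MY_PASSWORD",
    "warehouse": "COMPUTE_WH", "database": "MY_DB", "schema": "PUBLIC",
}).create()

# The familiar pandas API, executed as SQL inside Snowflake.
df = pd.read_snowflake("ORDERS")
daily = df.groupby("ORDER_DATE")["AMOUNT"].sum()
print(daily.head())
```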
🎉 Thrilled to share that I’ve completed the Snowflake Hands-On Essentials: Data Applications course! 🚀 This experience deepened my skills in creating entry points into Snowflake, querying data, building views and indexes, and even writing Python stored procedures and user-defined functions. I also learned how to integrate external data sources with Snowflake, enhancing my ability to work on advanced data applications. Excited to apply these new skills in future projects! 💻📊 #Snowflake ❄️ | #DataApplications | #Python 🐍 | #DataEngineering | #LearningJourney 📚
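Since the course covers Python stored procedures and user-defined functions, here is a small sketch of registering a UDF with Snowpark Python. The function, object names, and connection parameters are illustrative assumptions, not course material.

```python
# Sketch: registering a simple temporary Python UDF with Snowpark.
# Connection parameters and object names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import udf, col
from snowflake.snowpark.types import StringType

session = Session.builder.configs({
    "account": "MY_ACCOUNT", "user": "MY_USER", "password": "MY_PASSWORD",
    "warehouse": "COMPUTE_WH", "database": "MY_DB", "schema": "PUBLIC",
}).create()

# A session-scoped UDF that normalizes country codes.
clean_country = udf(
    lambda c: (c or "").strip().upper(),
    return_type=StringType(),
    input_types=[StringType()],
    name="clean_country",
)

df = session.table("CUSTOMERS")
df.select(clean_country(col("COUNTRY")).alias("COUNTRY_CODE")).show()
```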
I completed a Snowflake quickstart on Data Engineering Pipeline with pandas on Snowflake! Through this quickstart, I was able to showcase how to use Snowflake Notebooks, pandas on Snowflake, Git integration, and Serverless Tasks to build an end-to-end data engineering pipeline.
What I learned:
✍ How to build a Python data pipeline with Snowflake Notebooks
✍ How to aggregate and transform data to create new features with pandas on Snowflake
✍ How to version control and collaborate with Git integration
✍ How to create a Serverless Task to schedule the feature engineering
#datascience #dataengineer #Snowflake #DataEngineeringTools
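As a rough illustration of the aggregate-and-transform step (not the quickstart's actual code), here is a sketch of feature engineering with pandas on Snowflake inside a Notebook. Table and column names are placeholders, and since Snowpark pandas implements a subset of the pandas API, the specific operations used here are assumptions.

```python
# Sketch of a feature-engineering step with pandas on Snowflake,
# written as if running inside a Snowflake Notebook.
import modin.pandas as pd
import snowflake.snowpark.modin.plugin
from snowflake.snowpark.context import get_active_session

session = get_active_session()  # Notebooks already have an active session

orders = pd.read_snowflake("RAW_ORDERS")

# Aggregate raw orders into per-customer features.
features = (
    orders.groupby("CUSTOMER_ID")
    .agg({"AMOUNT": "sum", "ORDER_ID": "count"})
    .reset_index()
    .rename(columns={"AMOUNT": "TOTAL_SPEND", "ORDER_ID": "ORDER_COUNT"})
)
features["AVG_ORDER_VALUE"] = features["TOTAL_SPEND"] / features["ORDER_COUNT"]

# Persist the feature table back to Snowflake.
features.to_snowflake("CUSTOMER_FEATURES", if_exists="replace", index=False)
```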
This week’s data engineering tip is dedicated to the Snowflake command line interface - #Snowflake CLI. 🧑💻 ❄️ Snowflake CLI is another building block for managing a #DevOps pipeline in Snowflake. Today it offers plenty of features for interacting with Snowflake from the command line and automating workflows like building CI/CD, or deploying Snowpark code and apps (native, Streamlit). Originally an open source tool built by the community, it is now an official Snowflake product that keeps its open source nature. The CLI is also extensible, so if you ever miss some functionality, you can add it yourself! 😍 I have been using the CLI to deploy my Python UDFs into Snowflake, and it has been super neat! I can develop the handler code in my local editor (VS Code), test it, fine-tune it, and when it’s ready, bundle everything together with a few CLI commands, including automatically added dependencies from requirements.txt, and deploy it all into Snowflake. 💪 Nowadays, the CLI capabilities are much wider!
✅ Interact with Git repositories
✅ Manage Snowpark Containers
✅ Manage Snowpark Code
✅ Manage Streamlit & Native Apps
✅ Execute SQL code
✅ Manage Snowflake Objects
Have you tried the CLI? I'll share some links on how to get started in the comments. 👇 #data_superhero #snowflake_advocate
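To make the local-editor-to-Snowflake workflow concrete, here is a sketch of the kind of handler module one might develop and test locally before bundling and deploying it with the Snowflake CLI (for example via its snowpark build/deploy commands). The project layout, function name, and logic are illustrative assumptions, not the author's actual code.

```python
# app/functions.py -- sketch of a local UDF handler that the Snowflake CLI
# could bundle (together with requirements.txt) and deploy as a Python UDF.
# Function name and logic are illustrative placeholders.
import re


def normalize_phone(raw: str) -> str:
    """Strip everything but digits so phone numbers compare consistently."""
    if raw is None:
        return ""
    return re.sub(r"\D", "", raw)


if __name__ == "__main__":
    # Quick local test in the editor before deploying to Snowflake.
    assert normalize_phone("+1 (555) 010-9999") == "15550109999"
    print("handler looks good, ready to deploy")
```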
🌟 Leveling up my Snowflake skills! ❄️ 🌟 I’m thrilled to share that I’ve successfully completed the Hands-On Essentials Data Applications badge workshop! 🎓🏅 This journey has equipped me with the tools to create powerful data-driven applications using Streamlit, Python, REST APIs, and more. 🚀🔧
Key takeaways:
🛠️ Data Collection: Building intuitive UI entry forms in Streamlit
📊 Data Management: Inserting data into Snowflake tables
🔄 Code Collaboration: Leveraging GitHub for version control
🌐 API Integration: Making REST API calls to fetch data
🔍 Data Retrieval: Extracting valuable insights from Snowflake tables
Let’s connect and explore how we can harness the power of Snowflake together! 🌟💬 #Snowflake #DataScience #DataEngineering #DataAnalytics #DataDriven #TechCareer #Certified #Achievement #Learning #Growth #LinkedIn #Badge #Workshop #HandsOn #Essentials #DataApplications #Streamlit #Python #RESTAPI #GitHub #DataCollection #DataManagement #CodeCollaboration #APIIntegration #DataRetrieval
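As an illustration of the API-integration and data-management takeaways (not the workshop's own code), here is a sketch that pulls JSON from a REST endpoint and bulk-loads it into a Snowflake table with write_pandas. The endpoint URL, table name, and credentials are all placeholder assumptions.

```python
# Sketch: fetch rows from a REST API and load them into Snowflake.
# The endpoint, table name, and credentials are placeholders.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

resp = requests.get("https://api.example.com/v1/fruit", timeout=30)
resp.raise_for_status()
df = pd.DataFrame(resp.json())  # assumes the API returns a JSON array of objects

conn = snowflake.connector.connect(
    account="MY_ACCOUNT", user="MY_USER", password="MY_PASSWORD",
    warehouse="COMPUTE_WH", database="MY_DB", schema="PUBLIC",
)
try:
    # write_pandas bulk-loads the DataFrame into an existing table.
    success, _, nrows, _ = write_pandas(conn, df, table_name="FRUIT_API_RAW")
    print(f"loaded={success}, rows={nrows}")
finally:
    conn.close()
```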
I have successfully completed the Hands-On Essentials: Data Application Builders Workshop in Snowflake! 🚀 This workshop was like a full-stack developer boot camp, equipping me with various technologies to build robust applications using Snowflake as the back end. Key technologies and tools covered include:
✅ Streamlit (Python) for building interactive front-end applications
✅ SnowSQL for querying and managing Snowflake data
✅ REST APIs for seamless integration
...and much more! I'm excited to apply these skills to real-world projects and continue exploring the possibilities of Snowflake. #Snowflake #DataApplications #FullStackDevelopment #LearningJourney #DataEngineering
Hands-On Essentials: Data Application Builders Workshop • Ashish kumar Mangali • Snowflake Education Services
achieve.snowflake.com
Did you get your hands on Snowflake Notebooks yet? Experiment with your data, run end-to-end data engineering pipelines, train ML models, and build cool Streamlit apps all in one place. No more hassle of creating and managing Python environments, dependencies, and more!
Notebooks support:
- RBAC to manage access to underlying data assets
- Native Git integration to version control your code changes
- Snowflake Copilot integration
- One-click scheduling
How to get started:
📓 Getting started with your first Snowflake Notebook: https://okt.to/ZynYX8
📓 Create and manage Snowflake objects like a pro: https://okt.to/gGSZc1
📓 Data analysis and churn prediction using Notebook: https://okt.to/dOEpUu
📓 Data engineering pipelines using Snowpark: https://okt.to/NgMEJH
📓 End to end ML with Snowpark ML: https://okt.to/pBQbwk
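For a feel of what a Notebook cell looks like, here is a minimal sketch that grabs the active Snowpark session and charts a query result with Streamlit. The table and column names are placeholders; the quickstarts linked above are the authoritative walkthroughs.

```python
# Sketch of a Snowflake Notebook cell: no connection setup needed,
# the notebook already provides an active Snowpark session.
# Table and column names are placeholders.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()

# Pull a small aggregate into pandas for display.
daily = session.sql(
    "SELECT ORDER_DATE, SUM(AMOUNT) AS TOTAL "
    "FROM ORDERS GROUP BY ORDER_DATE ORDER BY ORDER_DATE"
).to_pandas()

st.line_chart(daily, x="ORDER_DATE", y="TOTAL")
```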
🛠️ All the tools you'll need to get started with Snowflake Notebooks. Experiment with your data, run end-to-end data engineering pipelines, train ML models, and build cool Streamlit apps all in one place. No more hassle of creating and managing Python environments, dependencies, and more!
Notebooks support:
- RBAC to manage access to underlying data assets
- Native Git integration to version control your code changes
- Snowflake Copilot integration
- One-click scheduling
How to get started:
📓 Getting started with your first Snowflake Notebook: https://lnkd.in/g7AaU9-T
📓 Create and manage Snowflake objects like a pro: https://lnkd.in/gcKKu7Fh
📓 Data analysis and churn prediction using Notebook: https://lnkd.in/ga_TTeEv
📓 Data engineering pipelines using Snowpark: https://lnkd.in/gyiMB8nG
📓 End to end ML with Snowpark ML: https://lnkd.in/gcQJNMEy
🚀 Mastered Data Applications: Snowflake Certification Unlocked! 🚀 I’m excited to share that I’ve successfully completed the "Hands-On Essentials: Data Applications" certification from Snowflake! This milestone reflects my growing expertise in developing data-driven solutions using cutting-edge tools and technologies. Throughout this journey, I’ve honed my skills in:
🔸 Streamlit: Crafting user-friendly UI forms to streamline data collection.
🔸 Python: Writing robust scripts to insert and retrieve data efficiently within Snowflake tables.
🔸 GitHub: Leveraging version control for seamless code management and collaboration.
🔸 REST APIs: Integrating external data sources by making precise API calls.
This experience has been incredibly fulfilling, fueling my passion for data engineering. I’m excited to apply these skills in future projects and continue expanding my knowledge in the field. 🌟
🔗 View My Certification: https://lnkd.in/dFbMnW7d
#Snowflake #DataApplications #Python #Streamlit #GitHub #DataEngineering #APIIntegration #100DaysOfBytewise #100DaysOfCode