📢 Attention Snowflake developers! There's a new kid in the Snowflake Cortex AI town - CLASSIFY_TEXT! 💡 You can use this new LLM-backed function right in your SQL queries to classify free-form text data into categories you provide. Test it and see if it works for your scenarios. ➡️ Make sure you read the docs for some guidance: https://lnkd.in/dzXsiTGn #Snowflake #SnowflakeCortexAI #LLM #GenAI #SQL
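For a quick taste, here is a minimal sketch of what a call can look like (the input text and categories are made up, and the linked docs have the authoritative signature):

SELECT SNOWFLAKE.CORTEX.CLASSIFY_TEXT(
    'My package never arrived and I want a refund',  -- free-form input text
    ['shipping', 'billing', 'product quality']       -- candidate categories you provide
) AS classification;
-- expect a result along the lines of {"label": "shipping"}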
-
Sometimes the best way to learn is by doing! 👩‍🏫 A few weeks ago, I was lucky enough to meet with one of Snowflake's Principal Software Engineers working on Cortex, Snowflake's ML service. I found the overview and demo so insightful that I set out to summarize what I learned. With a single call to a SQL function, Cortex makes ML look easy, but the best part of this overview is deep diving into the model to understand all the complexity that actually goes into a Cortex function like TOP_INSIGHTS! 🤯 Check it out here: https://lnkd.in/gjjAvrBw #snowflake #datascience
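For context, the contribution-explorer pattern behind TOP_INSIGHTS looks roughly like this — a sketch of the documented flow, with a hypothetical table and column names, so check the TOP_INSIGHTS docs for the exact arguments:

CREATE OR REPLACE SNOWFLAKE.ML.TOP_INSIGHTS my_insights();

CALL my_insights!GET_DRIVERS(
    INPUT_DATA     => TABLE(sales_metrics),   -- hypothetical input table
    LABEL_COLNAME  => 'is_current_period',    -- flags baseline vs. comparison rows
    METRIC_COLNAME => 'revenue'               -- metric whose change you want explained
);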
-
Ingesting data from your applications or databases (CDC) doesn't have to be painful. Kelly Kohlleffel shared the secrets to ingesting your data sources in a few minutes with Fivetran, straight to your lakehouse and dashboards! Getting the data to your lakehouse has always been complicated: CDC, incremental loads, initial setup, maintaining the code for all the different databases... But it doesn't have to be that hard! Fivetran makes this super easy. Kelly gives us a 101 introduction to Fivetran + Databricks, making it a no-brainer for you to get your data ready for more data analysis and AI. From operational databases to BI dashboards (Lakeview) in 5 minutes, it's here with Fivetran! https://lnkd.in/eUKZnYm8
Getting started with Fivetran and Databricks
https://www.youtube.com/
-
🤖📄🖼️ 𝗨𝗻𝗹𝗼𝗰𝗸 𝘁𝗵𝗲 𝗣𝗼𝘄𝗲𝗿 𝗼𝗳 𝗠𝘂𝗹𝘁𝗶𝗺𝗼𝗱𝗮𝗹 𝗠𝗼𝗱𝗲𝗹𝘀 𝘄𝗶𝘁𝗵𝗶𝗻 𝗦𝗻𝗼𝘄𝗳𝗹𝗮𝗸𝗲 Imagine having the ability to run open-source multimodal models securely within your Snowflake environment. Say goodbye to external data transfers and hello to data privacy and security! In my new blog article, I share how to host a quantized version of GLM-4V-9B, an open-source SOTA multimodal model, in Snowpark Container Services and build a Streamlit application that supports images and PDFs as model inputs. Of course, I also demonstrate how to use the model in Snowflake Notebooks and SQL (a SQL sketch follows below the links)! 📚 𝗨𝘀𝗲 𝗰𝗮𝘀𝗲𝘀: • Build chatbot applications that work with images and documents • Extract attributes from images • Explain graphs in natural language ❄️ 𝗪𝗵𝘆 𝗵𝗼𝘀𝘁 𝗠𝘂𝗹𝘁𝗶𝗺𝗼𝗱𝗮𝗹 𝗠𝗼𝗱𝗲𝗹𝘀 𝗶𝗻 𝗦𝗻𝗼𝘄𝗳𝗹𝗮𝗸𝗲? • Flexibility: Use any model, accept various inputs and outputs, and forget about API limits. • Data Privacy: Keep your data within Snowflake, maintaining role-based access controls. • Cost Efficiency: No API costs, just compute power expenses. 🔗 𝗦𝗼𝘂𝗿𝗰𝗲𝘀: • Blog: https://lnkd.in/g6cHi35P • Github: https://lnkd.in/g-_xZ7Wf • Demo Video: https://lnkd.in/gHvnyQWW What are the use cases you have in mind for multimodal models in Snowflake? #Snowflake #GenAI #Snowpark #Streamlit #OpenSource
Hey Snowflake, describe this image! Multimodal Models in Snowflake
https://www.youtube.com/
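For a flavor of the SQL piece: Snowpark Container Services lets you expose a service endpoint as a SQL function. The sketch below uses hypothetical names (function, service, endpoint, and path) and only loosely mirrors the blog:

-- Hypothetical service function wrapping the hosted GLM-4V-9B container
CREATE OR REPLACE FUNCTION describe_image(prompt VARCHAR, image_url VARCHAR)
    RETURNS VARCHAR
    SERVICE = glm4v_service   -- SPCS service running the quantized model
    ENDPOINT = api            -- endpoint name from the service specification
    AS '/describe';           -- HTTP path handled by the container

SELECT describe_image('Describe this image.', '@my_stage/chart.png');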
-
📖 Did you know that you can run SQL queries directly on your DataFrames without creating a temporary view in Spark? 🔖 Before Spark 3.4, if you wanted to use SQL with your DataFrames, you'd have to take an extra step:

df = spark.read.csv("/path/to/my/file.csv")
df.createOrReplaceTempView("my_table")
spark.sql("SELECT * FROM my_table").show()

🔖 But since Spark 3.4, you can streamline your code like this:

df = spark.read.csv("/path/to/my/file.csv")
spark.sql("SELECT * FROM {my_table}", my_table=df).show()

Just pass your DataFrame as a keyword argument to spark.sql() and reference it in the query with a {} placeholder, and you're all set. It's a small change that makes your code cleaner and saves time, especially when working with multiple DataFrames. #data #ai #dataengineering #databricks #spark #sql #bigdata
-
Tech companies can't exfiltrate their data fast enough to services like Snowflake and Databricks to get all those good ML and AI driven BI insights. And then the hacks happen (AT&T) Time to start using services that work with your data on-premises. The DBSnapper architecture is designed for exactly this purpose. #hack #dataprivacy #platformengineering #database
-
Life requires choices, and data architecture is certainly no exception. There is a very wide range of query engines out there, each with its own strengths, weaknesses, and community support. One recent live survey found usage spread smoothly across 10 different query engines: https://lnkd.in/gic8Tqzc You should always be able to choose the best query engine for the job at hand. Traditional data architecture approaches limited your choice of query engines based on the data warehouse, data lake, or data lakehouse in which your data was stored.
At one of my sessions at the Data + AI Summit last month, I ran a live poll asking the almost-full-house audience which query engines they were using. I obviously knew Apache Spark would dominate in the audience, but the additional diversity was exciting to see! Almost as many Snowflake users as Databricks users, and Trino was not far behind. When it comes to data query engines, ONE SIZE DOES NOT FIT ALL... Each of these compute engines is a specialized match for certain workload types, and this is why you see such prevalent diversity even in a room full of random engineers at the Data + AI Summit. <note for the nerds 🤓 : this isn't a scientific measurement of the industry or representative of the actual popularity of engines, just a fun live survey of a random community audience. yes, Databricks is Spark; guess what, Snowflake, EMR, Athena, and Fabric can also run Spark> #apachespark #databricks #snowflake #trino #apacheflink #bigquery #awsathena #awsemr #msfabric #dataengineering #datalakehouse
-
Text-to-SQL will be a solved problem. I've been using SnowflakeDB Copilot for the past two days. Snowflake Copilot is pretty fast, way faster than OpenAI. Text-to-SQL is likely to be one of the first challenges successfully tackled by large language models, and its direct integration into the data warehouse offers sheer convenience. I experimented with various types of queries, and Copilot handled easy to moderately complex queries brilliantly. It struggled with larger or more complex queries, but those are not the situations where I anticipate needing Copilot's assistance.
-
If you are coming from an analytics background and are new to Databricks, the SQL Editor is THE best place to start. 👉 Using SQL Serverless compute, the SQL Editor allows you to build SQL queries that are connected to Unity Catalog + the AI Assistant. 👉 A nice bonus? Great visualization capabilities for exploratory data analysis, as shown in this video. 👉 The SQL Editor + SQL Serverless compute + Unity Catalog are 3 of the 5 critical elements covered in the free tutorial I am planning, "Databricks: The Easy Way, For Non-Engineers". Until then, check out the SQL Editor yourself!
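If you want to try it, a first query in the SQL Editor is just standard SQL against Unity Catalog's three-level namespace — the catalog, schema, and table below are hypothetical:

SELECT order_date,
       SUM(amount) AS revenue   -- quick aggregate to feed the built-in visualizations
FROM main.sales.orders          -- catalog.schema.table, governed by Unity Catalog
GROUP BY order_date
ORDER BY order_date;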
-
Snowflake Cortex Fine-tuning is out in Public Preview!! ** What is it? ** The Snowflake Cortex Fine-Tuning feature in Snowsight gives you a way to customize large language models for your specific task. That means you can fine-tune models through Snowsight without writing any SQL. ** How? ** Find it at this link: https://lnkd.in/gWNnrKtn, and I hope to post a video later. A sketch of the companion SQL function follows below the docs link. #snowflake #infostrux #datasuperhero #snowflake_advocate
Fine-tuning (Snowflake Cortex)
docs.snowflake.com
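And for anyone who does want the SQL route, here is a minimal sketch of the companion FINETUNE function, assuming the 'CREATE' operation and a training query that returns prompt and completion columns (the model and table names are hypothetical; see the docs above for the exact signature):

SELECT SNOWFLAKE.CORTEX.FINETUNE(
    'CREATE',
    'my_tuned_model',   -- name for the resulting fine-tuned model
    'mistral-7b',       -- base model to customize
    'SELECT prompt, completion FROM my_training_examples'  -- training data query
);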