Amazing story of ThredUp building a sustainable, efficient data platform after years on a legacy cloud data warehouse and its accompanying catalog. I especially like the timeline they laid out for each technical decision in their architecture. A great example of building the speed needed to compete in a crowded space! #unitycatalog https://lnkd.in/gm7y6Rb5
Sheridan McDonald’s Post
More Relevant Posts
-
The lakehouse has quickly become the rising standard for data architecture. And Databricks, the inventor of the lakehouse, has been cited by Forrester as delivering a cutting-edge AI-enabled lakehouse platform at cloud scale. Get The Forrester Wave™: Data Lakehouses, Q2 2024 report to see why Databricks was named a Leader.
Databricks named a Leader in the 2024 Forrester Wave for Data Lakehouses
databricks.com
-
Data lakes represent the backbone of modern data storage, whether they’re used for analytics, #machinelearning, or #AI applications. In this blog post: https://okt.to/rFaq3m, Evan Smith unpacks the #datalake and compares it to other data technology, including databases, data warehouses, and data lakehouses. With the industry pivoting towards the next generation of open data lakehouses powered by Apache Iceberg, Delta Lake, and Hudi, understanding the power of data lakes built on cloud object storage is more important than ever. #opendatalakehouse
What is a data lake?
https://www.starburst.io
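The core idea is simpler than it sounds: a data lake is, at bottom, files in cheap object storage laid out by convention (partition directories), with schema applied on read. A minimal local sketch of that layout, with directories standing in for cloud object storage; the dataset name, partition key, and fields here are illustrative assumptions, not anything from the linked post:

```python
# Illustrative sketch: a data lake as files in object storage, organized
# as <dataset>/<partition>=<value>/part files, with schema-on-read.
import json
import os

def write_partition(root, dataset, date, records):
    # Write one newline-delimited JSON part file into a partition directory.
    path = os.path.join(root, dataset, f"date={date}")
    os.makedirs(path, exist_ok=True)
    with open(os.path.join(path, "part-0000.json"), "w") as f:
        for rec in records:
            f.write(json.dumps(rec) + "\n")
    return path

def read_dataset(root, dataset):
    # Schema-on-read: structure is interpreted only when the files are
    # scanned, not enforced at write time as in a warehouse.
    rows = []
    base = os.path.join(root, dataset)
    for dirpath, _, files in os.walk(base):
        for name in sorted(files):
            with open(os.path.join(dirpath, name)) as f:
                rows.extend(json.loads(line) for line in f)
    return rows
```

Table formats like Apache Iceberg, Delta Lake, and Hudi layer transactions and metadata on top of exactly this kind of file layout, which is what turns a lake into a lakehouse.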
-
Data is the fuel for AI. Modern data types such as real-time, unstructured, and user-generated data require different solutions, which AWS provides through services like S3, Athena, and SageMaker. #aws #awscloud #cloud #amazonmachinelearning #amazonsimplestorageservices3 #architecture #artificialintelligence #awsbigdata #bestpractices #database #generativeai #storage #bigdata #datamesh #letsarchitect #machinelearning
Let’s Architect! Modern data architectures
aws.amazon.com
-
"An increasing number of companies are operating multiple instances of single-tenant applications or expanding applications to such a scale that horizontal database-level sharding becomes essential. While there are various solutions to this complex issue—which we'll explore in future discussions—one pressing challenge remains consistent: the fragmentation of data across disparate systems." Read more in the link about how we've helped our clients save money by leveraging Azure Databricks instead of Data Factory for use cases like this one. #databricks #azure #reimaginedhealth
How to Load 12,000 Tables to Your Data Lake in 20 Minutes: Saving $100K per Year Moving from Azure Data Factory to Databricks
https://as-recon-webwp-prod.azurewebsites.net
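The pattern behind numbers like "12,000 tables in 20 minutes" is fan-out: submit one extract per table to a worker pool so loads run concurrently instead of serially. A minimal sketch of that orchestration, with a stub standing in for the real JDBC-read-and-write step; the table names and worker count are illustrative assumptions, not the article's actual configuration:

```python
# Hypothetical sketch of fan-out table loading. In a real Databricks job,
# load_table would read the table over JDBC and write it to the lake; the
# stub here just returns the table name as a success marker.
from concurrent.futures import ThreadPoolExecutor

def load_table(table: str) -> str:
    # Stand-in for: spark.read.jdbc(...) -> write to Delta in the lake.
    return table

def load_all(tables, workers=32):
    # Threads suffice because each load is I/O-bound from the driver's
    # perspective; the cluster does the heavy lifting. pool.map returns
    # results in submission order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(load_table, tables))
```

The worker count is the main tuning knob: too few and the cluster sits idle between extracts, too many and the source database becomes the bottleneck.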
-
Transitioning to #databricks has transformed our client's operations, with dramatically faster extract pipeline completion and substantial cost savings. The newest article from Reimagined Consulting walks you through the challenge, the available options, and how we saved a client more than $100,000 annually. It is a powerful approach for maximizing efficiency and savings in #datamanagement!
How to Load 12,000 Tables to Your Data Lake in 20 Minutes: Saving $100K per Year Moving from Azure Data Factory to Databricks
https://as-recon-webwp-prod.azurewebsites.net
-
"Why is Sigma so blazingly fast on my 40-billion-row table?" asks every prospective Sigma customer when we run proofs on their production cloud data warehouse. Enter Alpha Query and partial evaluation! Nerd out a little and discover how these techniques can transform your data analysis process. #DataAnalytics #Sigma #Snowflake #Databricks
Sigma’s Innovation: Alpha Query and Partial Evaluation Explained
sigmacomputing.com
-
🚀 Excited to share my latest article, "Breaking the monolith: Scalable transformations with Data Mesh and dbt". Whether you're a data engineer, architect, or just passionate about modern data strategies, I hope you'll find valuable insights in this piece💡. #dbt #datamesh #dataplatform #engineering #bigdata #cloud #datumo
💡New Blog Alert💡 Breaking the Monolith with Data Mesh and dbt! In our latest blog by Przemysław Sapkowski, we examine how Data Mesh and dbt are transforming data management by dismantling traditional silos and enabling scalable, decentralized data architectures built on shared foundations. 📈 Learn how to apply these principles in your project using the dbt-meshify CLI tool to foster innovation within your organization. 💡 Read the full article to learn more! 📌 https://lnkd.in/dVsE8PD5 #dbt #datamesh #dataplatform #engineering #bigdata #cloud #datumo
Breaking the monolith: Scalable transformations with Data Mesh and dbt
datumo.io
-
Snowflake is a unified, comprehensive, global, and highly available data cloud. Building a data cloud architecture capable of powering a diverse set of data and AI applications is an enormous technical challenge. This blog post presents the most important design elements of Snowflake's architecture and is the first in a series of posts sharing, in more detail, the technical challenges and novel solutions they've built to eliminate complexity for users and dramatically expand what they can do with their data. #fedhealthIT #snowflake https://lnkd.in/esTS9v_T Snowflake Public Sector | Adam Stillwell
Snowflake: Building a Data Cloud
medium.com
-
🚀 Understanding Data Lakes, Data Warehouses, and Data Lakehouses is critical for modern data management. Whether you’re navigating structured data, raw unstructured data, or seeking flexibility, each architecture has its role. Check out how cloud platforms like AWS, GCP, Azure, Databricks, and Snowflake offer solutions for each. If you're looking to optimize your data strategy and need expert guidance, Polar Packet is here to help orchestrate your journey from data storage to advanced analytics. Let us be your data partner! 🌐 #DataAnalytics #CloudSolutions #PolarPacket #DataLakes #DataWarehouses #Lakehouse #CloudComputing https://lnkd.in/gErdK65q
Data Lake, Data Warehouse, and Data Lakehouse: A Technical Comparison of Modern Data Solutions
polarpacket.com