🎄📈 Elevate Your PySpark Testing Game 🐍💡 Discover the power of dynamically loading unit tests in your PySpark notebooks. This lets you activate them on demand: run them during development or deployment, but keep them disabled in scheduled daily runs. Our latest tip breaks down the process step by step, complete with code samples and practical tips. Level up your testing skills and streamline your Databricks workflow today! 💪 🔗 https://lnkd.in/eKj7pu4r #PySpark #Databricks #SoftwareTesting #CodeQuality
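A minimal sketch of the on-demand gating described above, assuming a notebook widget named run_tests controls whether the inline suite executes; the widget name and flag handling are illustrative, not taken from the tip itself:

```python
import unittest

# Hedged sketch: gate inline tests behind a notebook parameter so they run
# on demand (development/deployment) but stay off in scheduled daily jobs.
# The widget name "run_tests" is an illustrative assumption.
dbutils.widgets.text("run_tests", "false")  # `dbutils` is preset on Databricks

if dbutils.widgets.get("run_tests").lower() == "true":
    # Load every test defined in the notebook's own namespace and run it.
    suite = unittest.TestLoader().loadTestsFromModule(__import__("__main__"))
    unittest.TextTestRunner(verbosity=2).run(suite)
```

A scheduled job simply leaves the widget at its default, so the daily run skips the suite entirely.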
DailyDatabricks’ Post
More Relevant Posts
-
You can embed your unit tests right next to the code you write, then discover them using reflection in Python through the inspect module. This means you can build your test suite and test runners dynamically, on the fly. It can sound complicated, but it takes only a few lines of code and is incredibly simple to get started (a minimal sketch follows after the link below). 🎄 Unwrap a New Databricks Tip Every Day This December on DailyDatabricks
Loading Unit Test Cases Dynamically in a Notebook – DailyDatabricks Tips
dailydatabricks.tips
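A minimal sketch of the inspect-based discovery the post describes, assuming tests are plain unittest.TestCase subclasses defined in the same notebook; the class and test names are illustrative:

```python
import inspect
import sys
import unittest

class TestMath(unittest.TestCase):
    """Example test living right next to the production code."""
    def test_adds(self):
        self.assertEqual(1 + 1, 2)

def run_inline_tests():
    # Use reflection (inspect) to find every TestCase subclass defined in
    # this module, then build a suite and a runner on the fly.
    module = sys.modules[__name__]
    loader = unittest.TestLoader()
    suite = unittest.TestSuite()
    for _, obj in inspect.getmembers(module, inspect.isclass):
        if issubclass(obj, unittest.TestCase) and obj is not unittest.TestCase:
            suite.addTests(loader.loadTestsFromTestCase(obj))
    unittest.TextTestRunner(verbosity=2).run(suite)

run_inline_tests()
```

Because the suite is assembled at run time, adding a new TestCase next to your code is all it takes for it to be picked up on the next run.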
-
Just saw that Snowflake released the Public Preview of Notebooks this morning! Amazing feature. Check out some of its key capabilities: 1) Write code in whatever way works best for you, whether that's SQL queries or Python scripts. 2) Convert notebook cells into tasks and schedule them for regular execution. 3) Manage notebooks through Git integration. #Snowflake
-
Databricks Notebook Debugger! Here's how to leverage this feature with a small PySpark example. Requirements: 1- Enable the debugger feature from the workspace developer settings. 2- Use a cluster with Databricks Runtime version 13.3 LTS or above. Thanks to Derar Alhussein, Hubert Dudek, and Denny Lee for sharing unique features. Code in the comment section. Official Doc: https://lnkd.in/dfc25e8u #databricks #sql #python #dataengineer #sparksql #etl
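A small PySpark cell of the kind you might step through with the debugger; the DataFrame and column names are illustrative, and the breakpoint itself is set in the notebook UI rather than in code:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # already available as `spark` on Databricks

# Set a breakpoint on the withColumn line in the notebook gutter, then run
# the cell: execution pauses there and local variables can be inspected.
df = spark.createDataFrame([("alice", 1000), ("bob", 1500)], ["name", "salary"])
bonus_rate = 0.10
df_with_bonus = df.withColumn("bonus", F.col("salary") * bonus_rate)
df_with_bonus.show()
```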
-
Empowering data engineers! Do check out this comprehensive guide to building CI/CD pipelines for Snowflake, Airflow, DBT & Python by my amazing team member Sanjay R B. #DataEngineering #CICD
TCS Innovator | System Engineer at TCS | GCP ACE & PCEP Certified | Hybrid Application Developer | Flutter Enthusiast
🚀 Excited to Share My First Tech Blog! 🚀 After months of hands-on experience and analysis, I've published my first Medium article, titled "Build your own CI/CD Pipeline for Snowflake, Airflow, DBT, Python & Other Configuration Files (Data Pipeline)". 🎉 This blog is a comprehensive guide for data engineers, covering the design and implementation of an end-to-end CI/CD pipeline. Whether you're dealing with Snowflake, Airflow, DBT, or other configuration files, this article is designed to help you streamline deployments efficiently and effectively. 🔗 Check it out here: https://lnkd.in/gh7JGhtp I'm eager to hear your feedback and would love for this to spark insightful discussions. If you find it useful, please like, comment, or share it with your network. Your support means a lot as I take this exciting step in sharing knowledge with the community! #DataEngineering #CICD #MediumBlog #Snowflake #Airflow #DBT #Python
Build your own CI/CD Pipeline for Snowflake, Airflow, DBT, Python & Other Configuration Files…
link.medium.com
-
Did you know you can build and run KX apps using Python on Databricks? We've partnered with Databricks to integrate KX with the Data Intelligence Platform so that you can now configure both #Python and #Spark workloads to get a KX engine turbocharge without having to rewrite any business logic. Some customers are reporting analytics running up to 112x faster while at the same time reducing their DBUs.
-
Deploy your models from Jupyter notebooks! What if you could deploy your model directly from your notebook? Introducing Modelbit, an API that simplifies deployment to Snowflake, Redshift, and REST endpoints. Seamless Integration: deploy models straight from Python notebooks (or Git) with ease. Flexibility: works with various ML frameworks and deployment targets. Streamlined Workflow: no more wrestling with environment files and dependencies. Log in to Modelbit: https://app.modelbit.com/ #machinelearning #deployment #datascience
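A hedged sketch of what the deploy-from-notebook flow might look like; the call names (modelbit.login, mb.deploy) follow Modelbit's documented pattern but should be verified against the current docs, and the toy model is purely illustrative:

```python
import modelbit
from sklearn.linear_model import LinearRegression

mb = modelbit.login()  # assumption: authenticates the notebook session with Modelbit

# Train a toy model right in the notebook.
model = LinearRegression().fit([[1.0], [2.0], [3.0]], [2.0, 4.0, 6.0])

def predict_value(x: float) -> float:
    # Inference function that Modelbit wraps as a REST endpoint.
    return float(model.predict([[x]])[0])

mb.deploy(predict_value)  # assumption: packages the function, model, and dependencies
```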
-
🧠 Have you been wanting a Snowflake-native interface to - Write Python & SQL for data analysis - Perform feature engineering and run #ML models - Automate #dataengineering pipelines ... and more, with all of your code checked into #Git❓ Come chat with us on June 4! At Snowflake #Summit, Tyler Simons and I will talk about how you can do all of the above in the brand new 🌟 Snowflake Notebooks 📓!
-
💡 Generic Magic Command for Running Notebooks: To run a notebook from another notebook, use %run notebook_path.
🛠️ Databricks Utilities (dbutils): Databricks notebooks offer utility commands like dbutils for interacting with the environment. Use dbutils.fs.ls() to list the contents of a directory from a Python cell, for example: display(dbutils.fs.ls("/databricks-datasets"))
📊 Displaying Tabular Data: Use the display() function when a Python cell returns tabular data.
#Databricks #Notebook #Utilities #DataVisualization 📊
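The tips above as a runnable Databricks Python cell; the paths are the example ones from the post, and %run appears as a comment because the magic must occupy a cell of its own:

```python
# %run /path/to/other_notebook   <- cell magic; must be the only content of its cell

# List a DBFS directory from Python and render it as an interactive table.
files = dbutils.fs.ls("/databricks-datasets")  # `dbutils` is preset on Databricks
display(files)

# display() also renders Spark DataFrames as sortable, chartable tables.
df = spark.range(10)  # the `spark` session is preset in Databricks notebooks
display(df)
```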