We’ve just added new capabilities to Databricks Workflows, making it even easier for data engineers to monitor and diagnose issues with their jobs. The latest enhancements include a Timeline view for job runs, a run events feature to visualize job progress, and integration with #DatabricksAssistant, our AI-powered Data Intelligence Engine. https://dbricks.co/3TkLjcc
For sure, it will help improve engineers' productivity! Thanks, Databricks!
While the feature is excellent, it could benefit from more targeted suggestions. Occasionally, when diagnosing issues, the results go off on tangents or point to unrelated problems. Overall, it's a fantastic tool that I regularly use.
Valuable new features to make the experience better! 👏
I have yet to explore the new features within Workflows. I'm sure they will improve productivity, but I'm hoping to see clearer explanations for specific error messages, which until now have required a lot of time spent checking and parsing logs!
Also, the AI assistant needs improvement. For a simple cluster permission issue, it started rewriting the whole code and suggesting code improvements instead of just saying that the cluster permission should be added.
Excellent features, Databricks!
Hi team, it would be great if you could add dynamic parameters for the notebook path when creating a task inside a workflow, as we have in ADF.
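(For context on the request above: in ADF the notebook path can itself be an expression, whereas in Databricks Workflows the notebook path is, as far as I know, fixed when the task is defined, which seems to be the gap being raised. What is dynamic today are the notebook parameters, which can be overridden per run. Below is a minimal, illustrative sketch of that workaround using the Databricks SDK for Python; the cluster ID, notebook path, and parameter names are placeholder assumptions, not values from the post.)

```python
# Illustrative sketch, not an official recipe: the notebook path is set when the
# task is defined, but notebook parameters can be supplied dynamically per run.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import jobs

w = WorkspaceClient()  # picks up authentication from the environment / .databrickscfg

job = w.jobs.create(
    name="example-workflow",
    tasks=[
        jobs.Task(
            task_key="run_notebook",
            existing_cluster_id="<cluster-id>",        # placeholder
            notebook_task=jobs.NotebookTask(
                notebook_path="/Workspace/Shared/etl", # fixed at definition time
                base_parameters={"run_date": ""},      # default, overridable per run
            ),
        )
    ],
)

# The dynamic part available today: override notebook parameters at trigger time.
w.jobs.run_now(job_id=job.job_id, notebook_params={"run_date": "2024-06-01"})
```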
The diagnose error button simplifies the data engineer's life.
Databricks is one of the few companies actually innovating out there.
Exciting updates! The new Timeline view and run events feature will undoubtedly enhance workflow monitoring and troubleshooting. We're especially eager to see how the Databricks Assistant integration will empower data engineers. Kudos to the Databricks team for continuously pushing the boundaries!