How did I not notice that Tableau Prep can't do spatial calculations before? Are they not in because they can't be "seen" in Prep?
Why do I need them?
Trying to do #IronQuest and get ready for the Data + Maps Tableau User Group. I wanted to dynamically calculate the distance between the #F1 GP tracks to see optimal "next race" routing. Since Tableau (Desktop) can only do spatial calcs at the row level, I wanted to calculate distances between each pair of tracks and then PIVOT them so I can get a max/min. I can get the distances in Desktop, but then I can't pivot calculated fields.
This seems like it should be easier?!!?
Sarah Bartlett, Dennis Kao, Tore Levinsen
Forward Deployed US Army National Guard Finance Officer ✯ Lead Account Solution Engineer at Tableau (Salesforce) ✯ Tableau Public Featured Author ✯ 4x Tableau Public #VOTD ✯ Trailhead 4x-Star Ranger
❓ Do you use Bravo by SQLBI when building Power BI semantic models?
As well as analysing your model, my favourite feature of Bravo is the built-in UI for creating date tables.
Not only does it make it easy to customize your Date table, but it will also automatically create any Time Intelligence measures you might need.
A huge time saver!
Check it out here - it's free!
https://bravo.bi/
#powerbi #data #sqlbi #bravo
Tableau is indeed a playground. I was able to analyze a new dataset from scratch, building various visualizations (dual-axis graphs, scatter plots, etc.), calculated fields, and bins with custom bin sizes. I also leveraged the power of interactivity through filters, visualizations as filters, and parameters.
ISE @ AIT Tumkur || Python || PowerBI || Cyber security || Script kiddie || web development || JAVA || GenAI || Figma || Kali Linux Tools || Bug Bounty
🚀 Excited to share my latest project in Power BI!
I recently learned Power BI through live sessions from WsCube Tech and created this insightful dashboard showcasing transaction data (PhonePe Pulse).
This experience enhanced my skills in data visualization, helping me transform complex datasets into meaningful insights.
Looking forward to applying these skills in real-world projects! Feedback and suggestions are welcome. 🙌
#PowerBI #DataVisualization #LearningJourney #DashboardDesign #PhonePePulse
Anyone who thinks it’s a great idea to put your semantic layer inside a BI tool such as Power BI or Tableau has clearly never managed a data platform at scale.
Although these tools make it very convenient and easy to create the semantic models, the engines underneath are usually very inefficient, requiring a ton of compute.
And then to address the elephant in the room, what happens when you need to migrate to another tool? All that work has to be unwound and redone elsewhere. And you better hope alternative connectors exist to repoint your dashboards. 😂
Bottom Line - keep your semantic layer as close to the data as possible, ideally in views at the source or close to it. You can also consider a virtual cube such as atScale or CubeDev to name a few.
#dataengineering
At most organizations, the BI layer is pure chaos.
Hundreds, sometimes thousands, of people build disparate semantic models with little collaboration or visibility across silos. The result? Rising costs, slower performance, and—most importantly—a loss of trust in analytics.
As Matt points out, shifting logic out of the BI tool and into a true semantic layer is a key step to mitigate this sprawl and ensure consistency in reporting.
At Datalogz, we recently introduced semantic layer recommendations across Power BI, Tableau, and Qlik. Datalogz Control Tower identifies overlapping business logic and expressions for reuse, enabling teams to move that logic into the warehouse or a true semantic layer. The general rule? If a model/expression/measure is being used more than five times across departments, it’s time to move it out of BI!
Bottom line: The closer your semantic layer is to the data source, the better it scales and performs—helping your teams save time, costs, and complexity.
Staff Tech. Engineer - Enterprise Data at State Farm ®
If you have latitude and longitude coordinates, you can mimic the distance calculation using Haversine distance: https://www.thedataschool.co.uk/asha-daniels/finding-the-closest-distance-in-tableau-prep/
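As a rough sketch of the approach the link describes: compute pairwise Haversine distances in long ("pivoted") form, then take the min per track to find the nearest "next race". The circuit coordinates and the 6371 km mean Earth radius below are illustrative assumptions, not values from the post.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Approximate coordinates for a few F1 circuits (illustrative only)
tracks = {
    "Silverstone": (52.0786, -1.0169),
    "Monza": (45.6156, 9.2811),
    "Spa": (50.4372, 5.9714),
}

# Pairwise distances in long form, like the pivoted table the post asks for
pairs = {
    (a, b): haversine_km(*tracks[a], *tracks[b])
    for a in tracks for b in tracks if a != b
}

# Min per track = closest "next race"
for a in tracks:
    nearest = min((b for b in tracks if b != a), key=lambda b: pairs[(a, b)])
    print(f"{a} -> {nearest} ({pairs[(a, nearest)]:.0f} km)")
```

In Tableau Prep itself the equivalent is a self-join of the track list to itself (excluding the matching row) with the Haversine formula written as a calculated field, then an aggregate step taking MIN of the distance per track.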