Honestly, SQL Database in Fabric is great news! It fills a big gap in Power BI. But there's a catch. Microsoft, we need a per-user Fabric license. Read more: https://lnkd.in/d-zDif8K #honestly #powerbi #fabric #sqlserver #sqlazure #sqlfabric
One thing to consider: You call out that the Datamart feature had been in preview since 2022 (I was excited about that one as well). Isn’t the new capability also in preview? I would have hoped for it not to be. I also second the point about the lack of PPU on Fabric, and I’ll even add the lack of TRUE on-demand pricing, similar to how Databricks and Snowflake do it.
It makes sense that Microsoft aims to maximize value from their products, but I’ve spoken with many SMEs who hesitate to adopt Power BI due to licensing costs. Do you think they'll ever make it more affordable for smaller companies, Marco Russo?
Thanks Marco. Q: for analytics (a star schema connected to Power BI), is it still better to use Fabric Warehouse (a columnar database) instead of Fabric SQL database? And Fabric SQL database would be used more for transactional data in Fabric. Is that correct?
Yes, why not add this new OLTP SQL workload to Fabric? But good luck now to the person in charge of tracking CUs and object access on F capacities. Big, big deal. In my opinion, Microsoft urgently needs to offer on Fabric:
1) An access audit service (workspace access, specific WH/LH/KQL/SQL DB/semantic model access, Power BI-specific access, OneLake table- and file-level access)
2) Fine-grained monitoring of F-capacity CU consumption and problems, powered by Copilot for strong advice
I was hoping the additional Power BI license cost coming in April 2025 would cover a per-user Fabric license, Marco.
Interesting idea, but everything in Fabric is measured in capacity units. How much should a per-user license cost? I guess you could take [FPU price] / [F2 price] to see how much capacity you would get for that. I think that if you have more than a few Excel files you will run into trouble. Couldn’t a few small Excel files just as well be handled in Power Query?
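A quick back-of-the-envelope version of that ratio, with purely hypothetical prices (the real figures vary by region and over time; the per-user price here is modeled loosely on PPU):

```python
# Back-of-the-envelope sketch: what fraction of an F2 would one per-user license buy?
# Both prices below are illustrative assumptions, not published figures.
fpu_price = 20.00   # hypothetical per-user/month price, modeled on Power BI PPU
f2_price = 262.80   # assumed F2 pay-as-you-go monthly price (varies by region)

share = fpu_price / f2_price  # fraction of an F2 capacity one license would fund
print(f"One per-user license ≈ {share:.0%} of an F2")  # ≈ 8%
```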
A game changer would be if CRUD entries were immediately processed by VertiPaq, i.e. real-time writeback to the semantic model. I guess we still need to wait…
Hi Marco Russo Great study, thanks a lot. We can already read in some benchmarking posts/studies that this OLTP SQL workload sounds faster than the Warehouse workload for analytical SELECTs... I hope that's a joke 😁
Benchmark contexts:
- Import mode to the semantic model
- Identical SQL table sizes (schema and row count)
- No columnstore index on the OLTP SQL table, even though that is a strong analytical boost for SELECTs on gold data (see the sketch after this comment)
- Same workspace (meaning the same F capacity as CU provider)
Do you know where we can read some "fine" technical information about our Warehouses? (dedicated analytical read SQL queries, MPP/OLAP/columnar storage, etc.) 😉
Gold transformations on Warehouses? Sure, some things are missing:
- Some mandatory T-SQL functions (MERGE, MERGE with OUTPUT to append rows, temporary tables...)
- Some mandatory features (row snapshots via temporal tables, identity columns, etc.)
- Primary and foreign keys on tables ("zero use" as integrity controls, but these constraints can enhance join optimization)
- Something like dbt Cloud to replace T-SQL stored procedures
- More insight into the web SQL endpoint console (query plans, etc.)
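On the columnstore point, a minimal sketch of what such a benchmark would need to add before comparing analytical SELECTs; the server, database, table, and column names are all hypothetical placeholders:

```python
# Sketch: add a columnstore index to the OLTP table so analytical SELECTs on the
# gold data get a columnar access path while the rowstore keeps serving OLTP.
import pyodbc  # assumes ODBC Driver 18 for SQL Server is installed

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=<fabric-sql-endpoint>;Database=<db>;"   # placeholders
    "Authentication=ActiveDirectoryInteractive;"
)
# A NONCLUSTERED columnstore index leaves the OLTP rowstore table in place.
conn.execute("""
    CREATE NONCLUSTERED COLUMNSTORE INDEX IX_Sales_Gold
    ON dbo.Sales (OrderDate, CustomerId, Amount);   -- hypothetical gold table
""")
conn.commit()
```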
Marco Russo could you create an idea in https://ideas.fabric.microsoft.com/ for a “Fabric per-user license” and a smaller version of a SQL Database, reduced in size and with other capabilities? Datamarts are dead, and many clients don't see the point in paying for a Fabric F2 (and are stuck with dataflows). Your name would mean a lot for the idea, and many self-service Power BI users would vote for it. Time to campaign!
Indeed, a per-user Fabric license would be nice as an option. At the same time, if you have modest requirements and just need the capacity for an hour or two a day on working days, it may actually be cheap enough to use an F2 or F4 SKU when you need it and pause it when not in use. Automating the pausing and resuming of the capacity on a schedule with Azure Automation isn’t too much effort either, and it only needs to be set up once 😎
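For anyone curious, here is a minimal sketch of that suspend/resume call against Azure Resource Manager, assuming the Microsoft.Fabric capacity endpoints and placeholder subscription/resource names; an Azure Automation runbook would run the equivalent call on a schedule:

```python
# Minimal sketch: suspend or resume a Fabric capacity through Azure Resource Manager.
# Subscription, resource group, and capacity names are placeholders; the
# api-version is an assumption -- check the Microsoft.Fabric REST reference.
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
CAPACITY = "<capacity-name>"

def set_capacity_state(action: str) -> None:
    """action is 'suspend' (pause) or 'resume'."""
    token = DefaultAzureCredential().get_token(
        "https://management.azure.com/.default"
    ).token
    url = (
        f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Fabric"
        f"/capacities/{CAPACITY}/{action}?api-version=2023-11-01"
    )
    requests.post(url, headers={"Authorization": f"Bearer {token}"}).raise_for_status()

set_capacity_state("suspend")  # pause off-hours; call with "resume" before work starts
```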