
Memberships

Learn Microsoft Fabric

Public • 5.3k • Free

Fabric Dojo 织物

Private • 198 • $39/m

5 contributions to Learn Microsoft Fabric
50% price reduction for Copilot in Fabric
https://blog.fabric.microsoft.com/en-us/blog/important-billing-update-coming-to-copilot-and-ai-in-fabric/
New comment Oct 3
1 like • Oct 3
Thanks for sharing this, Amit. Am I right in concluding that the 'barrier to entry' remains: unless you have an F64 capacity, Copilot is not accessible? Last week we heard a lot about the opportunity, which is great, and lowering the cost of the request and response consumption is great, but we still need that underlying F64 to be available! I think the more accurate description would be that your Copilot capability just got a lot more efficient, as you can run twice as many queries for the same price. It still costs you circa £8k a month to enable it.
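The "twice as many queries for the same price" point follows directly from halving the per-request rate while the capacity cost stays fixed. A minimal arithmetic sketch, where the CU budget and per-request rates are made-up illustrative numbers (only the circa-£8k F64 figure comes from the comment above):

```python
# Illustrative arithmetic only: the CU figures are made-up numbers,
# not published Fabric pricing. Only the ~£8k F64 cost is from the thread.
F64_MONTHLY_COST_GBP = 8_000      # fixed entry cost, unchanged by the update
MONTHLY_CU_BUDGET = 10_000        # hypothetical Copilot CU budget
OLD_CU_PER_REQUEST = 2.0          # hypothetical rate before the reduction
NEW_CU_PER_REQUEST = OLD_CU_PER_REQUEST / 2   # the 50% price reduction

queries_before = MONTHLY_CU_BUDGET / OLD_CU_PER_REQUEST
queries_after = MONTHLY_CU_BUDGET / NEW_CU_PER_REQUEST

# Same budget, double the queries; the F64 entry cost itself is untouched.
print(queries_before, queries_after, F64_MONTHLY_COST_GBP)
# → 5000.0 10000.0 8000
```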
Migrating Power BI Reports into the Fabric Eco System
This post might highlight that there is so much to learn, which is why I value any steer from the community. We are invested in using Fabric, but we have lots of 'legacy' Power BI reports being produced via the Power BI Report Server and Power BI Service platforms. We have been testing the 'best way' to migrate a legacy Power BI report into the Fabric ecosystem. So far we have identified two possible approaches:

1) Start from scratch, copying the 'transformation steps' of the Power BI report into a dataflow. Modify the source from legacy to an appropriate Fabric Lakehouse that holds the same data. We point the output destination of the dataflow to a 'gold' Lakehouse, which is essentially equivalent to the tables of the legacy Power BI report. We then need to manually recreate relationships and add any measures that existed in the legacy report... it becomes a manual nightmare!

2) Start with the legacy Power BI report: create a Desktop version, modify the source so it points to the relevant Fabric-hosted Lakehouse tables, and then publish it to the relevant Fabric workspace. This seems to work... until it doesn't. We get the semantic model and report Fabric items, and the latter has all of the legacy measures, but we struggle when we need to fix things: we are currently having to go back to the Desktop report, adjust things there, and then republish.

Option 1 is a lot of manual work, and option 2 requires a standalone Desktop Power BI report to exist, so neither option is sustainable. I am specifically referring to the migration of legacy Power BI Report Server reports; I am assuming Power BI Service-hosted reports will be somewhat easier. 'How on earth do you modify the semantic models within Fabric if you have published the report to Fabric from Power BI Desktop?' might be the better title for this post :-)
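On the "manually recreate measures" pain in approach 1: measure definitions can be exported from the existing model and replayed rather than retyped. In a Fabric notebook, semantic-link (`sempy`) can list a published model's measures, e.g. `fabric.list_measures(dataset="LegacySalesModel")` (dataset name is a placeholder, and this is one possible route, not the only one). A minimal local sketch of turning exported (name, expression) pairs into a reviewable script:

```python
# Sketch only: assumes measures have already been exported from the legacy
# model as (name, DAX expression) pairs, e.g. via sempy's list_measures in
# a Fabric notebook or via Tabular Editor against the PBIRS model.

def measures_to_script(measures: list[tuple[str, str]]) -> str:
    """Render measure definitions as readable 'MEASURE name = expression' lines."""
    return "\n".join(f"MEASURE '{name}' = {expr}" for name, expr in measures)

script = measures_to_script([("Total Sales", "SUM(Sales[Amount])")])
print(script)
# → MEASURE 'Total Sales' = SUM(Sales[Amount])
```

The point is to keep measure definitions as data that travels with the migration, so re-adding them to the new semantic model is a paste-and-review job instead of manual re-entry.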
New comment Aug 28
0 likes • Aug 28
Hi @Surm Man, thanks for your comment. Just to check: when you refer to 'uploading' PBIRS to MS Fabric, is that the same as publishing via Desktop? Is there a way to upload directly from PBIRS?
1 like • Aug 28
@Will Needham thanks so much, this is so relevant and very helpful.
Triggering Fabric Pipelines
Our current strategy is to use Databricks as a 'data engine', bringing relevant data into Databricks before shortcutting it to Fabric (Bronze) for subsequent processing and consumption. My question: is there a way to orchestrate the start of a Fabric data pipeline after the Databricks job that moves the raw data to Bronze finishes? It might seem a little unconventional to use Databricks to extract the data initially, but that is the direction we have adopted. Any guidance on how we might trigger the pipeline (in Fabric) after the Databricks extraction would be much appreciated.
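One possible pattern is to add a final task to the Databricks job that calls Fabric's REST API to queue an on-demand run of the pipeline. A minimal sketch, assuming an AAD token with the appropriate Fabric scope is already in hand; the endpoint path and `jobType` value should be verified against the current Fabric REST API docs before relying on them:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_pipeline_url(workspace_id: str, pipeline_id: str) -> str:
    """Build the on-demand job-run URL for a Fabric data pipeline item."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def trigger_pipeline(workspace_id: str, pipeline_id: str, token: str) -> int:
    """POST an on-demand run; intended as the last task of the Databricks job."""
    req = urllib.request.Request(
        run_pipeline_url(workspace_id, pipeline_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:  # expect 202 Accepted (run queued)
        return resp.status
```

Because the call is just HTTP, it slots into a Databricks notebook or job task with no extra dependencies, and the Databricks job's own task ordering guarantees it fires only after the Bronze load succeeds.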
New comment Aug 21
Gaining stakeholder support for Fabric - with a security lens
Hi community, I am making good, steady progress with our Fabric deployment (still in POC stage). What I am struggling with is joining some dots and presenting a straightforward story to certain members of our security stakeholder community who are concerned about hosting data in Fabric. What I was hoping the community could help me with is a list of the top ten/twenty objections to deploying Fabric, so I can craft a suitable proactive response to each. I watched iRobot for the first time at the weekend (recommended, and 20+ years old now!) and what I relate to is this comment: "you must ask the right questions", or put another way, "have the answers to the questions before they are asked" :-)
New comment Jul 30
Monitoring Fabric Usage - best practice
I have just started a Fabric pilot and have a single F4 capacity. Our user community is being encouraged to start the Fabric trial licence, which I understand means they individually enjoy F64-SKU capacity. The plan is to migrate people over to our purchased capacity as and when their trials end. We want to gather some intelligence about resource usage across each trial, to anticipate any challenges with our lower-SKU capacity. With this context, I have some questions: 1) How do I monitor the usage of the trial capacities? 2) Is the only way to accurately cross-charge Fabric usage via individual capacities, or is it possible to use reporting to show usage by workspace? Every day I learn a little more, so please share your own experiences.
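On question 2, one hedged option for workspace-level visibility is pulling records from the Power BI admin Activity Events API (one record per user action, which includes a workspace name) and aggregating them yourself. A minimal sketch of the aggregation step; the field names below match typical Activity Events payloads but should be treated as assumptions and checked against the API response:

```python
from collections import Counter

# Sketch only: 'events' would come from GET .../admin/activityevents
# (Power BI admin API); the records here are made-up sample data, and
# the "WorkspaceName" field name is an assumption to verify.

def usage_by_workspace(events: list[dict]) -> Counter:
    """Count activity events per workspace, as a rough cross-charging signal."""
    return Counter(e.get("WorkspaceName", "<unknown>") for e in events)

sample = [
    {"WorkspaceName": "Finance", "Activity": "ViewReport"},
    {"WorkspaceName": "Finance", "Activity": "RunArtifact"},
    {"WorkspaceName": "Sales",   "Activity": "ViewReport"},
]
print(usage_by_workspace(sample))
# → Counter({'Finance': 2, 'Sales': 1})
```

Event counts are a proxy, not CU consumption; for capacity-level figures the Fabric Capacity Metrics app is the usual starting point, with something like the above layered on for per-workspace attribution.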
New comment May 2
Mark Thacker
@mark-thacker-6922
I help companies improve their data quality and data management by driving alignment between data and business objectives.

Active 6h ago
Joined Apr 29, 2024
United Kingdom