Help Needed: Pipeline->Dataflows->Lakehouse->PowerBI
In the pre-Fabric days I was fairly good with Power BI and would use Desktop for all the steps: importing data, transforming it, and then creating reports.
The client I am working with has Fabric and we want to do it "properly", but I'm getting lost at a few of the stages.
I have a workspace with Premium capacity enabled (the diamond icon).
Can someone explain if this is possible? I may have the steps or technical terms mixed up, but this is my general understanding of what I'm trying to achieve:
  1. Import an on-premises SQL database into Fabric (Data pipeline?)
  2. Create a Lakehouse for this data
  3. Transform and clean the data (Dataflow? See the first sketch after this list for a notebook alternative)
  4. Have a custom (or default) semantic model attached
  5. Import the Lakehouse as a data source into Power BI Desktop so that it inherits the semantic model AND the data
  6. Create reports/dashboards in Desktop
  7. Publish: once reports/dashboards are published, they are refreshed based on the Lakehouse (frequency set by the Dataflow? See the second sketch below)
  8. Be able to modify the entire workflow as the needs evolve
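To make step 3 concrete, here is a minimal sketch of the kind of cleanup I mean, written as a Fabric notebook cell against the Lakehouse instead of a Dataflow. Table and column names (raw_sales, CustomerName, OrderDate, sales_clean) are placeholders for my real data:

```python
from pyspark.sql import functions as F

# In a Fabric notebook, `spark` is a pre-initialized SparkSession.
# Read the raw table that the pipeline landed in the Lakehouse
# ("raw_sales" is a placeholder name).
raw = spark.read.table("raw_sales")

# Typical cleanup: drop exact duplicates, trim a text column,
# and standardise a date column, dropping rows that fail to parse.
clean = (
    raw.dropDuplicates()
       .withColumn("CustomerName", F.trim(F.col("CustomerName")))
       .withColumn("OrderDate", F.to_date(F.col("OrderDate")))
       .filter(F.col("OrderDate").isNotNull())
)

# Write back as a managed Delta table so the semantic model in
# step 4 can pick up the cleaned data.
clean.write.mode("overwrite").format("delta").saveAsTable("sales_clean")
```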
At the moment this last step (modifying the workflow) seems to be the hardest part...
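On step 7, my current understanding is that the Dataflow or pipeline schedule controls when the Lakehouse data is refreshed, while an import-mode semantic model has its own refresh schedule in the workspace. A refresh can apparently also be triggered through the Power BI REST API; here is a rough sketch, assuming you already have an Azure AD access token (the GUIDs are placeholders):

```python
import requests

# Placeholders: substitute your own workspace and semantic model
# GUIDs, and a valid Azure AD access token (e.g. obtained via MSAL).
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<semantic-model-guid>"
TOKEN = "<access-token>"

url = (
    "https://api.powerbi.com/v1.0/myorg/groups/"
    f"{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
)

# POST queues a refresh; a 202 Accepted response means it was queued.
resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print("Refresh queued:", resp.status_code)
```

(If the report ends up using Direct Lake against the Lakehouse instead of import mode, I believe a scheduled refresh may not be needed at all, which is part of my confusion.)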
If this is too vague, I can provide specific examples of the steps where I feel close to achieving this but am blocked.
Thanks!