Activity

Memberships

Learn Microsoft Fabric

Public • 5.8k • Free

12 contributions to Learn Microsoft Fabric
Grouping workspaces?
Hi Team, does anybody know if it is possible to group workspaces in the workspace overview in the Fabric portal? (I know you can pin workspaces, but I want to group all those workspaces for a better overview.) We are a large organisation, so we have a lot of workspaces.
0
4
New comment Oct 25
1 like • Oct 25
Hi - the closest I could get to this is to group my workspaces by Domain. This is then surfaced in the Home tab and acts as a filter on the artifacts. There is a section showing workspaces on the left (filtered by the Domain you choose). Not exactly what you are after, but it's the only option I have found to drill down and group workspaces.
Fabric storage size
Hi gents, a question about Fabric storage size. According to MS, if you have an F64 you get 64 TB of total storage. I want to know whether this 64 TB is shared by Synapse data, Lakehouse data (Parquet files), Power BI models, and all files (structured/unstructured) on the data lake? And if I suspend the Fabric capacity during the night, are the objects stored in Fabric still accessible - say a CSV file, etc.? Thanks for answering.
0
3
New comment Oct 25
1 like • Oct 25
To add to Will's final point about access when you pause the capacity: if you pause capacity 'A', you can still access its storage items from a different capacity 'B', and charges go against the calling capacity doing the compute (capacity 'B'). This is a flexible approach, as it means you can have a very large capacity to process larger and more complex workloads, run it only when required, and then pause it. You get the benefit of all the extra compute when you need it and are billed only for the time it remains unpaused. The data remains fully accessible afterwards, potentially using a smaller capacity to read it.
Data Pipelines and Dataflow Gen2
I know most of you will be aware of the limitations of deployment pipelines inside Fabric, in that not all items are currently supported - specifically Dataflow Gen2, unfortunately. I wanted to share some feedback I have from the MS development team on where they are with this and the timelines around anything changing. I asked them so I could decide where my Data Engineers should invest their time; you can draw your own conclusions on priorities from the same information:

"Dataflow Gen2 is not currently supported in deployment pipelines. You cannot use the standard Git integration either at present. The Git integration is likely to arrive in Q4 CY25. In the meantime, it is recommended to do something like the following:
1. Export the dataflow as a template: open the Dataflow Gen2 you want to move and use the Export template option in the toolbar to save it as a PQT file (Power Query Template) to your local machine.
2. Check this into version control.
3. Prepare the target workspace: open the target workspace where you want to deploy the dataflow and create a new Dataflow Gen2 there.
4. Import the template: in the new Dataflow Gen2, select the option to Import from Power Query template and upload the PQT file you exported earlier.
5. Adjust connections and settings: update any data source connections if necessary. This might involve changing workspace IDs or other identifiers to match the new environment. Ensure that all steps in the dataflow are correctly configured to point to the appropriate resources in the new workspace.
6. Publish the dataflow: once all adjustments are made, publish the dataflow in the new workspace.
I appreciate this is not ideal, but we are making efforts to bring DF Gen2 in line with the rest of the deployment pipeline capability."
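The export and import steps in that workaround are UI-driven, but the check-into-version-control step can be automated locally. Below is a minimal Python sketch of that idea; the `check_in_template` helper and the folder layout are my own assumptions for illustration, not part of any Microsoft tooling - the PQT export itself still happens in the Fabric portal.

```python
"""Sketch: keep exported Dataflow Gen2 templates (.pqt files) in a
version-control folder, skipping the copy when nothing has changed.
Assumes you have already used "Export template" in the Fabric UI."""
import hashlib
import shutil
from pathlib import Path


def check_in_template(pqt_file: Path, repo_dir: Path) -> Path:
    """Copy an exported PQT template into a repo folder, unless the
    content is identical to the copy already checked in."""
    repo_dir.mkdir(parents=True, exist_ok=True)
    dest = repo_dir / pqt_file.name
    new_hash = hashlib.sha256(pqt_file.read_bytes()).hexdigest()
    if dest.exists():
        old_hash = hashlib.sha256(dest.read_bytes()).hexdigest()
        if old_hash == new_hash:
            return dest  # unchanged since last export, nothing to commit
    shutil.copy2(pqt_file, dest)
    return dest
```

Usage would be something like `check_in_template(Path("SalesDataflow.pqt"), Path("dataflows/dev"))`, followed by the usual `git add` / `git commit`. The hash check just keeps your history free of no-op commits when you re-export an unchanged dataflow.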
3
4
New comment Oct 21
0 likes • Oct 18
So do I!
'Migrating' from Power BI Premium to Fabric Capacity
Hi everyone, I know a lot of you are facing the requirement to move from Power BI Premium capacities to Fabric capacities because of the sunsetting of Premium capacity licensing. This blog post by David Mitchell (Microsoft) walks you through a few different options, including a manual switch and the use of semantic-link-labs. Let me know your thoughts - is this something you're planning?
25
14
New comment Oct 19
2 likes • Oct 14
Thanks - will have a read, as we have multiple P1/P2 capacities and now a collection of F capacities; it all needs aligning in the simplest and cleanest way possible. (We also have Databricks and Synapse too for good measure - nothing like keeping it simple.)
Splitting capacities per environment or zone
Hi folks, I am planning to create one capacity for dev/QA environments and a second one for production, where we plan to locate the gold zone for semantic models and Power BI consumption. The question is: should both capacities be the same size? Should the reporting capacity be bigger than the capacity for the data engineers?
2
4
New comment Oct 13
2 likes • Oct 11
Agree with Will, but you will find over time that your prod gold capacity will most likely require the largest size. Logically, once development is complete you migrate to production, and it is generally there that schedules and refreshes are initiated, so you will have much more compute happening. In dev the activity is more sporadic, driven by the development stages, so far less compute. The same applies to Power BI processing: dev/test naturally has fewer users, whereas production will have your company/team/division all actively using your creations - much higher compute needs. One final observation: if you intend to enable Copilot, your compute will grow very quickly. It is currently a very greedy beast, demanding a disproportionate amount of compute for the benefits (at least from my testing) - treat it with caution, as over-eager implementation can quickly consume all you have available. So start small, monitor, and be prepared to scale; remember too that F64 is the baseline for a number of full features, so that's always worth bearing in mind. (You can also pause and resume the development capacity - that saves quite a lot of money if your development team works in a single time zone.)
Dean Hall
@dean-hall-4556
Heads up a team using Fabric and Synapse as the data platform for a large corporate Co.

Active 35d ago
Joined Jun 7, 2024