
Memberships

Learn Microsoft Fabric

Public • 5.3k • Free

Fabric Dojo 织物

Private • 199 • $39/m

6 contributions to Learn Microsoft Fabric
Secure connection between on-prem SQL Server and Microsoft Fabric lakehouse
Hi again. Using Private Link to secure the connection between Azure SQL and Fabric has been a fairly straightforward task. But setting up a secure line to an on-prem SQL Server is more complicated. In my case I cannot use the on-premises data gateway because I need to protect the source data. Since there is no direct private link to on-prem SQL, I must use a private virtual network with other Azure components to make this work. Probably I need to use an ADF pipeline to get the data and a push mechanism into a Bronze lakehouse or Azure Gen2 storage. Has any of you done something similar? If so, please respond to this post.
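To make the push pattern concrete, here is a rough sketch of what I have in mind, assuming the script runs inside the protected network and writes to an ADLS Gen2 account exposed only through a private endpoint. All server, database, and account names are placeholders:

```python
# Hypothetical sketch: push data from an on-prem SQL Server to an
# ADLS Gen2 account reachable only through a private endpoint.
# Server, database, container and account names are placeholders.
import io

import pandas as pd
import pyodbc
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Read from the on-prem source (runs inside the protected network)
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=onprem-sql.internal;DATABASE=SourceDb;"
    "Trusted_Connection=yes;Encrypt=yes;TrustServerCertificate=yes;"
)
df = pd.read_sql("SELECT * FROM dbo.Customers", conn)

# Push to ADLS Gen2; from inside the VNet the storage account's
# DNS name resolves to the private endpoint IP.
service = DataLakeServiceClient(
    account_url="https://mystorageacct.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("bronze")
buf = io.BytesIO()
df.to_parquet(buf, index=False)
fs.get_file_client("landing/customers.parquet").upload_data(
    buf.getvalue(), overwrite=True
)
```

From there a Fabric shortcut or pipeline could pick the files up into the Bronze lakehouse.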
0
0
Tips on using Log Analytics in Microsoft Fabric
Hello. Does anyone have good experience with using Log Analytics in a Fabric solution? I need some tips on setting up the Log Analytics workspace and the properties needed to log from PySpark code. Feel free to share the challenges and experiences you have had.
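For reference, a minimal sketch of what logging from notebook code could look like, using the Log Analytics HTTP Data Collector API directly. The workspace ID, shared key and log type are placeholders; in practice the key should come from a Key Vault:

```python
# Minimal sketch: send a custom log record from PySpark/Python code
# to a Log Analytics workspace via the HTTP Data Collector API.
# WORKSPACE_ID and SHARED_KEY are placeholders.
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "<log-analytics-workspace-id>"
SHARED_KEY = "<workspace-primary-or-secondary-key>"
LOG_TYPE = "FabricPipelineLogs"  # lands in custom table FabricPipelineLogs_CL


def _signature(date: str, content_length: int) -> str:
    # Build the SharedKey authorization header the API requires
    string_to_sign = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    digest = hmac.new(
        base64.b64decode(SHARED_KEY),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return f"SharedKey {WORKSPACE_ID}:{base64.b64encode(digest).decode()}"


def log_event(record: dict) -> None:
    body = json.dumps([record])
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    resp = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs"
        "?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": _signature(date, len(body)),
            "Log-Type": LOG_TYPE,
            "x-ms-date": date,
        },
    )
    resp.raise_for_status()


log_event({"notebook": "silver_transform", "status": "started"})
```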
0
0
Fabric, Private Link and on-prem SQL database
Hi. I have read that there are limitations with Fabric, the on-premises data gateway and private links/VNets. Does anyone have any experience with this?
0
2
New comment Oct 8
1 like • Oct 8
@Lukasz Obst thanks for your great answer on my post. I will start looking into all your details.
Metadata-driven framework in Microsoft Fabric with Git/CI/CD
Hi everyone! I was wondering if any of you has experience with more advanced workspace architectures when it comes to metadata-driven architectures, the Data Factory pipelines that handle them, and how to join it all up in CI/CD pipelines.

In a Dev, Test, Prod environment with Bronze, Silver and Gold workspaces, we end up with 9 workspaces, each with their own lakehouses. The reason for this architecture is the security demands on incoming data in Bronze and on processing in Silver, with presentation through Gold, where the semantic model lives for end users, or for data scientists more interested in datasets (that is interesting too; maybe data scientists need more "secured data"...).

But how to arrange the Data Factory pipelines? I must drive data through all workspaces and update the lakehouses in their separate workspaces. I think it is possible to keep all pipelines in one place, either in an additional pipeline/notebooks workspace (similar to the last image) or, for instance, in the Bronze area. I have tested that I can use pipelines to copy data to lakehouses between workspaces. But in a metadata-driven setup, all the workspaces' lakehouses must be parameterized; see the sketch below.

Then how to store all this in Git? I think each layer (Bronze, Silver and Gold) can have its own folder in DevOps, with the workspace items placed there. I think there will be a Bronze, a Silver and a Gold deployment pipeline driving them through Dev, Test and Prod, since it may be too complicated to put them all in one deployment pipeline, or? In Azure DevOps I think this can be controlled better.

Phew, I am sorry to put so much into this thread. I'll give you one of the sources for my thoughts; perhaps it will help you understand better. Have a nice day or weekend! https://github.com/Azure-Samples/modern-data-warehouse-dataops/blob/main/single_tech_samples/fabric/fabric_ci_cd/README.md
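To illustrate the parameterization, here is a rough sketch of what I imagine one metadata-driven promotion step could look like, reading a control table and copying Delta tables between workspace lakehouses via OneLake ABFS paths. The control table, workspace, lakehouse and table names are all placeholders:

```python
# Hedged sketch: one metadata-driven promotion step between workspace
# lakehouses, driven from a control table. All names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

ONELAKE = "onelake.dfs.fabric.microsoft.com"


def table_path(workspace: str, lakehouse: str, table: str) -> str:
    # OneLake ABFS path to a lakehouse table, possibly in another workspace
    return f"abfss://{workspace}@{ONELAKE}/{lakehouse}.Lakehouse/Tables/{table}"


# Control table: one row per table to promote, parameterized per layer
control = (
    spark.read.table("control.table_mappings")
    .where("source_layer = 'bronze' AND target_layer = 'silver'")
)

for row in control.collect():
    src = table_path(row.source_workspace, row.source_lakehouse, row.table_name)
    dst = table_path(row.target_workspace, row.target_lakehouse, row.table_name)
    (spark.read.format("delta").load(src)
         .write.format("delta").mode("overwrite").save(dst))
```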
2
3
New comment Sep 28
1 like • Sep 28
@Will Needham hi Will. Thanks for the fast response on my post. I'm still looking for a good pattern. I came across another one today where I can put Bronze, Silver and Gold in the same workspace and then have 1-n use-case-centric workspaces. In my case there may not be too many.
0 likes • Sep 28
The use case workspace will be fed from the Gold lakehouse😊
Logging access attempts in Fabric
Hi. I have a use case in a PoC that I am starting on this week. I need to verify that the solution is able to log all access attempts on datasets, lakehouses and data warehouses, as well as on Power BI reports; suspicious attempts must be reported. This is because there are strict security requirements. Does anybody have thoughts or ideas on how to develop a good solution for this?
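One starting point could be the Power BI admin Activity Events REST API, which exposes tenant audit events including report and dataset access. A minimal sketch, assuming a service principal with read-only admin API access; all IDs, secrets and the activity names to watch are placeholders to adjust:

```python
# Hedged sketch: pull one day of audit events from the Power BI admin
# Activity Events API and print report/dataset access. Tenant, app
# and secret values are placeholders; activity names may need tuning.
import requests
from azure.identity import ClientSecretCredential

cred = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<app-id>",
    client_secret="<app-secret>",
)
token = cred.get_token(
    "https://analysis.windows.net/powerbi/api/.default"
).token
headers = {"Authorization": f"Bearer {token}"}

url = (
    "https://api.powerbi.com/v1.0/myorg/admin/activityevents"
    "?startDateTime='2024-09-01T00:00:00Z'"
    "&endDateTime='2024-09-01T23:59:59Z'"
)
# The API pages results via continuationUri until it is empty
while url:
    page = requests.get(url, headers=headers).json()
    for event in page.get("activityEventEntities", []):
        # Filter to the activities of interest (names are examples)
        if event.get("Activity") in ("ViewReport", "ViewDashboard"):
            print(event.get("UserId"), event.get("Activity"),
                  event.get("ItemName"))
    url = page.get("continuationUri")
```

From there the events could be shipped to Log Analytics and alert rules could flag the suspicious ones.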
0
2
New comment Sep 3
0 likes • Sep 3
Thanks for your fast response. My first idea was digging into some Power BI logs that might give me something. I will continue on this and hopefully come back with more detail. Btw, I was surprised that semantic models in Direct Lake mode do not have calculated columns. I am sure there are good reasons for that.
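On the calculated columns point: since Direct Lake models read the Delta tables as-is, one workaround is to materialize the column upstream before the model sees it. A minimal sketch, assuming the column logic can move into PySpark (table and column names are placeholders):

```python
# Sketch: materialize a "calculated column" upstream in PySpark,
# since Direct Lake semantic models read the Delta table directly.
# Table and column names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

orders = spark.read.table("silver.orders")
orders_enriched = orders.withColumn(
    "margin_pct",
    (F.col("revenue") - F.col("cost")) / F.col("revenue"),
)
orders_enriched.write.format("delta").mode("overwrite") \
    .saveAsTable("gold.orders")
```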
1-6 of 6
Geir Forsmo
Level 2
15 points to level up
@geir-forsmo-9198
Data engineer at Atea

Active 9h ago
Joined Jul 4, 2024