
Memberships

Learn Microsoft Fabric

Public • 4.2k • Free

Fabric Dojo 织物

Private • 97 • $29/m

6 contributions to Learn Microsoft Fabric
Triggering Cross-Workspace Notebook: Authentication Error
Hi there, I am running notebook "nb_load_stage_Customer" against lakehouse "Customer_LH" (Silver workspace), which reads data from a Files directory located in the Bronze workspace (same lakehouse name). It then loads the data into a staging Delta table (Silver_workspace.Customer_LH/Tables/Schema_Customer/tbl_Stage_Customer).

When I execute it from a notebook cell it works fine, but as soon as I trigger it from the pipeline "pl_customer_stg_load" (which sits in the Silver workspace), it fails with the error below:

{ "error": { "code": "Unauthorized", "message": "Authentication Failed with Bearer token is not present in the request" } }

PS: the pipeline parameters and values are perfectly fine. My initial finding is that the bearer token is not being passed between the two workspaces, so Fabric has trouble reading the data from the Bronze workspace. I think a Managed Identity/SP needs to be set up for cross-workspace authentication. Need help! Thank you!
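Editor's note: not a verified fix for the token error itself, but a common workaround in situations like this is to address the Bronze data by its fully qualified OneLake ABFSS path rather than a relative Files path, so the pipeline-triggered session resolves the source workspace explicitly. A minimal sketch; the workspace/lakehouse names mirror the post, and the "customer_raw" folder is hypothetical:

```python
def onelake_path(workspace: str, lakehouse: str, *parts: str) -> str:
    """Build a fully qualified OneLake ABFSS URI so a notebook can address
    an item in another workspace explicitly, instead of relying on the
    default-lakehouse relative 'Files/...' path."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/" + "/".join(parts)
    )

# Names mirror the post; 'customer_raw' is a hypothetical source folder.
src = onelake_path("bronze_WS", "Customer_LH", "Files", "customer_raw")
# In a Fabric notebook you would then read it with Spark, e.g.:
# df = spark.read.load(src)
```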
0
0
FABRIC DOJO CALL RECORDING!
๐Ÿ™ Thanks to everyone who joined for the Fabric Dojo webinar and to those that have signed up already! Find the recording below to learn all about it! ๐Ÿ‘‰ Sign up to Fabric Dojo here: https://www.skool.com/fabricdojo โ—Sign up before 7th Sept to lock-in the super early bird price! See you in the Dojo! Our first call is the Onboarding Call on Monday, then the first Dojo Live call will be on Saturday 7th Sept!! Any remaining questions? Ask below ๐Ÿ‘‡
23
21
New comment 11d ago
FABRIC DOJO CALL RECORDING!
0 likes • 13d
@James Jie Hey James, could you please tell me how I can use Fabric with my personal account? I have just credited $150 to an Azure pay-as-you-go subscription, but when I log into Fabric it doesn't let me in with my personal email ID. :(
0 likes • 13d
@Will Needham Thank you Will, this is what I wanted :)
Connecting Azure Fabric with an On-prem FTP Server
Hi there, I have a use case where I need to bring some text files in from an SFTP server located in the client's secured on-prem environment (probably a private network), but I am unable to connect to that server from Fabric. It's a direct connection to the SFTP server, and the error I am getting is below (attached image). Any help would be much appreciated! @Will Needham
2
11
New comment 11d ago
Connecting Azure Fabric with an On-prem FTP Server
1 like • 13d
@Jerry Lee Okay, it looks like a self-hosted integration runtime. In Azure Data Factory we could connect to an on-prem server using this IR, via a key-based handshake, but I am really not sure how it works in the MS Fabric ecosystem. I have not heard of using any sort of IR in Fabric.
0 likes • 13d
@Jerry Lee That's interesting indeed!
Creating a Lakehouse Subdirectory Using a Fabric Notebook
Hi folks, I am struggling to create a lakehouse subdirectory using notebooks, as neither the dbutils nor the mssparkutils module seems to be supported by MS Fabric. Has anyone tried and succeeded yet? Please share. Thanks
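Editor's note: one approach that can work in a Fabric notebook is to go through the lakehouse's local mount. When a default lakehouse is attached, its Files area is exposed on the driver's filesystem (Fabric mounts it at /lakehouse/default), so plain Python os.makedirs can create subdirectories. A minimal sketch, with the base path parameterised so the helper can be tried outside Fabric; the mount path is an assumption about Fabric's layout:

```python
import os

def ensure_subdir(base: str, *parts: str) -> str:
    """Create a (possibly nested) subdirectory under `base`, e.g. a
    lakehouse Files mount. Idempotent via exist_ok=True; returns the path."""
    path = os.path.join(base, *parts)
    os.makedirs(path, exist_ok=True)
    return path

# In a Fabric notebook with a default lakehouse attached, this would
# create Files/raw/2024 under it (mount path assumed, not from the post):
# ensure_subdir("/lakehouse/default/Files", "raw", "2024")
```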
2
4
New comment 14d ago
0 likes • 14d
@Will Needham Thank you, I will try it and report back.
Power BI/Fabric Deployment Pipelines: do they allow version control capabilities?
Hello everyone, do Power BI/Fabric deployment pipelines allow version control, i.e. travelling back in time to a previously deployed artifact? For example, imagine I had deployed a certain semantic model (based on a dev lakehouse) and a report from dev to prod, but after a while we discovered that the semantic model had a data issue, and we wanted to revert to the latest stable version of the lakehouse, semantic model and report. Is that possible without Git integration?
2
5
New comment 27d ago
2 likes • 27d
@Mohammad Eljawad: I don't think it is achievable without integrating Git, because the deployment pipeline option only targets deploying artifacts into higher environments. However, you could take manual backups of each artifact at the latest stable version of the production environment (preferably), i.e. the lakehouse (CSV/Parquet format), the semantic model (stored in a .pbix file) and the reports (again as a .pbix file). Then make the changes in dev and deploy to test, then prod. You can troubleshoot the problematic objects, delete them, and restore from the backup just in case.
1-6 of 6
Kalicharan Khetwal
2
13 points to level up
@kalicharan-khetwal-6707
I am a data engineer at a US-based MNC, with 10+ years of experience in the data field.

Active 5h ago
Joined Jul 27, 2024