Fabric Q&A is happening in 25 days
Updating Semantic model schema
If we add a new column to a warehouse table that is used in a semantic model, how do we refresh the semantic model schema to pick up the additional column, without removing the table from the model and adding it back?
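One hedged approach, assuming the model can be edited over the XMLA endpoint and that the open-source semantic-link-labs library is available in a Fabric notebook: add the new column to the existing model table through the library's Tabular Object Model wrapper. All names below (workspace, model, table, column) are placeholders, and the helper names should be checked against the installed library version.

```python
# Sketch: surface a newly added warehouse column in an existing semantic model
# without dropping and re-adding the table, via the semantic-link-labs TOM
# wrapper (e.g. %pip install semantic-link-labs in a Fabric notebook).
from sempy_labs.tom import connect_semantic_model

# Placeholder names -- replace with your own workspace/model/table/column.
with connect_semantic_model(
    dataset="Sales Model",        # hypothetical semantic model name
    workspace="Analytics WS",     # hypothetical workspace name
    readonly=False,               # open the model for writeback
) as tom:
    tom.add_data_column(
        table_name="FactSales",       # table already present in the model
        column_name="DiscountPct",    # new column added in the warehouse
        source_column="DiscountPct",  # column name at the source
        data_type="Double",
    )
# Changes are written back to the model when the context manager exits.
```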
Fabric Link vs Synapse Link...real costs?
Morning everyone. I am working on a project that requires data from D365 F&O, and the implementation partner recommends Synapse Link. Before I challenge them about using Fabric Link instead, has anyone come across the same scenario? Is Fabric Link genuinely better or more efficient, or are there hidden storage costs versus using a separate data lake and copying the data out?
Authentication Error When Triggering a Cross-Workspace Notebook
Hi there, I am running notebook "nb_load_stage_Customer" against lakehouse "Customer_LH" (silver workspace), which reads data from a Files directory located in the bronze workspace (bronze_WS, same lakehouse name). It then loads the data into a stage delta table (Silver_worspace.Customer_LH/Tables/Schema_Customer/tbl_Stage_Customer). When I execute it from the notebook cell it works fine, but as soon as I trigger it from pipeline "pl_customer_stg_load" (which sits in the silver workspace) it fails with the error below:

{ "error": { "code": "Unauthorized", "message": "Authentication Failed with Bearer token is not present in the request" } }

PS: The pipeline parameters and values are perfectly fine. My initial finding is that the bearer token is not passed between the two workspaces, so Fabric has trouble reading the data from the bronze workspace. I think I need to set up a Managed Identity/SP for cross-workspace authentication. Need help! Thank you!
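For anyone hitting the same error, a minimal sketch of one pattern that can help, under two assumptions: the bronze lakehouse is addressed by its absolute OneLake path (so the notebook does not depend on a default lakehouse), and the identity the pipeline runs under (the pipeline owner, or a service principal if you rebind the connection) has read access on the bronze workspace. The folder name below is a placeholder.

```python
# Sketch: read the bronze-workspace files via an absolute OneLake (abfss)
# path instead of a relative Files path. `spark` is the session provided
# by the Fabric notebook runtime.

# Placeholder folder -- replace with the actual Files subpath.
bronze_path = (
    "abfss://bronze_WS@onelake.dfs.fabric.microsoft.com/"
    "Customer_LH.Lakehouse/Files/customer/"
)

df = spark.read.format("parquet").load(bronze_path)  # or csv/json as appropriate

# Write to the stage table in the silver lakehouse (the notebook's home workspace).
df.write.mode("overwrite").saveAsTable("Schema_Customer.tbl_Stage_Customer")
```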
How to create Hierarchy in Fabric Lakehouse Tables
Hi! Is it possible to create a hierarchy in Fabric lakehouse tables? Appreciate your help and attention.
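As far as I know, a hierarchy is a semantic-model object rather than something a lakehouse Delta table stores itself; a common workaround is to flatten a parent-child pair into explicit level columns and then define the hierarchy over those columns in the semantic model. A rough sketch, with invented table and column names:

```python
# Sketch: flatten a parent-child dimension into level columns in a Delta
# table; the actual hierarchy is then defined over Level1/Level2 in the
# semantic model. Table and column names are made up for illustration.
from pyspark.sql import functions as F

dim = spark.table("dim_org")  # hypothetical columns: id, name, parent_id

# Level 1: rows with no parent.
l1 = dim.filter(F.col("parent_id").isNull()).select(
    F.col("id").alias("l1_id"), F.col("name").alias("Level1")
)

# Level 2: children of level-1 rows.
l2 = dim.alias("c").join(l1, F.col("c.parent_id") == F.col("l1_id")).select(
    F.col("c.id").alias("l2_id"), "Level1", F.col("c.name").alias("Level2")
)

l2.write.mode("overwrite").saveAsTable("dim_org_levels")
```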
Incremental Refresh Pipeline
Hello everyone in the community! 👋 I have a question about a manufacturing environment I'm working on. My data is in a Lakehouse and consists of very wide tables sourced from BigQuery. I'm trying to create a pipeline that uses watermark values for incremental updates of these tables. So far, I have implemented the following:
- LookupOld: reads the Delta table that stores the latest processed date, for comparison with the new one.
- LookupNew: retrieves the date of the last update (using a concatenation of two columns).
My incremental copy activity writes TXT files with the suffix "incremental" to a destination folder. However, I have a problem: I can't update the old watermark date at the end of the pipeline cycle. I tried doing this with a Stored Procedure, but I always receive the following error:
"Execution fail against SQL Server. Please contact SQL Server team if you need further support. Sql error number: 24559. Error Message: Data Manipulation Language (DML) statements are not supported for this table type in this version of SQL Server."
I understand that we cannot perform DML there, but it also doesn't let me save the value in a variable. I'm looking for an alternative way to update the old watermark with the latest update date, so that the next run can compute the difference between the two lookups (one notebook-based sketch is below). I've been following this tutorial and have tried saving variables and creating notebooks to manage the process, but I still haven't found a suitable solution for my context: Incrementally load data from Data Warehouse to Lakehouse - Microsoft Fabric | Microsoft Learn. Does anyone have any suggestions? I appreciate any help! 🫡
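One hedged workaround, assuming the stored procedure was running against the Lakehouse SQL analytics endpoint (which is read-only): keep the watermark in a small Delta table and update it from a notebook activity as the last pipeline step, with the pipeline passing in the new date. The table, key, and parameter names below are invented for the sketch.

```python
# Sketch: final notebook activity that persists the new watermark through
# Spark, since DML is not available on the Lakehouse SQL endpoint.
from delta.tables import DeltaTable

# In practice 'new_watermark' would arrive as a notebook parameter from the
# pipeline (e.g. the LookupNew output); hardcoded here for illustration.
new_watermark = "2024-06-01 13:45:00"

wm = DeltaTable.forName(spark, "control_watermark")  # hypothetical control table
wm.update(
    condition="table_name = 'wide_table_1'",       # hypothetical key column/value
    set={"last_update": f"'{new_watermark}'"},     # new value as a SQL expression
)
```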