Pinned
👋 New joiner? WELCOME! Two things for your first day...
1. 📖 GET UP TO SPEED: Read everything in here: #Welcome to the community! - it's a short course to help you understand the community, why it exists, what you can do here, and the community guidelines.

2. 🤝 INTRODUCE YOURSELF in this thread below! We're a friendly bunch, all with the goal of learning Fabric. Not sure what to say? You can use the template below:

Hi, I'm ___, I'm from ___, I work in / as a ___. I'm really interested in learning more about _______.

Thank you for engaging and joining us on this exciting learning journey! 🙏 Will
Exciting News: I Cleared the DP-600 Exam! Here's How I Did It
I’m thrilled to share that I recently passed the DP-600 certification exam, and I want to share my journey to help others in the community who are also preparing for it. I hope my experience provides you with some helpful insights and inspires you to reach your own goals! Here’s what worked for me:

1. Microsoft Learn Portal: This was my go-to resource. I focused on completing the DP-600 study guide, which includes excellent lab exercises that reinforce the core concepts. It’s incredibly hands-on! ➡️ Microsoft Learn - DP-600 Guide

2. Will Needham's YouTube Videos: Will's videos were game-changers for me. His in-depth explanations of the core concepts and exam structure really helped clarify things. His exam-focused approach made the difference in my preparation. ➡️ Will Needham's DP-600 Playlist

3. Skool Community Portal: To dive deeper into specific topics, I turned to the Skool Community portal. It’s a great place to further understand concepts covered in Will's videos and to get additional resource support. ➡️ Skool Community Portal - Microsoft Fabric

4. Microsoft Practice Test: The practice test on Microsoft Learn was invaluable in gauging my understanding and readiness for the exam. ➡️ Microsoft Practice Test

5. MeasureUp Practice Test: I invested in a 30-day subscription to MeasureUp’s practice tests, and it was worth every penny. The detailed explanations for each question helped me identify areas where I needed more study. I took the test 2-3 times to solidify my knowledge.

6. Microsoft Exam Readiness Zone Videos: After covering all the above, I watched the Exam Readiness Zone videos to understand the overall exam objectives. These videos helped me review the key topics right before taking the exam. ➡️ Exam Readiness Zone
Triggering Cross-Workspace Notebook: Authentication Error
Hi there, I am running notebook "nb_load_stage_Customer" against lakehouse "Customer_LH" (silver workspace). The notebook reads data from the Files directory of the same lakehouse in the bronze workspace (bronze_WS) and then loads it into the stage delta table (Silver_workspace.Customer_LH/Tables/Schema_Customer/tbl_Stage_Customer). When I execute it from the notebook cell it works fine, but as soon as I trigger it from pipeline "pl_customer_stg_load" (which sits in the silver workspace) it fails with the error below:

{ "error": { "code": "Unauthorized", "message": "Authentication Failed with Bearer token is not present in the request" } }

PS: The pipeline parameters and values are fine. My initial finding is that the bearer token is not being passed between the two workspaces, so Fabric has trouble reading the data from the bronze workspace. I think a Managed Identity / service principal needs to be set up for cross-workspace authentication. Need help! Thank you!
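For reference, the notebook cell does roughly the following. This is a simplified sketch only: the workspace names Bronze_WS / Silver_WS, the parquet file format, and the exact paths are placeholders, and the absolute OneLake abfss:// paths are just one way of expressing the cross-workspace read.

```python
from pyspark.sql import SparkSession

# In a Fabric notebook the Spark session already exists; getOrCreate() just
# returns it (kept here so the sketch is self-contained).
spark = SparkSession.builder.getOrCreate()

# Hypothetical workspace/lakehouse names -- replace with your own. Absolute
# abfss:// OneLake paths avoid relying on the notebook's default lakehouse
# mount when reading across workspaces.
bronze_files = (
    "abfss://Bronze_WS@onelake.dfs.fabric.microsoft.com/"
    "Customer_LH.Lakehouse/Files/customer/"
)
silver_table_path = (
    "abfss://Silver_WS@onelake.dfs.fabric.microsoft.com/"
    "Customer_LH.Lakehouse/Tables/Schema_Customer/tbl_Stage_Customer"
)

# Assumed parquet source files; adjust format/options to match the real data.
df = spark.read.format("parquet").load(bronze_files)

# Overwrite the stage delta table in the silver lakehouse.
df.write.format("delta").mode("overwrite").save(silver_table_path)
```

Whatever identity runs the pipeline would also need access to the bronze workspace for this read to succeed when triggered outside the notebook session.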
How to create Hierarchy in Fabric Lakehouse Tables
Hi! Is it possible to create a hierarchy in Fabric Lakehouse tables? Appreciate your help and attention.
Incremental Refresh Pipeline
Hello everyone in the community! 👋 I have a question about a manufacturing environment I'm working on. My data is in a Lakehouse and consists of very wide tables sourced from BigQuery. I’m trying to create a pipeline that uses watermark values for incremental updates of these tables. So far, I have implemented the following:

- LookupOld: reads the Delta table that stores the latest watermark date, for comparison with the new one.
- LookupNew: retrieves the date of the last update (using a concatenation of two columns).

My incremental copy activity has a destination folder where TXT files with the suffix "incremental" are generated. However, I have a problem: I can't update the Old date at the end of the pipeline cycle. I tried doing this with a Stored Procedure, but I always receive the following error:

"Execution fail against SQL Server. Please contact SQL Server team if you need further support. Sql error number: 24559. Error Message: Data Manipulation Language (DML) statements are not supported for this table type in this version of SQL Server."

I understand that DML isn't supported on these tables, but it also doesn't let me save the variable. I'm looking for an alternative way to update Old with the latest update date so I can compute the difference between the lookups and achieve an incremental refresh. I’ve been following this tutorial and have tried saving variables and creating notebooks to manage the process (see the sketch below), but I still haven’t found a suitable solution for my context: Incrementally load data from Data Warehouse to Lakehouse - Microsoft Fabric | Microsoft Learn

Does anyone have any suggestions? I appreciate any help! 🫡
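This is the kind of notebook I've been sketching as the alternative to the stored procedure: read the old watermark, compute the new one, and write it back to the Delta table directly, instead of going through the SQL endpoint that rejects DML. All table and column names here are placeholders, not my real ones.

```python
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Hypothetical table names -- adjust to your lakehouse.
WATERMARK_TABLE = "tbl_watermark"   # holds one watermark row per source table
SOURCE_TABLE    = "wide_table"      # the freshly copied wide table

# 1. Read the current ("old") watermark, like the LookupOld activity does.
old_wm = (
    spark.table(WATERMARK_TABLE)
         .filter(F.col("table_name") == "wide_table")
         .select("last_update")
         .first()["last_update"]
)

# 2. Compute the new watermark from the latest data, like LookupNew
#    (max of the two concatenated columns).
new_wm = (
    spark.table(SOURCE_TABLE)
         .select(F.max(F.concat_ws("-", "update_date", "update_time")).alias("wm"))
         .first()["wm"]
)

print(f"old watermark: {old_wm} -> new watermark: {new_wm}")

# 3. Write the new watermark back to the Delta table. A notebook can do this
#    because it updates the Delta table directly rather than issuing DML
#    through the read-only SQL analytics endpoint.
(
    DeltaTable.forName(spark, WATERMARK_TABLE)
        .update(
            condition=F.col("table_name") == "wide_table",
            set={"last_update": F.lit(new_wm)},
        )
)
```

If something like this is the right direction, the notebook activity would run after the copy activity in the pipeline, replacing the stored procedure step.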
Learn Microsoft Fabric
skool.com/microsoft-fabric
Advance your data career by learning the hottest new data analytics platform 🔥 Don't learn alone, learn in our friendly community of Fabricators!