
Memberships

Learn Microsoft Fabric

Public • 4.2k • Free

10 contributions to Learn Microsoft Fabric
NB Issue writing to delta, help!
I've had this issue on and off for a while and can't figure it out. I've tried using: df.write.format("delta").mode("overwrite").save("File/path") and I get the result in the screenshot. When I use .saveAsTable, I get an error. Can't figure it out! Surely writing as delta either works or it doesn't — I still get a delta log and a parquet file, but it's like the table isn't recognised?? Has anyone come across this and can tell more from the error response? Bonus question: where do you go to diagnose errors? Thanks!
1
7
New comment 3d ago
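One quick sanity check for the situation described above ("I still get a delta log and a parquet"): verify that the folder you wrote to actually has the minimal Delta layout. This is a hedged, stdlib-only diagnostic of my own, not an official Fabric or Delta API — a folder can pass this heuristic and still have a corrupt log:

```python
from pathlib import Path

def looks_like_delta_table(path: str) -> bool:
    """Heuristic: a Delta table folder should contain a _delta_log
    directory with at least one JSON commit file, e.g.
    00000000000000000000.json. Parquet files without a _delta_log
    are what Fabric shows under 'Unidentified'."""
    log_dir = Path(path) / "_delta_log"
    if not log_dir.is_dir():
        return False
    return any(p.suffix == ".json" for p in log_dir.iterdir())
```

If this returns False for the path you passed to `.save(...)`, the write likely went somewhere other than where you are looking (relative vs. absolute Lakehouse paths are a common culprit).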
NB Issue writing to delta, help!
0 likes • 6d
@Sairam K thanks, are you doing this to the unidentified table or as part of the write process?
0 likes • 6d
@Will Needham Aha! Great advice Will, I thought it was not a very “factory” behaviour. I'll give that a try too. Many thanks
Notebooks in Fabric
I have a notebook that produces output, and I would like that output to go to a table I can leverage in a Power BI model. Has anyone done that in Fabric who can give me some tips?
1
7
New comment 29d ago
2 likes • 30d
https://radacad.com/power-bi-default-semantic-model-or-custom-a-guide-for-using-in-fabric-environment#:~:text=Custom%20Semantic%20Model,them%20to%20this%20semantic%20model
0 likes • 29d
@Liliana Torres it depends whether you are adding new tables; perhaps easier is having the tables there already and then appending the new data. If it's new tables, I'm sure there is a way to add them automatically to the default model (someone might correct me — this may already be the case), but I would try to append.
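The append pattern suggested above can be sketched as follows. This is a hedged sketch, not Fabric-specific guidance: `df` is assumed to be a PySpark DataFrame already computed in the notebook, and "sales_summary" is a placeholder table name. `saveAsTable` writes a managed Delta table into the Lakehouse, where a Power BI semantic model can then use it:

```python
# Sketch: persist a notebook's output DataFrame as a managed Lakehouse
# table. `df` is assumed to be a pyspark.sql.DataFrame; the table name
# below is a placeholder.

def save_output_as_table(df, table_name: str, mode: str = "append") -> None:
    """Write `df` as a managed Delta table.

    mode="append" adds new rows on each run (as suggested in the
    comment above); mode="overwrite" replaces the table contents.
    """
    (df.write
       .format("delta")     # Lakehouse tables are Delta tables
       .mode(mode)
       .saveAsTable(table_name))

# At the end of the notebook you would call, e.g.:
# save_output_as_table(df, "sales_summary")
```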
DP-600 Pass!
After a failed virtual attempt I went the in-person route and got a different result! The good news is I'm part of the DP-600 crew. Thanks again for all the support Will and community! 🦄 Fabricorn!
10
5
New comment 29d ago
1 like • 29d
The first time I sat the test I spent quite a bit of time poring over the case study, only to realise that each time you are asked a question you can revert to the text to see what you need for your answer. So I would suggest skim-reading the initial case study, knowing you can revisit it at any point when answering the questions. Aside from that: semantic models, calculation groups, DAX, dynamic strings and a few key things that have become recent features, and the rest is just study! I didn't know SQL well, so I had to brush up on some of the T-SQL statements that are regularly used in Fabric.
PowerBI recommendations
Hello, I've been going through the learning path on MS Learn for the certification. It was going quite nicely until, in the last part, I realized the path actually assumes I know something about Power BI. The thing is, I really don't know much about it and did not realize this when starting out. I have some experience with Tableau, so I'm not a complete beginner with BI tools. What level of knowledge should I have to comfortably pass e.g. the DAX questions? Do you have suggestions for e.g. MS Learn modules that suit this purpose? I'm considering getting the PL-300 cert at some point, but I'm focusing on my data transformation and modeling skills for now (and my Fabric trial is running), so I'm not going to go through that process yet.
5
14
New comment 6d ago
1 like • 29d
If you focus in on the particular DAX and Power BI they have in questions and are proficient in the other modules, you should be OK. Just make sure you understand some of the filter context stuff — Calculate, Filter, CreateTable etc. — and then know dynamic strings, calculation groups etc. It's all in the guides.
Triggering Fabric Pipelines
Our current strategy is to use Databricks as a 'data engine', bringing relevant data into Databricks before shortcutting it to Fabric (Bronze) for subsequent processing and consumption. My question: is there a way to orchestrate the start of a Fabric data pipeline after the Databricks job that moves the raw data to Bronze finishes? It might seem a little unconventional to use Databricks to extract the data initially, but that is the direction we have adopted. Any guidance on how we might trigger the pipeline (in Fabric) after the Databricks extraction would be much appreciated.
1
2
New comment 30d ago
0 likes • 30d
You can set a scheduled run at the moment; more triggers are being added.
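Beyond schedules, one workaround is to have the final Databricks task call Fabric's REST API to start an on-demand pipeline run. This is a hedged sketch: the endpoint shape (`POST .../items/{itemId}/jobs/instances?jobType=Pipeline`) is my reading of the public Fabric job-scheduler REST docs and may change, and the workspace/item IDs and token acquisition are placeholders you would fill in:

```python
# Hypothetical sketch: build the request that starts an on-demand run of
# a Fabric pipeline. IDs and the bearer token are placeholders; in
# Databricks you would acquire an Entra ID token for the Fabric scope.
import json
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_run_pipeline_request(workspace_id: str,
                               pipeline_id: str,
                               token: str) -> urllib.request.Request:
    """Return a POST request that asks Fabric to run the pipeline item."""
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),  # empty body; add parameters if needed
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# As the last task of the Databricks job you would then send it:
# urllib.request.urlopen(build_run_pipeline_request(ws_id, pl_id, aad_token))
```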
Ross Garrett
2
3 points to level up
@ross-garrett-5838
Data Analyst/engineer with a few years' experience, but really still learning. Looking forward to working with Fabric more when I have full access.

Active 3d ago
Joined Jun 3, 2024
ENFJ
Canberra, Australia