6 contributions to Learn Microsoft Fabric
Open Opportunities
Hey folks. My company is hiring for TWO reporting positions where Fabric would be a major plus. Target start date would be late January. Applicants must be legally authorized to work in the United States without requiring sponsorship for employment visa status (e.g., H-1B status) now or in the future. The job listings are for hybrid work in the St. Louis, Missouri, USA area (2 days a week in-office). If you FAR exceed the requirements and don't live near St. Louis (and aren't willing to relocate), message me on LinkedIn and we can chat. I report to the hiring manager.

We're presently transitioning from Azure Synapse Analytics and AAS over to Fabric, so understanding Fabric's semantic models, Direct Lake, CI/CD, etc. will be a major plus. DP-600 would look great!

Here's one of the positions, BI Reporting Solution Lead: https://www.linkedin.com/jobs/view/4082188724/?refId=PUG7m619AKoLajwN%2BbPm6w%3D%3D&trackingId=PUG7m619AKoLajwN%2BbPm6w%3D%3D

For this one, the hiring manager is basically looking for someone with 5+ years of PBI experience who can interface with the users to refine reporting requirements, as well as review other PBI developers' work. In other words, a seasoned PBI Developer with some business analyst background would be ideal. You'd lead two other, more junior PBI Developers.

The other position isn't posted on LinkedIn yet, but it will require 3+ years of experience and will be a traditional Power BI Developer role, more of a "here are your stories, make the reports."

My LinkedIn: https://www.linkedin.com/in/tonykain/
Passed DP-600, Advice!
Howdy fellow Fabricators! I'm happy to report that I passed the DP-600 exam on the first try after two months of studying 15-30 minutes per day. My background is Data Engineering in Azure Synapse. Here are my takeaways.

1) Will's videos are a great base!
2) The practice tests from Microsoft are MUCH easier than the actual exam! Do not get complacent if you're doing well on those.
3) #1 BIGGEST TAKEAWAY: Learn how to navigate Microsoft Learn! Seriously, this saved me, no doubt. I spent about an hour just learning how Microsoft Learn is structured and how to navigate from one set of documentation to another. You need to learn how to navigate to the documentation for Fabric, PBI, DAX, M, and Power Query. You can go to Databricks for any PySpark you might not understand, too. You also need to understand the major keywords you'll need to search for based on the questions.
4) There was MUCH more about DAX than I expected, probably 6 questions. Same with M, probably 4 questions. Power Query was a solid 8 questions, with 3 of them being about data profiling.
5) Understanding what query folding is matters (or at least know that it's part of Power Query so you can search for it).
6) With the case study, you need to IMMEDIATELY know that there are a bunch of tabs you need to review for each question. For example, the question will reference "Data requirements" but not state what they are. That's because they're in the Data requirements tab.

Will Needham! If you read this, I think a VERY valuable video would be one where you go through a few practice test questions and show how to find the answers in MS Learn. I'm happy to collaborate with you on that if you'd like.
New comment 9d ago
0 likes • 9d
@Will Needham That's a good point. I hadn't thought of people struggling with time management using it. I know I answered at least 30 questions in less than a minute each, leaving me a lot of time for the questions I wasn't so sure about. Even for those, I at least had a concept of what the questions were asking, so finding the answers in MS Learn wasn't too time-consuming. That being said, I definitely see that it would be a good strategy to use the "Review" flag for questions you know you'll need to look up, then circle back to those after getting to the end.
[DP-600] Share your top tips for the exam
👇 Use this thread to share your top tips for the DP-600 exam, like I did in this lesson: #BONUS: TOP TIPS for the exam
New comment 10d ago
5 likes • 10d
#1 Learning how to navigate Microsoft Learn! If you can't remember certain specifics about things like roles or functions, you can find them there.
#2 Use practice tests beyond just Microsoft's. The MS practice tests are way too easy.
Accessing Azure Key Vault from MS Fabric
Hello mates, I have a requirement. Currently the client's data is in an on-prem third-party DB, and they want to move the data to Databricks. When we tried to convince them to use Microsoft Fabric, they said they do not want to move to Fabric because Key Vault is not supported in MS Fabric. I have explored the options with respect to Azure Databricks and MS Fabric. Below are my findings.

- The secrets in Azure Key Vault can be accessed directly in Azure Databricks by creating a secret scope in Databricks with the proper permissions (a quick sketch of this pattern follows below). Since the secrets are stored in Azure Key Vault, it's more secure and it is a centralised secrets management system.
- MS Fabric, on the other hand, doesn't support direct integration with Azure Key Vault. The alternative options are:
  1. Use Microsoft Fabric's built-in secrets management, where you can store the secrets directly in the Fabric environment and access them when needed.
  2. Store the secrets in Azure Key Vault and write a custom script to access the secrets from MS Fabric.

Apart from the above-mentioned options, can someone please help me with the proper approach, so I can provide the right solution for the client requirement and promote MS Fabric 😊
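For context on the first finding, here is a minimal sketch of the Databricks pattern, assuming a Key Vault-backed secret scope has already been created (the scope, key, and connection details below are hypothetical):

# Databricks notebook: read a secret from a (hypothetical) Key Vault-backed secret scope.
# The scope "kv-scope" must already exist and point at the Azure Key Vault.
db_password = dbutils.secrets.get(scope="kv-scope", key="onprem-db-password")

# The value is redacted if printed, but can be used normally, e.g. in a JDBC read:
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=ClientDB")  # hypothetical
      .option("user", "etl_user")                                                # hypothetical
      .option("password", db_password)
      .option("dbtable", "dbo.SourceTable")                                      # hypothetical
      .load())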
New comment 8d ago
2 likes • 11d
On 9/9 we had a Microsoft Product Manager who works with Alex Powers tell us that Key Vault was in private preview. That's all I got :( I haven't tested it in Fabric, but I used a workaround a few months back in Synapse that worked like this in a pipeline:

Web Activity
URL: 'https://' + key_vault + '.vault.azure.net/secrets/' + secret_name + '?api-version=7.0'
Method: GET
Authentication: System-assigned managed identity
Resource: https://vault.azure.net
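If you want the same trick inside a Fabric or Synapse notebook rather than a pipeline, a minimal sketch of that REST call might look like the following. It assumes the executing identity has been granted read access to the vault's secrets, and the vault and secret names are placeholders:

# Call the Key Vault REST API directly, mirroring the Web Activity above.
# Assumes the azure-identity and requests packages are available.
import requests
from azure.identity import DefaultAzureCredential

key_vault = "my-key-vault"     # hypothetical vault name
secret_name = "my-secret"      # hypothetical secret name

# Token scoped to the Key Vault resource (same resource as in the Web Activity).
token = DefaultAzureCredential().get_token("https://vault.azure.net/.default").token

url = f"https://{key_vault}.vault.azure.net/secrets/{secret_name}?api-version=7.0"
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
secret_value = resp.json()["value"]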
Spark in Microsoft Fabric... some doubts
I'm taking the 30-day Spark course, but I still have a few doubts, which are as follows.

DataFrame filtering: to read the contents of a table into a dataframe we can write:

df = spark.sql("SELECT * FROM SparkSetember.propertysales LIMIT 1000")

Subsequently, we can, for example, filter the table with the function:

df.filter(df.City.startswith("L")).show()

My question is this: why don't we do it straight away with:

df = spark.sql("SELECT * FROM SparkSetember.propertysales WHERE City LIKE 'L%'")
New comment Oct 13
2 likes • Oct 11
Joao, I just switched over to Spark about a year ago and definitely had a lot of similar questions. Will is right, it's a matter of preference in this case. The lazy evaluation will give the same results in both cases. I believe the point here is to first bring in any set of values from the table, then filter, just to demonstrate how to filter. You're correct though: filter as early as possible under normal circumstances.

I'll leave you with this nugget, too! One note on spark.sql statements that a lot of people don't point out in training videos is that if you want to expand the query to multiple lines like you would in native SQL environments, you can use a triple quote at the beginning and end of the query. You can even indent like you normally would to make the code look nice!

df = spark.sql("""
    SELECT *
    FROM SparkSeptember.propertysales
    WHERE City LIKE 'L%'
""")

This significantly improves readability and makes most SQL devs feel more at home.
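If you want to convince yourself that the SQL WHERE and the DataFrame filter end up in the same place under lazy evaluation, you can compare their plans. A minimal sketch, assuming the SparkSeptember.propertysales table exists in the attached lakehouse:

from pyspark.sql.functions import col  # the spark session is already available in a Fabric notebook

# Both are lazy; nothing is read until an action like show() or count().
sql_df = spark.sql("SELECT * FROM SparkSeptember.propertysales WHERE City LIKE 'L%'")
api_df = spark.table("SparkSeptember.propertysales").filter(col("City").startswith("L"))

# Catalyst optimizes both into essentially the same physical plan, with the filter pushed down.
sql_df.explain()
api_df.explain()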
Anthony Kain
@anthony-kain-6916
Lead Data Engineer with 5 heavy years of experience in TSQL and PySpark. Connect with me on LinkedIn: https://www.linkedin.com/in/tonykain/

Joined Sep 9, 2024
INTJ
St. Louis, MO