Q: 7
You have a Fabric tenant that contains a semantic model. The model uses Direct Lake mode.
You suspect that some DAX queries load unnecessary columns into memory.
You need to identify the frequently used columns that are loaded into memory.
What are two ways to achieve the goal? Each correct answer presents a complete solution.
NOTE: Each correct answer is worth one point.
Options
Discussion
B and C. A looks tempting, but it doesn't give you the per-column usage details the question asks for; pretty sure that's a common trap.
The recommendation from most guides and the official docs points to B and C. Vertipaq Analyzer and the relevant DMV let you analyze the memory footprint and usage per column. Anyone prepping should definitely practice with both tools for these scenarios.
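For anyone who wants to try this hands-on: a minimal sketch of the kind of DMV query involved. The thread doesn't name the DMV, so my assumption here is `DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS`, whose `TEMPERATURE` and `LAST_ACCESSED` columns indicate how recently and frequently a column segment has been touched, and `ISRESIDENT` shows whether it is currently loaded into memory. You can run something like this from DAX Studio or SSMS connected to the model's XMLA endpoint:

```sql
-- Sketch, assuming the DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS DMV:
-- lists per-column segment stats; hot (frequently used) columns show
-- higher TEMPERATURE and recent LAST_ACCESSED values.
-- Note: DMV queries support only a restricted SQL subset (no ORDER BY).
SELECT
    DIMENSION_NAME,   -- table name
    COLUMN_ID,        -- column name
    USED_SIZE,        -- memory used by the segment
    ISRESIDENT,       -- whether the segment is loaded into memory
    TEMPERATURE,      -- relative access frequency
    LAST_ACCESSED     -- last time the segment was read
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMN_SEGMENTS
```

Sorting the results client-side by `TEMPERATURE` makes the frequently used in-memory columns stand out, which is exactly what the question is after.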
B and C. Saw nearly the same scenario in a mock, and these match what you'd use for Direct Lake analysis. Not 100% sure, but this lines up with what I remember.