Option A is right: UTF-8 is the default encoding for CSV loads into Snowflake. Pretty standard for handling various languages and symbols out of the box. I don't think any of the others are even offered as defaults. Let me know if anyone's seen different behavior though.
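A quick sketch of why a UTF-8 default "just works" for multilingual CSV content — the sample row is made up, and this is plain Python, not anything Snowflake-specific:

```python
# Made-up CSV row mixing German, Japanese, and Cyrillic text.
csv_row = "id,city,note\n1,München,こんにちは\n2,São Paulo,Привет\n"

# UTF-8 round-trips every character in the row without configuration.
encoded = csv_row.encode("utf-8")
decoded = encoded.decode("utf-8")
assert decoded == csv_row

# A legacy single-byte codec can't even decode the same bytes.
try:
    encoded.decode("ascii")
except UnicodeDecodeError:
    print("ascii fails on the multilingual fields; utf-8 handles them")
```

That's the practical reason a UTF-8 default covers "various languages and symbols" with no extra file-format settings.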
B, C, and D are the right combo here. Users (A) just have metadata; their storage footprint isn't actually listed in the Information Schema or Account Usage views. Snowflake gives real storage stats for tables, databases, and internal stages. I think some people mix up metadata with storage reporting, so that's the trap option. Open to other takes if I'm missing a recent change!
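To make the distinction concrete, here's a toy rollup in the spirit of what Account Usage views like TABLE_STORAGE_METRICS report — the rows below are invented sample data, not real view output:

```python
# Invented sample rows: only objects with an actual storage footprint
# carry bytes; users exist as metadata only.
sample_rows = [
    {"object_type": "TABLE", "name": "ORDERS",    "bytes": 5_000_000},
    {"object_type": "TABLE", "name": "CUSTOMERS", "bytes": 2_000_000},
    {"object_type": "STAGE", "name": "LOAD_STG",  "bytes": 750_000},
    {"object_type": "USER",  "name": "ALICE",     "bytes": 0},
]

# Aggregate bytes per object type.
totals: dict = {}
for row in sample_rows:
    totals[row["object_type"]] = totals.get(row["object_type"], 0) + row["bytes"]

print(totals)  # tables and stages show real usage; users contribute nothing
```

Same idea as the question: tables, databases, and stages have measurable storage, while users never show up with a footprint of their own.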
Option C. Eliminating resource contention is the big deal here, since virtual warehouses let different workloads run without impacting each other. That's what separates Snowflake's architecture from others, I think. If someone sees it differently I'd be curious to hear why!
Definitely C here. Snowpipe keeps data load history for 64 days; that's fixed by Snowflake, not something you set when creating the pipe. Saw a similar question on a practice test. Someone correct me if this changed, but pretty sure that's still current.
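Here's a tiny sketch of what a fixed 64-day window means in practice — the helper name is mine, and only the 64-day constant comes from the question:

```python
from datetime import datetime, timedelta, timezone

# The retention window is fixed; there is no per-pipe knob for it.
LOAD_HISTORY_DAYS = 64

def load_still_visible(loaded_at: datetime, now: datetime) -> bool:
    """True if a load timestamp falls inside the fixed 64-day window."""
    return now - loaded_at <= timedelta(days=LOAD_HISTORY_DAYS)

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
assert load_still_visible(datetime(2024, 4, 15, tzinfo=timezone.utc), now)      # ~47 days ago
assert not load_still_visible(datetime(2024, 3, 1, tzinfo=timezone.utc), now)   # ~92 days ago
```

If you need load records beyond that window, you'd have to persist them yourself, since the window isn't configurable.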
I'd go with D, since a multi-cluster warehouse (MCW) auto-scales to handle concurrency with scheduled tasks, so you don't need to worry about sizing for overlapping runs. That matches what's in the official guide and most practice tests. Not 100% sure whether they're trying to trick us with single-instance scenarios, but D seems safest overall. Agree?
D. MCW lets Snowflake auto-scale the warehouse to handle peak concurrency for scheduled tasks, so you don't need to size it manually for overlapping runs. The other answers focus too much on single runs or stream size, which doesn't cover all cases. Pretty sure D is what they're after, but happy to hear other takes if I missed a use case.
Option D looks right, since MCW in Snowflake automatically manages concurrency for tasks, so you don't have to guess the warehouse size every time tasks might overlap. B feels less complete, honestly, because stream content size doesn't cover parallel task execution. I think D fits the typical scenario, but I'm not 100% sure there isn't some edge case I'm missing. Agree?
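For anyone who wants the intuition behind the auto-scaling answer, here's a toy model — my own simplification, not Snowflake's actual algorithm: clusters grow with concurrent task runs, clamped between the configured minimum and maximum cluster counts.

```python
# Assumed configuration values, analogous to MIN_CLUSTER_COUNT /
# MAX_CLUSTER_COUNT on a multi-cluster warehouse.
MIN_CLUSTERS = 1
MAX_CLUSTERS = 4
TASKS_PER_CLUSTER = 8  # assumed concurrency one cluster absorbs (made up)

def clusters_needed(concurrent_tasks: int) -> int:
    """Clusters required for a given number of overlapping task runs."""
    needed = -(-concurrent_tasks // TASKS_PER_CLUSTER)  # ceiling division
    return max(MIN_CLUSTERS, min(MAX_CLUSTERS, needed))

assert clusters_needed(0) == 1    # never drops below the minimum
assert clusters_needed(20) == 3   # ceil(20 / 8) = 3 clusters
assert clusters_needed(100) == 4  # capped at the maximum
```

The point for the question: the scale-out reacts to overlap automatically, which is why you don't hand-size the warehouse for peak concurrent runs.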
Feels like it's A; that's the recommended file size range from practice tests and the official exam guide. The other options reflect maximum limits, but the question asks about the best way to organize files for loading. Someone correct me if the system limit is really what Snowflake tests for here?
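Here's a rough sketch of organizing data into files near a target size for parallel loading — I'm assuming the commonly cited guidance of roughly 100-250 MB compressed per file, scaled down to bytes here so the example runs instantly; all names and sizes are illustrative:

```python
# Stand-in target: pretend 100 bytes ~ 100 MB for demonstration purposes.
TARGET_BYTES = 100

def split_into_files(payload: bytes, target: int = TARGET_BYTES) -> list:
    """Chunk a payload into pieces no larger than the target size."""
    return [payload[i:i + target] for i in range(0, len(payload), target)]

chunks = split_into_files(b"x" * 350)
assert len(chunks) == 4                            # 100 + 100 + 100 + 50
assert all(len(c) <= TARGET_BYTES for c in chunks)
assert sum(len(c) for c in chunks) == 350          # nothing lost
```

The design rationale is that many similarly sized files parallelize across loading compute far better than one file pushed up against a system maximum, which is why the exam favors the recommended range over the hard limit.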