Wait, wouldn't this break if the program uses just SQL and no Python kernel? If they can't move it to PySpark, option B wouldn't apply. Are we sure Python is allowed here, or is there a native SQL scheduling trick in Databricks I missed?
Had something like this in a mock and B worked best. Wrapping the SQL in PySpark means you can use Python's datetime to check the day of the week and control that last query easily. Pretty sure that's the expected solution, but happy to hear if someone disagrees.
B for sure. Wrapping with PySpark lets you use Python's datetime to check if it's Sunday, so you can conditionally run the last query. All the other options either restrict access or change what runs each day, which isn't what's needed here. Pretty confident on this but open to other ideas if I missed something.
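To make the B approach concrete, here's a minimal sketch of the datetime check. The `is_sunday` helper and the commented-out query calls are hypothetical, just to show the shape of the wrapped notebook; the actual SQL and session setup would come from the original program.

```python
from datetime import date

def is_sunday(d: date) -> bool:
    # Python's weekday(): Monday is 0 ... Sunday is 6
    return d.weekday() == 6

# In the wrapped PySpark notebook you'd gate only the final query
# on this check (query text and session are placeholders):
#
# spark.sql("...")                      # queries that run every day
# if is_sunday(date.today()):
#     spark.sql("...")                  # final query, Sundays only
```

This keeps the daily schedule untouched and only makes the last statement conditional, which is why B fits better than changing the job schedule itself.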
Not D; B. Wrapping with PySpark lets you control exactly when the last query runs, unlike D, which just restricts access (not really what's needed here). If the requirement were to run the whole program only on Sundays, C would make sense, but the daily schedule was a must, right?