Microsoft AZ-305 Exam Questions 2025

Updated: October 03, 2025

Our AZ-305 Exam Questions give you thorough, real exam practice for the Microsoft Azure Solutions Architect Expert certification. Each question is reviewed by Azure architects and includes verified answers with clear explanations and references, covering design, security, and governance. Try free sample questions and our online simulator to prepare confidently with Cert Empire.

 

Exam Questions

Question 1

You plan to deploy multiple instances of an Azure web app across several Azure regions. You need to design an access solution for the app. The solution must meet the following requirements:

- Support rate limiting.
- Balance requests between all instances.
- Ensure that users can access the app in the event of a regional outage.

Solution: You use Azure Load Balancer to provide access to the app.

Does this meet the goal?
Options
A: Yes
B: No
Correct Answer:
No
Explanation
The proposed solution is incorrect. Azure Load Balancer is a regional, Layer 4 (TCP/UDP) service. It cannot meet the specified requirements. Firstly, it does not natively support rate limiting, which is a Layer 7 (HTTP/S) feature typically handled by services like Azure Application Gateway WAF or Azure Front Door. Secondly, as a regional service, a standard Azure Load Balancer cannot balance traffic across multiple Azure regions or provide automatic failover in the event of a regional outage. A global load balancing solution, such as Azure Front Door or Azure Traffic Manager, is required to route traffic across regions and ensure high availability during a regional failure.
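As a rough sketch of the recommended alternative (all resource names here are invented, and the rate-limit rule still needs a match condition and Front Door endpoints/origins added afterward), Azure Front Door with a WAF rate-limit rule can be provisioned from the Azure CLI roughly like this:

```shell
# Global entry point: an Azure Front Door (Premium) profile.
az afd profile create \
  --resource-group rg-app \
  --profile-name fd-app1 \
  --sku Premium_AzureFrontDoor

# WAF policy attached to Front Door, holding the rate-limit rule.
az network front-door waf-policy create \
  --resource-group rg-app \
  --name wafApp1 \
  --sku Premium_AzureFrontDoor

# Rate-limit rule: block clients exceeding 100 requests per minute.
# --defer postpones submission until a match condition is added.
az network front-door waf-policy rule create \
  --resource-group rg-app \
  --policy-name wafApp1 \
  --name RateLimitRule1 \
  --priority 1 \
  --rule-type RateLimitRule \
  --rate-limit-duration 1 \
  --rate-limit-threshold 100 \
  --action Block \
  --defer
```

Regional web app instances would then be registered as origins in a Front Door origin group, which provides the cross-region load balancing and failover that Azure Load Balancer cannot.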
Why Incorrect Options are Wrong

A. Yes: This is incorrect because Azure Load Balancer is a regional Layer 4 service and lacks the required global routing, regional failover, and native rate-limiting capabilities.

References

1. Azure Architecture Center - Load-balancing options. This document explicitly states, "For global routing, we recommend Azure Front Door." It also categorizes Azure Load Balancer as a Regional load balancer, contrasting it with Global options like Front Door and Traffic Manager, which are necessary for regional outage scenarios.

Source: Microsoft Learn, Azure Architecture Center. (2023). Load-balancing options. Section: "Azure load-balancing services".

2. Azure Load Balancer overview. This documentation confirms that "Azure Load Balancer operates at layer 4 of the Open Systems Interconnection (OSI) model" and is a regional resource, which means it cannot route traffic between regions.

Source: Microsoft Learn. (2023). What is Azure Load Balancer?. Section: "Introduction".

3. Web Application Firewall (WAF) rate limiting. This document details how rate limiting is a feature of Azure Application Gateway WAF and Azure Front Door, not Azure Load Balancer. It states, "Rate limiting allows you to detect and block abnormally high levels of traffic from any client IP address."

Source: Microsoft Learn. (2023). Rate limiting on Azure Application Gateway. Section: "Overview".

Question 2

You are developing a sales application that will contain several Azure cloud services and handle different components of a transaction. Different cloud services will process customer orders, billing, payment, inventory, and shipping. You need to recommend a solution to enable the cloud services to asynchronously communicate transaction information by using XML messages. What should you include in the recommendation?
Options
A: Azure Data Lake
B: Azure Notification Hubs
C: Azure Queue Storage
D: Azure Service Fabric
Correct Answer:
Azure Queue Storage
Explanation
Azure Queue Storage is a service for storing large numbers of messages that can be accessed from anywhere in the world. It is designed for building scalable, decoupled applications. In this scenario, the different cloud services (orders, billing, inventory) can communicate asynchronously by placing XML messages into a queue. The sending service adds a message and can continue its work, while the receiving service can retrieve and process the message when it is ready. This pattern effectively decouples the components, improving the application's overall reliability and scalability, which is ideal for handling different stages of a transaction.
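The decoupled, asynchronous exchange described above can be sketched locally. In this minimal sketch, Python's standard-library `queue.Queue` stands in for the Azure queue; with the real service, the send and receive steps would instead call `send_message`/`receive_messages` on a `QueueClient` from the `azure-storage-queue` SDK:

```python
import queue
import threading
import xml.etree.ElementTree as ET

# Local stand-in for Azure Queue Storage.
q = queue.Queue()

def send_order(order_id: str, amount: str) -> None:
    # Serialize the transaction step as an XML message and enqueue it;
    # the sender continues without waiting for the consumer.
    root = ET.Element("order", id=order_id)
    ET.SubElement(root, "amount").text = amount
    q.put(ET.tostring(root, encoding="unicode"))

def billing_worker(results: list) -> None:
    # The billing service dequeues and processes the message when it is
    # ready, independently of the sender (asynchronous, decoupled).
    msg = q.get()
    order = ET.fromstring(msg)
    results.append((order.get("id"), order.find("amount").text))
    q.task_done()

results: list = []
send_order("1001", "49.99")
t = threading.Thread(target=billing_worker, args=(results,))
t.start()
t.join()
print(results)  # [('1001', '49.99')]
```

The same shape carries over to the orders, inventory, and shipping services: each reads from its own queue and never blocks the service upstream of it.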
Why Incorrect Options are Wrong

A. Azure Data Lake is a scalable data storage and analytics service. It is designed for big data workloads, not for real-time, transactional messaging between services.

B. Azure Notification Hubs is a massively scalable mobile push notification engine. Its purpose is to send notifications to client applications on various platforms, not for backend service-to-service communication.

D. Azure Service Fabric is a distributed systems platform for building and deploying microservices. While you could build a messaging system on it, it is not the messaging service itself.

References

1. Microsoft Documentation, "What is Azure Queue Storage?": "Azure Queue Storage is a service for storing large numbers of messages. You access messages from anywhere in the world via authenticated calls using HTTP or HTTPS. A queue message can be up to 64 KB in size. A queue may contain millions of messages, up to the total capacity limit of a storage account. Queues are commonly used to create a backlog of work to process asynchronously."

Source: Microsoft Docs, Azure Storage Documentation, Queues.

2. Microsoft Documentation, "Storage queues and Service Bus queues - compared and contrasted": "Azure Queue Storage... provides a simple REST-based Get/Put/Peek interface, providing reliable, persistent messaging within and between services... Use Queue storage when you need to store over 80 gigabytes of messages in a queue [and] you want a simple, easy to use queue." This document highlights its use for decoupling application components for increased scalability and reliability.

Source: Microsoft Docs, Azure Architecture Center, Application integration.

3. Microsoft Documentation, "What is Azure Notification Hubs?": "Azure Notification Hubs provide an easy-to-use and scaled-out push engine that allows you to send notifications to any platform (iOS, Android, Windows, etc.) from any back-end (cloud or on-premises)."

Source: Microsoft Docs, Azure Notification Hubs Documentation, Overview.

4. Microsoft Documentation, "Introduction to Azure Data Lake Storage Gen2": "Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob Storage."

Source: Microsoft Docs, Azure Storage Documentation, Data Lake Storage.

Question 3

Your company has the divisions shown in the following table. [The table is shown as an image in the original question and is not reproduced here.] Sub1 contains an Azure App Service web app named App1. App1 uses Azure AD for single-tenant user authentication. Users from contoso.com can authenticate to App1. You need to recommend a solution to enable users in the fabrikam.com tenant to authenticate to App1. What should you recommend?
Options
A: Configure the Azure AD provisioning service.
B: Configure Supported account types in the application registration and update the sign-in endpoint.
C: Configure assignments for the fabrikam.com users by using Azure AD Privileged Identity Management (PIM).
D: Enable Azure AD pass-through authentication and update the sign-in endpoint
Correct Answer:
Configure Supported account types in the application registration and update the sign-in endpoint.
Explanation
The application is currently configured as a single-tenant app, which restricts authentication to users within its home tenant (contoso.com). To allow users from an external Azure AD tenant (fabrikam.com) to authenticate, the application must be reconfigured to be multi-tenant. This is accomplished by modifying the "Supported account types" setting in the application's registration within the Azure portal. Changing this setting to "Accounts in any organizational directory (Any Azure AD directory - Multitenant)" makes the application available to users from any Azure AD tenant. The application's sign-in endpoint logic must also be updated to handle requests from the generic /organizations or /common endpoint instead of the tenant-specific one.
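As a sketch of the change (the app object ID below is a placeholder for App1's registration), the same "Supported account types" switch can be made with the Azure CLI:

```shell
# Switch the app registration from single-tenant to multi-tenant
# (accounts in any Azure AD organizational directory).
# <app-object-id> is a placeholder for App1's registration object ID.
az ad app update --id <app-object-id> --sign-in-audience AzureADMultipleOrgs
```

After this change, the app's authentication code signs users in against the multi-tenant authority (https://login.microsoftonline.com/organizations) instead of the contoso.com tenant-specific endpoint, so fabrikam.com users can authenticate.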
Why Incorrect Options are Wrong

A. The Azure AD provisioning service automates creating and managing user identities in other applications; it does not configure an application's authentication audience.

C. Azure AD Privileged Identity Management (PIM) is used to manage, control, and monitor access to privileged roles, not to enable standard cross-tenant user authentication.

D. Azure AD pass-through authentication is a sign-in method for hybrid identity that validates user passwords against an on-premises Active Directory; it is not relevant for cross-tenant authentication.

References

1. Microsoft Documentation: How to: Sign in any Azure Active Directory user using the multi-tenant application pattern.

Reference: In the section "Update the registration to be multi-tenant," the document states: "If you have an existing application and you want to make it multi-tenant, you need to open the application registration in the Azure portal and update Supported account types to Accounts in any organizational directory." This directly supports the chosen answer.

2. Microsoft Documentation: Quickstart: Register an application with the Microsoft identity platform.

Reference: In the "Register an application" section, step 4, "Supported account types," explicitly defines the option "Accounts in any organizational directory (Any Azure AD directory - Multitenant)" as the method to allow users with a work or school account from any organization to sign into the application.

3. Microsoft Documentation: Tenancy in Azure Active Directory.

Reference: The "App-level considerations" section explains the difference between single-tenant and multi-tenant applications. It clarifies that a multi-tenant application is "available to users in both its home tenant and other tenants." This conceptual document underpins the need to change the application's tenancy model to meet the requirement.

Question 4

You need to design a highly available Azure SQL database that meets the following requirements:

- Failover between replicas of the database must occur without any data loss.
- The database must remain available in the event of a zone outage.
- Costs must be minimized.

Which deployment option should you use?
Options
A: Azure SQL Database Premium
B: Azure SQL Database Hyperscale
C: Azure SQL Database Basic
D: Azure SQL Managed Instance Business Critical
Correct Answer:
Azure SQL Database Premium
Explanation
The Azure SQL Database Premium tier is the most appropriate choice. It supports zone-redundant configurations, which provision replicas in different availability zones within the same region. This architecture uses synchronous replication, ensuring that failovers occur with zero data loss (Recovery Point Objective - RPO=0) and that the database remains available during a zone-level outage. Compared to Hyperscale and Managed Instance Business Critical, the Premium tier provides these high-availability features at a lower cost, thus satisfying the "costs must be minimized" requirement for workloads that do not require the massive scale of Hyperscale or the instance-level features of Managed Instance.
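Assuming invented resource names, a zone-redundant Premium database can be provisioned with the Azure CLI roughly as follows:

```shell
# Create a Premium (P1) database with zone redundancy: synchronous
# replicas are placed in separate availability zones, so a zone
# outage fails over with no data loss (RPO = 0).
az sql db create \
  --resource-group rg-data \
  --server sqlserver1 \
  --name db1 \
  --edition Premium \
  --service-objective P1 \
  --zone-redundant true
```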
Why Incorrect Options are Wrong

B. Azure SQL Database Hyperscale: While it supports zone redundancy, this tier is designed for very large databases (VLDBs) and is not the most cost-effective option for general high-availability scenarios.

C. Azure SQL Database Basic: This tier does not support zone-redundant configurations and cannot meet the requirement to remain available during a zone outage.

D. Azure SQL Managed Instance Business Critical: This option meets the availability and data-loss requirements but is generally more expensive than Azure SQL Database Premium, failing the cost-minimization constraint.

References

1. Microsoft Documentation, "High availability for Azure SQL Database and SQL Managed Instance": Under the "Zone-redundant availability" section, it states, "Zone-redundant configuration is available for databases in the... Premium, Business Critical, and Hyperscale service tiers... When you provision a database or an elastic pool with zone redundancy, Azure SQL creates multiple synchronous secondary replicas in other availability zones." This confirms that Premium meets the zone outage and no data loss requirements.

2. Microsoft Documentation, "vCore purchasing model - Azure SQL Database": The "Premium service tier" section describes it as being designed for "I/O-intensive workloads that require high availability and low-latency I/O." The documentation confirms that zone redundancy is a configurable option for this tier.

3. Microsoft Documentation, "Service Tiers in the DTU-based purchase model": This document shows that the Basic tier has a "Basic availability" model with a single database file and is not designed for high availability or zone redundancy.

4. Microsoft Documentation, "Compare the vCore and DTU-based purchasing models of Azure SQL Database": This page highlights that the Premium tier (in both models) is designed for high performance and high availability, whereas Managed Instance is for "lift-and-shift of the largest number of SQL Server applications to the cloud with minimal changes," which often comes at a higher price point.

Question 5

DRAG DROP

You have an on-premises app named App1. Customers use App1 to manage digital images. You plan to migrate App1 to Azure.

You need to recommend a data storage solution for App1. The solution must meet the following image storage requirements:

- Encrypt images at rest.
- Allow files up to 50 MB.

[The answer area is shown as an image in the original question; it also asks which service to recommend for storing customer accounts.]

Correct Answer:

IMAGE STORAGE: AZURE BLOB STORAGE

CUSTOMER ACCOUNTS: AZURE SQL DATABASE

Explanation

Azure Blob storage is the optimal choice for image storage. It's specifically designed to store massive amounts of unstructured data, such as images, videos, and documents. It easily accommodates files up to 50 MB and provides server-side encryption by default, satisfying both requirements. Storing large binary files directly in a database is generally inefficient and not recommended.


Azure SQL Database is the most suitable service for customer accounts. Customer account data is typically structured and relational (e.g., user ID, name, email, password). As a fully managed relational database-as-a-service, Azure SQL Database provides transactional consistency, data integrity, and robust querying capabilities, which are essential for managing user account information effectively.

References

Azure Blob Storage Documentation: Microsoft's official documentation states that Azure Blob storage is optimized for storing massive amounts of unstructured data. Common use cases include "Serving images or documents directly to a browser" and "Storing files for distributed access."

Source: Microsoft Docs, "Introduction to Azure Blob storage," Use cases section.

Azure SQL Database Documentation: The official documentation describes Azure SQL Database as a fully managed relational database service built for the cloud. It is ideal for applications that require a relational data model with transactional consistency and data integrity, making it a standard choice for storing structured data like user profiles and customer accounts.

Source: Microsoft Docs, "What is Azure SQL Database?," Overview section.

Comparison of Azure Storage Options: Microsoft's "Choose a data storage approach in Azure" guide recommends Blob storage for "images, videos, documents...large binary objects" and relational databases like Azure SQL Database for "transactional data" and data requiring a "high degree of integrity," such as customer information.

Source: Microsoft Azure Architecture Center, "Choose a data storage approach in Azure," Relational databases and Blob storage sections.

Question 6

You have a multi-tier app named App1 and an Azure SQL database named SQL1. The back-end service of App1 writes data to SQL1. Users use the App1 client to read the data from SQL1. During periods of high utilization, the users experience delays retrieving the data. You need to minimize how long it takes to retrieve the data. What should you include in the solution?
Options
A: Azure Synapse Analytics
B: Azure Content Delivery Network (CDN)
C: Azure Data Factory
D: Azure Cache for Redis
Correct Answer:
Azure Cache for Redis
Explanation
The scenario describes read-latency issues with an Azure SQL database during periods of high utilization. Azure Cache for Redis is an in-memory data store that provides a high-throughput, low-latency caching solution. By implementing a caching layer with Redis, frequently accessed data can be stored in memory. When the application requests data, it first checks the Redis cache. If the data is present (a cache hit), it is returned immediately, avoiding a slower query to the SQL database. This significantly reduces data retrieval times for users and lessens the load on the database, directly addressing the performance bottleneck.
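The cache-aside flow described above can be sketched with a plain dict standing in for the cache; with the real service, the reads and writes against `cache` would instead be `get`/`set` calls on a `redis` client pointed at the Azure Cache for Redis endpoint:

```python
import time

# Stand-in for Azure Cache for Redis.
cache: dict = {}

def query_database(key: str) -> str:
    # Simulates the slow SQL1 query that users wait on under load.
    time.sleep(0.05)
    return f"row-for-{key}"

def get_data(key: str) -> str:
    # Cache-aside: return from the cache on a hit; otherwise query the
    # database once and populate the cache for subsequent readers.
    if key in cache:
        return cache[key]
    value = query_database(key)
    cache[key] = value
    return value

first = get_data("customer:42")   # cache miss -> hits the database
second = get_data("customer:42")  # cache hit -> served from memory
print(first == second)  # True
```

Only the first read pays the database round-trip; every later read of the same key is served from memory, which is what relieves SQL1 during peak load.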
Why Incorrect Options are Wrong

A. Azure Synapse Analytics is a large-scale data warehousing and big data analytics service, not designed for low-latency transactional application caching.

B. Azure Content Delivery Network (CDN) is used to cache static web content (like images and scripts) at edge locations, not dynamic data from a database.

C. Azure Data Factory is a cloud-based data integration (ETL/ELT) service for orchestrating data movement and transformation, not for real-time application performance improvement.

References

1. Microsoft Documentation, Azure Cache for Redis. "What is Azure Cache for Redis?". Under the section "Common scenarios," the first listed scenario is "Data cache." It states, "It's a common technique to cache data in-memory... to improve the performance of an application. Caching with Azure Cache for Redis can increase performance by orders of magnitude."

2. Microsoft Documentation, Azure Architecture Center. "Cache-Aside pattern". This document describes the exact pattern for solving the problem in the question: "Load data on demand from a data store into a cache. This can improve performance and also helps to maintain consistency between data held in the cache and data in the underlying data store."

3. Microsoft Documentation, Azure Synapse Analytics. "What is Azure Synapse Analytics?". The overview clearly defines it as "a limitless analytics service that brings together data integration, enterprise data warehousing, and big data analytics." This is distinct from an application performance cache.

Question 7

You need to design a highly available Azure SQL database that meets the following requirements:

- Failover between replicas of the database must occur without any data loss.
- The database must remain available in the event of a zone outage.
- Costs must be minimized.

Which deployment option should you use?
Options
A: Azure SQL Database Standard
B: Azure SQL Database Serverless
C: Azure SQL Managed Instance General Purpose
D: Azure SQL Database Premium
Correct Answer:
Azure SQL Database Serverless
Explanation
The solution requires availability during a zone outage, no data loss on failover, and minimal cost. The Azure SQL Database Serverless compute tier, which is part of the General Purpose service tier, meets all these requirements. It supports a zone-redundant configuration that synchronously replicates data across multiple availability zones within a region, ensuring both high availability and zero data loss (RPO=0). Compared to the Premium tier, which also offers zone redundancy, the General Purpose/Serverless tier is the more budget-oriented option, thus satisfying the requirement to minimize costs.
Why Incorrect Options are Wrong

A. Azure SQL Database Standard: This service tier does not support zone-redundant configurations and cannot meet the requirement for availability during a zone outage.

C. Azure SQL Managed Instance General Purpose: This service tier does not support zone redundancy. Only the Business Critical tier for SQL Managed Instance offers this capability.

D. Azure SQL Database Premium: While this tier supports zone redundancy and ensures no data loss, it is more expensive than the Serverless/General Purpose tier, failing the cost minimization requirement.

References

1. Microsoft Learn | High availability for Azure SQL Database and SQL Managed Instance: Under the "Zone-redundant availability" section, it states, "Zone-redundant availability is available for databases in the General Purpose, Premium, Business Critical, and Hyperscale service tiers." It also explicitly states, "Zone redundancy for the serverless compute tier of the General Purpose service tier is generally available." This confirms that Serverless (B) and Premium (D) support zone redundancy, while Managed Instance General Purpose (C) does not.

2. Microsoft Learn | vCore purchasing model overview - Azure SQL Database: This document compares the service tiers. The "General Purpose service tier" section describes it as a "budget-oriented" option suitable for "most business workloads." The "Premium service tier" is described as being for "I/O-intensive production workloads." This supports the choice of a General Purpose-based option (Serverless) for cost minimization over Premium.

3. Microsoft Learn | Serverless compute tier for Azure SQL Database: This document details the cost model for Serverless, stating it "bills for the amount of compute used per second." This model is designed to optimize costs, particularly for workloads with intermittent usage patterns, reinforcing its position as the most cost-effective choice among the zone-redundant options.

Question 8

You have an on-premises Microsoft SQL Server instance named SQL1 that hosts 50 databases. You plan to migrate SQL1 to Azure SQL Managed Instance. You need to perform an offline migration of SQL1. The solution must minimize administrative effort. What should you include in the solution?
Options
A: SQL Server Migration Assistant (SSMA)
B: Azure Migrate
C: Data Migration Assistant (DMA)
D: Azure Database Migration Service
Correct Answer:
Azure Database Migration Service
Explanation
Azure Database Migration Service (DMS) is a fully managed service designed to enable seamless, large-scale database migrations to Azure data platforms. For an offline migration of 50 databases from an on-premises SQL Server to Azure SQL Managed Instance, DMS provides an orchestrated and resilient workflow. It can use native full database backups stored in Azure Blob Storage to restore the databases to the target instance. This approach is highly efficient, scalable for many databases, and significantly minimizes the administrative effort required compared to using standalone tools for each database.
Why Incorrect Options are Wrong

A. SQL Server Migration Assistant (SSMA): SSMA is primarily for assessing and migrating from heterogeneous (non-SQL) database sources like Oracle or DB2 to SQL Server or Azure SQL, not for SQL-to-SQL migrations.

B. Azure Migrate: Azure Migrate is a central hub for discovery, assessment, and migration planning. For the actual database migration execution, it integrates with and uses Azure Database Migration Service (DMS).

C. Data Migration Assistant (DMA): DMA is primarily an assessment tool to identify compatibility issues. While it can perform small-scale migrations, it is not designed for orchestrating the migration of many databases, which would increase administrative effort.

References

1. Azure Database Migration Service Documentation: "Tutorial: Migrate SQL Server to Azure SQL Managed Instance offline using DMS". This official tutorial explicitly states, "You can use Azure Database Migration Service to migrate the databases from an on-premises SQL Server instance to an Azure SQL Managed Instance." It details the offline migration process using native backups, which is the scenario described.

Source: Microsoft Docs, "Tutorial: Migrate SQL Server to Azure SQL Managed Instance offline using DMS", Prerequisites section.

2. Azure Database Migration Service Overview: "Azure Database Migration Service is a fully managed service designed to enable seamless migrations from multiple database sources to Azure Data platforms with minimal downtime." This highlights its role as a managed, orchestrated service, which aligns with minimizing administrative effort.

Source: Microsoft Docs, "What is Azure Database Migration Service?", Overview section.

3. Data Migration Assistant (DMA) Documentation: "Data Migration Assistant (DMA) helps you upgrade to a modern data platform by detecting compatibility issues that can impact database functionality... After assessing, DMA helps you migrate your schema, data, and uncontained objects from your source server to your target server." This positions DMA as an assessment tool with migration capabilities, but not as the primary orchestration service for large-scale migrations like DMS.

Source: Microsoft Docs, "Overview of Data Migration Assistant", Introduction section.

Question 9

HOTSPOT



You have an app that generates 50,000 events daily.


You plan to stream the events to an Azure event hub and use Event Hubs Capture to implement cold path processing of the events. The output of Event Hubs Capture will be consumed by a reporting system.


You need to identify which type of Azure storage must be provisioned to support Event Hubs Capture, and which inbound data format the reporting system must support.


What should you identify? To answer, select the appropriate options in the answer area.

Correct Answer:

STORAGE TYPE: AZURE DATA LAKE STORAGE GEN2

DATA FORMAT: AVRO

Explanation

Azure Event Hubs Capture automatically archives streaming data to a user-specified storage container. This feature supports either an Azure Blob Storage or an Azure Data Lake Storage Gen2 account for storing the captured data. Therefore, Azure Data Lake Storage Gen2 is a valid storage type to provision.


The data is always written in the Apache Avro format, which is a compact, fast, binary format that includes the schema inline. Consequently, any downstream reporting system consuming the data from the capture destination must be able to read and process files in the Avro format.

References

Microsoft Azure Documentation, "Overview of Event Hubs Capture."

Section: Introduction

Content: "Event Hubs Capture enables you to automatically deliver the streaming data in Event Hubs to an Azure Blob storage or Azure Data Lake Storage account of your choice... Captured data is written in Apache Avro format: a compact, fast, binary format that provides rich data structures with inline schema."

Microsoft Azure Documentation, "Capture streaming events using the Azure portal."

Section: Enable Event Hubs Capture

Content: "For Capture provider, select Azure Storage Account... Event Hubs writes the captured data in Apache Avro format." This section details the configuration where the user must select a compatible storage account type.

Question 10

You are designing an app that will include two components. The components will communicate by sending messages via a queue. You need to recommend a solution to process the messages by using a First in. First out (FIFO) pattern. What should you include in the recommendation?
Options
A: storage queues with a custom metadata setting
B: Azure Service Bus queues with sessions enabled
C: Azure Service Bus queues with partitioning enabled
D: storage queues with a stored access policy
Correct Answer:
Azure Service Bus queues with sessions enabled
Explanation
Azure Service Bus is the appropriate service for scenarios requiring guaranteed First-In, First-Out (FIFO) message ordering. While a standard Service Bus queue does not guarantee FIFO when multiple competing consumers are present, enabling the sessions feature does. Message sessions group a sequence of related messages, and a session-aware receiver locks the session, ensuring all messages from that specific session are processed in the order they were sent by a single consumer. This provides a strict, ordered handling of messages, fulfilling the FIFO requirement.
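A minimal local sketch of the session semantics follows: messages sharing a session ID are delivered to one receiver in the order they were sent. In the `azure-servicebus` SDK, the receiver side corresponds to opening a session-aware receiver for a given session ID via `ServiceBusClient.get_queue_receiver(queue_name=..., session_id=...)`:

```python
from collections import defaultdict, deque

# Stand-in for a Service Bus queue with sessions: one ordered
# sub-queue per session ID.
queue_by_session: defaultdict = defaultdict(deque)

def send(session_id: str, body: str) -> None:
    # Messages with the same session ID land in the same session.
    queue_by_session[session_id].append(body)

def receive_session(session_id: str) -> list:
    # A session-aware receiver locks the session and drains it FIFO,
    # so the related messages are processed by a single consumer
    # in the order they were sent.
    received = []
    q = queue_by_session[session_id]
    while q:
        received.append(q.popleft())
    return received

for step in ["order-created", "payment-captured", "order-shipped"]:
    send("order-1001", step)

print(receive_session("order-1001"))
# ['order-created', 'payment-captured', 'order-shipped']
```

The session lock is what prevents competing consumers from interleaving a session's messages, which is why sessions, rather than a plain queue, deliver the FIFO guarantee.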
Why Incorrect Options are Wrong

A. storage queues with a custom metadata setting: Azure Storage Queues are designed for high-throughput and do not guarantee FIFO ordering. Custom metadata is for annotating queues and does not influence message processing order.

C. Azure Service Bus queues with partitioning enabled: Partitioning is a feature for increasing throughput and availability by distributing the queue across multiple message brokers. It can disrupt strict ordering unless used in conjunction with sessions.

D. storage queues with a stored access policy: A stored access policy is a security mechanism for managing access permissions via Shared Access Signatures (SAS) and has no impact on the message delivery order.


References

1. Microsoft Azure Documentation, "Message sessions": "To realize a FIFO guarantee in Service Bus, use sessions. Message sessions enable joint and ordered handling of unbounded sequences of related messages." (Section: "Message sessions", Paragraph 1).

2. Microsoft Azure Documentation, "Storage queues and Service Bus queues - compared and contrasted": "Service Bus sessions enable you to process messages in a first-in, first-out (FIFO) manner... Azure Storage Queues don't natively support FIFO ordering." (Section: "Feature comparison", Table Row: "Ordering").

3. Microsoft Azure Documentation, "Partitioned messaging entities": "When a client sends a message to a partitioned queue or topic, Service Bus checks for the presence of a partition key. If it finds one, it selects the partition based on that key... If a partition key isn't specified but a session ID is, Service Bus uses the session ID as the partition key." This highlights that partitioning alone doesn't guarantee order; it's the session ID that ensures related messages land on the same partition to maintain order. (Section: "Use of partition keys").

Total Questions: 319
Last Updated: October 03, 2025
