1. Microsoft Learn, "The six principles for responsible AI": Under the "Inclusiveness" section, it states, "AI systems should empower everyone and engage people. When we design these systems, we need to address a broad range of human needs and experiences. Inclusive design practices can help developers understand and address potential barriers that could unintentionally exclude people." Adding a voice interface is a direct application of this principle.
Source: Microsoft Learn, Module: "Introduction to responsible AI", Unit: "What are the guiding principles for responsible AI?".
2. Microsoft Learn, "About Direct Line Speech": "Direct Line Speech is a robust, end-to-end solution for creating a flexible, extensible voice assistant... It's powered by the Bot Framework and its Direct Line Speech channel, which is optimized for voice-in, voice-out interaction with bots." This documentation confirms that Direct Line Speech is the specific technology for adding voice accessibility.
Source: Microsoft Learn, Azure Bot Service Documentation, "Direct Line Speech overview".
3. Microsoft Learn, "Add authentication to a bot": "Often a bot must access secured resources on behalf of the user... To do this, the bot must be authorized to access these resources based on the user's credentials." This confirms that authentication serves security and authorization, not inclusiveness.
Source: Microsoft Learn, Azure Bot Service Documentation, "Add authentication to a bot in Bot Framework SDK".
4. Microsoft Learn, "Active learning": "Active learning is the process of selecting the new utterances that are unclear and adding them to your training data. This helps the model become more accurate." This defines active learning as a model-improvement technique tied to the reliability principle rather than inclusiveness.
Source: Microsoft Learn, Azure AI services Documentation, "Active learning".
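The selection step that the active-learning quote describes, picking out utterances the model is unsure about so they can be reviewed and added to the training data, can be illustrated with a minimal sketch. This is not the Azure service's API; the function name, thresholds, and data shapes below are assumptions chosen for illustration. A common heuristic flags an utterance when its top intent score is low, or when the top two intents score nearly the same:

```python
# Illustrative sketch only (not the Azure Language service API):
# flag utterances whose top-intent confidence is low, or whose top two
# intents are nearly tied, as candidates for human review and retraining.
# Threshold values below are assumptions, not documented defaults.

def select_unclear_utterances(predictions, low_confidence=0.5, small_margin=0.1):
    """predictions: list of (utterance, {intent_name: score}) pairs."""
    unclear = []
    for utterance, scores in predictions:
        ranked = sorted(scores.values(), reverse=True)
        top = ranked[0]
        # Margin between the best and second-best intent; a lone intent
        # is judged on its absolute confidence only.
        margin = top - ranked[1] if len(ranked) > 1 else top
        if top < low_confidence or margin < small_margin:
            unclear.append(utterance)
    return unclear

preds = [
    ("book a flight to Paris", {"BookFlight": 0.92, "CancelBooking": 0.03}),
    ("erm change it",          {"BookFlight": 0.41, "CancelBooking": 0.38}),
]
print(select_unclear_utterances(preds))  # -> ['erm change it']
```

The first utterance is confidently classified and skipped; the second is both low-confidence and a near-tie, so it is surfaced for labeling, which is exactly the "unclear utterances" feedback loop the documentation describes.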