1. NIST AI Risk Management Framework (AI RMF 1.0): The "Govern" function of the framework emphasizes establishing policies and procedures for third-party AI systems. Specifically, section 4.2.3, "Third Party Risk Management," highlights the need to "understand and manage the risks associated with third-party AI actors and entities across the AI lifecycle." This directly implies the need for clear terms on updates and support. (Source: NIST, AI RMF 1.0, January 2023, p. 21.)
2. Academic Publication on AI Governance: In "A governance framework for the application of AI in an enterprise context," the authors discuss the importance of lifecycle management. They state: "The AI lifecycle does not end with deployment... Continuous monitoring of the model’s performance is necessary to detect model drift... and trigger retraining or replacement." This underscores the necessity of pre-defined vendor responsibilities for updates; a minimal sketch of such a monitoring loop appears after this list. (Source: Wirtz, B. W., Weyerer, J. C., & Geyer, C. (2019). Journal of Business Research, 98, 263-272, Section 4.3. DOI: https://doi.org/10.1016/j.jbusres.2019.01.032.)
3. University Courseware on AI Risk: Course materials on managing AI systems often differentiate them from traditional IT. The need for "continuous validation" and for managing "technical debt" in machine learning systems is a recurring theme, and it relates directly to the vendor's role in providing updates and support to prevent model degradation. (Source: Based on concepts taught in courses such as Stanford's CS229: Machine Learning, which covers the practical lifecycle of ML models.)
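For concreteness, the "monitor, detect drift, trigger retraining" cycle quoted in item 2 can be pictured as a small control loop. The sketch below is purely illustrative and is not drawn from any of the cited sources: the `DriftMonitor` class, the 100-prediction window, and the 5-point degradation threshold are assumptions chosen for the example.

```python
# Hypothetical sketch of the monitoring loop described in the quoted passage:
# track a model's live accuracy against a deployment-time baseline and flag
# when degradation exceeds an agreed tolerance (the kind of trigger a vendor
# agreement on updates and support would need to define).

from dataclasses import dataclass, field
from typing import List


@dataclass
class DriftMonitor:
    baseline_accuracy: float          # accuracy measured at acceptance/deployment
    degradation_threshold: float      # contractual tolerance, e.g. 0.05 (5 points)
    window: int = 100                 # number of recent predictions to evaluate
    _outcomes: List[bool] = field(default_factory=list)

    def record(self, prediction, ground_truth) -> None:
        """Record whether a live prediction matched the eventual ground truth."""
        self._outcomes.append(prediction == ground_truth)
        if len(self._outcomes) > self.window:
            self._outcomes.pop(0)

    def drift_detected(self) -> bool:
        """True once rolling accuracy falls below baseline minus the threshold."""
        if len(self._outcomes) < self.window:
            return False  # not enough live data yet
        rolling_accuracy = sum(self._outcomes) / len(self._outcomes)
        return rolling_accuracy < self.baseline_accuracy - self.degradation_threshold


if __name__ == "__main__":
    monitor = DriftMonitor(baseline_accuracy=0.92, degradation_threshold=0.05)
    # Simulated stream: the model starts matching ground truth ~9 times in 10,
    # then degrades; in production these would come from live traffic.
    simulated = [(i % 10 != 0) for i in range(60)] + [(i % 2 == 0) for i in range(60)]
    for ok in simulated:
        monitor.record(ok, True)
        if monitor.drift_detected():
            print("Drift detected: escalate to vendor for retraining or replacement.")
            break
```

In a vendor agreement, the baseline, the degradation threshold, and the escalation step that fires when drift is detected are precisely the terms that need to be defined up front.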