Q: 18
CASE STUDY
Please use the following to answer the next question:
XYZ Corp., a premier payroll services company that employs thousands of people globally, is
embarking on a new hiring campaign and wants to implement policies and procedures to identify
and retain the best talent. The new talent will help the company's product team expand its payroll
offerings to companies in the healthcare and transportation sectors, including in Asia.
It has become time-consuming and expensive for HR to review all resumes, and they are concerned
that human reviewers might be susceptible to bias.
To address these concerns, the company is considering using a third-party AI tool to screen resumes
and assist with hiring. They have been talking to several vendors about possibly obtaining a third-
party AI-enabled hiring solution, as long as it would achieve its goals and comply with all applicable
laws.
The organization has a large procurement team that is responsible for the contracting of technology
solutions. One of the procurement team's goals is to reduce costs, and it often prefers lower-cost
solutions. Others within the company are responsible for integrating and deploying technology
solutions into the organization's operations in a responsible, cost-effective manner.
The organization is aware of the risks presented by AI hiring tools and wants to mitigate them. It also
questions how best to organize and train its existing personnel to use the AI hiring tool responsibly.
Their concerns are heightened by the fact that relevant laws vary across jurisdictions and continue to
change.
If XYZ does not deploy and use the AI hiring tool responsibly in the United States, its liability would
likely increase under all of the following laws EXCEPT?
Options
Discussion
B, since XYZ is using the tool, not building it. Product liability would generally hit the vendor, not the end user. The others (A, C, D) definitely apply if XYZ mishandles things. Anyone disagree?
B
It's B, since product liability mainly hits the vendor who built the tool, not XYZ as the user. The end user is far more at risk under discrimination or privacy laws if things are misused. Saw similar logic in some official practice questions, but willing to hear other views in case I missed something.
Honestly, I think the key is that product liability laws are more about the vendor’s responsibility. If XYZ misuses the AI tool, they'd run into trouble with discrimination or privacy laws instead. Pretty sure B is correct here, but open to hearing otherwise.
Not A, B. Love how clear they made this scenario.