UCOP AI risk assessments help safeguard data

Artificial intelligence is revolutionizing the way we work, making tasks easier and providing valuable insights. But while many new AI tools and features are exciting, some programs can cause security risks, undermine trust or raise ethical concerns.

Before you use a new AI tool or enable a new AI feature on already licensed software, make sure it has been approved through a UCOP AI risk assessment.

What’s an AI risk assessment?

UCOP Information Security, Legal, and Privacy teams thoroughly review the AI tool’s intended uses, including any sensitive information it may handle. They then assess whether the vendor’s contractual terms and technical safeguards provide adequate protection for our data.

Why do we need a risk assessment?

At UCOP, we perform sensitive tasks that impact thousands of students, patients, employees, retirees and other members of the UC community. We must use AI tools in accordance with the UC Responsible AI Principles to limit potential risks, including:

  1. Security risks: Without proper oversight, AI tools can have security gaps, making them vulnerable to data breaches and cyberattacks.
  2. Accuracy, reliability and safety: Unreliable or inaccurate systems can undermine trust and cause unexpected or undetected harm.
  3. Ethical concerns: Biases in AI algorithms and in the data used to train AI models can lead to unfair and even discriminatory results. Vendors should address these concerns through transparency and continuous monitoring of their products to promote equity and fairness.

Sections 14 and 15 of UC’s Electronic Information Security policy provide more details on the requirements for acquiring software and working with vendors.

How can I request an AI risk assessment?

Email the ITS Service Desk with “AI Risk” in the subject line. Include details about the AI tool or feature you would like to use.

How can I learn more about using AI at UCOP?

Read our December 2024 update for more details on AI tools that are currently available at UCOP, ongoing AI initiatives at UCOP and systemwide AI resources.

Editor’s note: This article was contributed by April Sather, chief information security officer, UCOP; Sajjad Matin, principal counsel, cybersecurity, UC Legal; and Al Lavassani, privacy officer, UC Ethics, Compliance and Audit Services.
