How (Not) to Open Up the AI Black Box
HKU AI & Law Lecture Series
Date: June 15, 2023 (Thursday)
Time: 6:30pm – 7:30pm
Venue: Academic Conference Room, 11/F Cheng Yu Tung Tower, The University of Hong Kong
Speaker: Dr. Boris Babic, Assistant Professor, Faculty of Arts & Science, University of Toronto
There is an increasingly strong consensus that so-called “black box” AI decision-making undermines many important values traditionally associated with legality and the rule of law. This is especially true when legal decisions are automated, such as those pertaining to bail and parole, but it also holds in less formal contexts, such as medical resource allocation. This lecture will investigate what makes a decision-making algorithm opaque and evaluate several methods for increasing algorithmic transparency. We will also consider some more philosophical questions, such as whether it is meaningful to talk about the “reasons” why algorithms act, and whether it makes sense to speak of an algorithm’s “intent.”
Dr. Boris Babic is an Assistant Professor at the University of Toronto, where he holds a joint appointment in the Department of Statistical Sciences and the Department of Philosophy. He is also a faculty fellow of the Schwartz Reisman Institute for Technology and Society, and a visiting assistant professor of Decision Sciences at INSEAD. He received a PhD in Philosophy and an MSc in Statistics from the University of Michigan, Ann Arbor, and a JD from Harvard Law School. He completed a postdoctoral fellowship at the California Institute of Technology (Caltech). His primary research interests are in the legal, ethical, and policy dimensions of artificial intelligence and machine learning.
Chair: Professor Hualing Fu, Dean and Warren Chan Professor in Human Rights and Responsibilities, Faculty of Law, The University of Hong Kong
All are welcome! To register, please go to https://hkuems1.hku.hk/hkuems/ec_regform.aspx?guest=Y&UEID=88335.
For inquiries, please contact Ms. Grace Chan at firstname.lastname@example.org or 3917 4727.