The remaining 6 videos from the University of San Francisco Center for Applied Data Ethics Tech Policy Workshop are now available. This workshop was held in November 2019, which seems like a lifetime ago, yet the themes of tech ethics and responsible government use of technology remain incredibly relevant, particularly as governments are considering controversial new uses of technology for tracking or addressing the pandemic.
You can go straight to the videos here, or read more below:
- Hypervisibilizing the unseen: Dominant narratives, smart cities and race-blind tech policies, Tawana Petty
- Fairness, Accountability, & Transparency: Lessons from predictive models in criminal justice, Kristian Lum
- Diverse Faces, Diverse Lenses: Applied ethics and facial recognition research, Irina Raicu
- Panel on Local Government, Linda Gerull, Brian Hofer, Lee Hepner, and Heather Patterson
- Deconstructing the Surveillance State, Rumman Chowdhury
- Law and Data Driven Innovation, Deven Desai
And be sure to check out the full playlist of workshop videos here!
Hypervisibilizing the unseen: Dominant narratives, smart cities and race-blind tech policies
If you teach the world to fear the other, individualization and hyper-surveillance become inevitable. Detroit is an incredible example of how the power of propaganda became a toolkit for race-blind policies with racist consequences, data and tech misuse, digital surveillance, and the dangerous conflation of safety and security. “Be Afraid, Be Very Afraid” is no longer just the tagline from a classic movie; it has become a mantra for being and an excuse not to see each other. Tawana Petty is Director of the Data Justice Program for the Detroit Community Technology Project and co-leads Our Data Bodies, a five-person team concerned about the ways our communities’ digital information is collected, stored, and shared by government and corporations. Watch her talk here:
Fairness, Accountability, and Transparency: Lessons from predictive models in criminal justice
The related topics of fairness, accountability, and transparency in predictive modeling have seen increased attention over the last several years. One application area where these topics are particularly important is criminal justice. In this talk, Dr. Lum gives an overview of her work in this area, spanning a critical look at predictive policing algorithms, the role of police discretion in pre-trial risk assessment models, and a look behind the scenes at how risk assessment models are created in practice. Through these examples, she demonstrates the importance of each of these concepts in predictive modeling in general and in the criminal justice system in particular. Kristian Lum is an assistant research professor at Penn CIS and the lead statistician at the Human Rights Data Analysis Group (HRDAG), where she leads the HRDAG project on criminal justice in the United States. Watch her talk here:
Diverse Faces, Diverse Lenses: Applied ethics and facial recognition research
Ethical issues are like birds: they are pervasive, varied, and often go unnoticed (especially by those not trained to identify them). Ethical “lenses” (or approaches) can help us see them. This presentation will introduce the Markkula Center for Applied Ethics Framework for Ethical Decision-Making, which features five ethical lenses. Attendees will then work together to apply those lenses to a case study that reflects the complexity of ethical decisions faced by practitioners who work with data. Irina Raicu is Director of the Internet Ethics Program at the Markkula Center for Applied Ethics.
Watch her talk here:
Panel on Local Government
There are significant challenges to creating informed tech policy, including: the diverse range of stakeholders involved, the misalignment of Silicon Valley incentives with reflective policy making, binary modes of thinking, the fact that municipalities are often an afterthought to tech companies, the gap between intended and actual use, and more. Our panel on local government had a lively and informative discussion. Panelists included:
- Linda Gerull, CIO of City and County of San Francisco
- Brian Hofer, founder of the non-profit Secure Justice whose work drafting SF’s ban on facial recognition was covered in the New York Times
- Lee Hepner, an attorney and legislative aide who worked on SF’s facial recognition ban
- Heather Patterson, privacy researcher at Intel and member of the Oakland Privacy Advisory Commission
Watch the panel here:
Deconstructing the Surveillance State
The smart city has become co-opted by an exclusionary narrative that enables a surveillance state. In this talk, Dr. Chowdhury presents the global imperative to deconstruct the current surveillance state by illustrating already-existing harms. In its place, she shares a vision of Digital Urban Design, which presents a community-driven and collaborative smart city. A work in progress, the goal of digital urban design is to evolve the field of urban design to merge the digital and analog fabrics in a way that impacts and improves the lives of citizens. Rumman Chowdhury is the Global Lead for Responsible AI at Accenture Applied Intelligence, where she works with C-suite clients to create cutting-edge technical solutions for ethical, explainable, and transparent AI. Watch her talk here:
Law and Data Driven Innovation
Data-driven innovation has fueled Silicon Valley and beyond since the first dotcom bubble, and it still has the potential to drive incredible outcomes. Yet the days of deference to companies because of promised innovation and wealth creation seem to be over. This talk looks at where we came from and how changes in the law reveal dissatisfaction with innovation narratives. Even so, the talk offers a way to use data and software to build trust and success going forward. Deven Desai is an associate professor in the faculty of the Law and Ethics Program at Georgia Tech’s Scheller College of Business. Watch his talk here:
More CADE Videos
You can check out the full playlist of videos from the CADE Tech Policy Workshop here!
Be sure to read these posts with other videos and material from CADE events: