Usman Gohar


**Looking for summer 2025 research internships**

Usman Gohar is a PhD student and F. Wendell Scholar in Computer Science at Iowa State University, researching at the intersection of machine learning and software engineering, with an emphasis on operationalizing software and AI safety, algorithmic fairness, and harm mitigation. He has published research papers in top SE, ML, and AI ethics venues, served as a peer reviewer for several of them, and organized academic workshops. Usman also develops systematic methods for identifying and evaluating safety risks (including bias and harm) in data-driven software (ML/AI) and in drone systems.

Usman is advised by Dr. Robyn Lutz and is a member of the Laboratory for Software Safety.

Previously, he worked as a Data Scientist across sectors including agriculture, manufacturing, and power systems, specializing in forecasting, predictive analytics, and model deployment.

News

Sep 2024 Excited to announce that I am co-organizing our NeurIPS 2024 workshop, “Evaluating Evaluations: Examining Best Practices for Measuring Broader Impacts of Generative AI” (EvalEval 2024), with the brilliant Irene Solaiman and Zeerak Talat at Hugging Face. The call for papers will be out soon. See you at NeurIPS!
Sep 2024 Invited to be on the Program Committee for AAAI 2025!
Aug 2024 My work “CoDefeater: Using LLMs To Find Defeaters in Assurance Cases” has been accepted at ASE (NIER) 2024! We evaluate using LLMs to assist in red-teaming by identifying defeaters and simulating diverse failure modes. See you in Sacramento!
Jul 2024 Invited to be part of ICSE 2025 Shadow Program Committee!
Jul 2024 Excited to announce that our paper “Evaluating the Social Impact of Generative AI Systems in Systems and Society,” with Irene Solaiman, Zeerak Talat, and other fantastic researchers, has been accepted to appear as a book chapter in Hacker, Engel, Hammer, Mittelstadt (eds.), Oxford Handbook on the Foundations and Regulation of Generative AI, Oxford University Press.
May 2024 Invited to be on the Program Committee for AIES 2024!
Apr 2024 Our work, “A Family-Based Approach to Safety Cases for Controlled Airspaces in Small Uncrewed Aerial Systems” has been accepted at AIAA’24!
Mar 2024 Invited to be an Ethics Reviewer for ICML 2024!
Mar 2024 Invited to be on the Program Committee for TrustNLP: Fourth Workshop on Trustworthy Natural Language Processing at NAACL’24!
Mar 2024 Invited to be on the Committee for Artifact Evaluation for ISSTA 2024!
Mar 2024 Invited to be an Ethics Reviewer for COLM 2024!
Mar 2024 Invited as a reviewer for CHI and CSCW 2024!
Feb 2024 Presenting a poster of our work at Imagine Aviation 2024!
Feb 2024 Invited to be on the Program Committee for IJCAI’24 Survey Track!
Jan 2024 Invited to be Volunteer Co-Chair for FAccT 2024!
Dec 2023 Our paper on fairness and equity in engineering automated software systems for drones has been accepted at ICSE 2024!
Aug 2023 Invited to be on the Program Committee for AAAI 2024!
Jul 2023 Invited to be an Ethics Reviewer for NeurIPS 2023!
Jul 2023 Our work on intersectional fairness has been featured by the Montreal AI Ethics Institute.
Jun 2023 I will be giving a talk on fairness in ML, specifically fairness in ensembles, at Data Tech 2023 in Minneapolis, MN, on June 9th.
May 2023 This summer, I will be working at Seagate as a Ph.D. Data Science Intern!
May 2023 I will be a volunteer at FAccT 2023. See you in Chicago!
Apr 2023 Our work on Intersectional Fairness has been accepted at IJCAI’23 Survey Track!
Feb 2023 Our ICSE’23 paper’s artifact has been accepted and received two badges!
Dec 2022 Our work titled “Towards Understanding Fairness and its Composition in Ensemble Machine Learning” has been accepted at ICSE’23!
Nov 2022 Nominated for the Midwest Teaching Excellence Award!
May 2022 Received the Teaching Excellence Award from Iowa State University!
Apr 2022 Organized the Women in Data Science (WiDS) event at ISU and served as an ambassador of WiDS Global, Stanford University.
Jun 2021 I will be volunteering for FAccT’21 (South Korea - Virtual).
Aug 2020 Awarded ACM travel grant and registration waiver for KDD 2020!