Shiran Dudy


Hi, I’m Shiran — I research the risks AI systems pose to equity and access, and I design AI tools that center the people they’re meant to serve.

I’m a Research Scientist at Northeastern University’s Institute for Experiential AI, where I specialize in Responsible AI, specifically by auditing AI systems as socio-technical systems (with their societal impact in mind) and proposing guardrails to protect end users. I’m also building and experimenting with AI tools that strengthen and empower users to enhance our democracy.

What I do

Guardrailing AI: I audit AI systems to uncover what they get wrong—and who gets left out. That means examining how generative AI represents (or misrepresents) non-dominant cultures, evaluating how AI-powered search shapes access to real-world opportunities, and measuring the community impact of AI-driven services like Uber in collaboration with Eticas.

Strengthening Democracy: I also lead the Tech Policy Tracker, making federal and state AI policy accessible to everyday people—because these decisions affect all of us, and everyone deserves a seat at the table. In addition, I have employed participatory approaches using tools such as Pol.is to promote more equitable governance in communities.

As the technical lead for Responsible AI consulting at EAI, I help companies build AI that’s safe, robust, and equitable.

Email: shirdu2 at gmail dot com

news

Apr 24, 2026 We got our paper accepted to FAccT 2026! “‘Taking Stock at FAccT’: Using Participatory Design to Co-Create a Vision for the Fairness, Accountability and Transparency Community” 🎉
Feb 6, 2026 It was a pleasure to speak at the Future of Science Seminar, where I led a conversation on “Navigating Risk - From Awareness to Accountability.” We examined the multi-tiered harms of AI systems across individuals, communities, and society at large, as well as inspiring accountability initiatives from Consumer Reports, Eticas, Radical Exchange, and DataMined, sparking important discussions on new mechanisms to hold AI companies accountable 🛡️⚖️
Oct 7, 2025 I had a really great time at the Notre Dame RISE AI summit. I gave a talk about how many of our epistemic systems promote a single-view approach, and why promoting plurality in search systems (as well as LLMs) may offer an antidote to this concern. The world is complex, and current personalization techniques are narrowing our understanding of it 🔍👁️👁️
Aug 21, 2025 It was a pleasure to take part in the Responsible AI workshop series at EAI, where I led a discussion on how well commercial LLMs connect us with real-world opportunities. I presented a recent study we conducted showing that responses across state-of-the-art LLMs skew toward wealthier and more educated populations, making those responses less relevant to other life experiences 📊🌍
Jul 11, 2025 I had a great time at FAccT this year! I joined a CRAFT panel on community-led audits and shared a study (with Eticas) auditing Uber’s services with the Roma community. I also co-hosted a participatory design session using pol.is to explore how we imagine the future of the FAccT community together.

latest posts

selected publications

  1. Unequal Opportunities: Examining the Bias in Geographical Recommendations by Large Language Models
    Shiran Dudy, Thulasi Tholeti, Resmi Ramachandranpillai, and 3 more authors
    In Proceedings of the 30th International Conference on Intelligent User Interfaces, 2025
  2. Analyzing Cultural Representations of Emotions in LLMs Through Mixed Emotion Survey
    Shiran Dudy, Ibrahim Said Ahmad, Ryoko Kitajima, and 1 more author
    In 2024 12th International Conference on Affective Computing and Intelligent Interaction (ACII), 2024