Mutual trust and understanding are essential to preventing harmful friction in any relationship. When it comes to the AI-driven systems that are now deeply embedded in our lives, this trust and understanding can be sorely lacking. Gaps between how these systems work and how people imagine they do, and how people work and how systems imagine they do, cause friction and distrust. In turn, this friction and distrust prevent us from building productive, mutually beneficial relationships between users and systems, relationships that would allow us to realize the full potential benefit of these systems. I work to close these gaps between people and systems by developing new theoretical frameworks for sociotechnical system design that consider how both systems and users understand and symbiotically adapt to each other.

I work at the intersection of technology and social science, drawing on Human-Centered Computing (HCC), Computer Supported Cooperative Work (CSCW), Computer Mediated Communication (CMC), Social Computing, and Cognitive Psychology. I take a qualitative, participatory approach that focuses on lived experience and perception to examine breakdowns in the relationship between people and AI-driven systems. I adopt a sociotechnical perspective on the relationship between users and AI systems to find ways to reduce misunderstandings, build trust, and promote mutually beneficial adaptation. I take a transfeminist design stance, so my work recognizes the dual necessity of centering marginalized communities while also creating systems that support all users, and the importance of embracing the complexity of relationships to this goal.

Across my work, I focus on marginalized users, often working with communities I am a member of, such as the queer and trans community. Marginalized users experience the most extreme consequences of breakdowns in the person/AI system relationship, and have the most experience in trying to adapt around them. Working with these communities allows me to contribute both theory and generalizable design implications that can improve the person/AI system relationship broadly, as well as specific, community-based knowledge that helps us better understand how these systems fail marginalized users and provide immediate relief to those who need it the most.

Right now, my research program has two main areas, though these areas increasingly share methods and contribute to each other’s development.

Folk Theorization and Algorithmic Literacy for Human-AI Collaboration

I work to close the gap in people’s understandings of platforms by developing theoretical approaches such as folk theorization that capture not only user understanding, but the user’s emergent, adaptive relationship with the system. In turn, I use these theoretical frameworks to pursue goals such as boosting algorithmic literacy, or the capacity of users to understand and use AI-driven systems to accomplish their goals. These frameworks for Human-AI collaboration and algorithmic literacy will be crucial for a successful future for AI in our social systems.

Community-Based Member Research to Empower Marginalized LGBTQ+ Users

I work to close the gap in platforms’ understandings of people by developing innovative online qualitative methods which empower communities to work together to express points of friction and distrust with systems and propose solutions. I embrace my positionality as a member-researcher with a deep knowledge of both social computing research and my own marginalized communities. In turn, I enable others to do the same by assembling, securing funding for, and leading teams of LGBTQ+ junior researchers.

Latest Peer-Reviewed, Archival Publications

“I See Me Here”: Mental Health Content, Community, and Algorithmic Curation on TikTok

Ashlee Milton, Leah Ajmani, Michael Ann DeVito, and Stevie Chancellor. 2023. “I See Me Here”: Mental Health Content, Community, and Algorithmic Curation on TikTok. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 17 pages.

How Transfeminine TikTok Creators Navigate the Algorithmic Trap of Visibility Via Folk Theorization

Michael Ann DeVito. 2022. How Transfeminine TikTok Creators Navigate the Algorithmic Trap of Visibility Via Folk Theorization. Proceedings of the ACM on Human-Computer Interaction, Vol. 6, CSCW2, Article 380 (November 2022), 31 pages.

“Do You Ladies Relate?”: Experiences of Gender Diverse People in Online Eating Disorder Communities

Jessica L. Feuston, Michael Ann DeVito, Morgan Klaus Scheuerman, Katy Weathington, Marianna Benitez, Bianca Z. Perez, Lucy Sondheim, and Jed R. Brubaker. 2022. “Do You Ladies Relate?”: Experiences of Gender Diverse People in Online Eating Disorder Communities. Proceedings of the ACM on Human-Computer Interaction, Vol. 6, CSCW2, Article 420 (November 2022), 32 pages.

Adaptive Folk Theorization as a Path to Algorithmic Literacy on Changing Platforms

Michael Ann DeVito. 2021. Adaptive Folk Theorization as a Path to Algorithmic Literacy on Changing Platforms. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW2, Article 339.

Values (Mis)alignment: Exploring Tensions Between Platform and LGBTQ+ Community Design Values

Michael Ann DeVito, Ashley Marie Walker, and Julia R. Fernandez. 2021. Values (Mis)alignment: Exploring Tensions Between Platform and LGBTQ+ Community Design Values. Proceedings of the ACM on Human-Computer Interaction, Vol. 5, CSCW1, Article 88.