We are excited to welcome ECC members back and kick off the new term with a social on 2nd October at 6pm at The Castle Inn (upstairs). It’s a great opportunity to meet other ECRs working on AI across departments, make friends, and chat about the latest AI research. Clare Heinbaugh and Abdullah Hassan Safir will be presenting their newest work in a 10-minute flash-talk format. More information about the speakers and their talks can be found below.
Imagining a Pluriversal Future in AI – Abdullah Hassan Safir
In this talk I will highlight a recent intervention: a digital archive and a planned exhibition at BritCHI 2025 in Cardiff, Wales. The archive results from a previously conducted human-AI interaction experiment in which I tried to reproduce selected artworks by a Bangladeshi artist through systematic prompting techniques using a text-to-image (T2I) AI tool. Through the exhibition, we intend to spark thought-provoking conversations about how current HCI research on AI art-making practices marginalises Global South perspectives. Through this interactive effort, I want to highlight that the design and functioning of T2I generative AI tools should not be considered merely technical feats but also cultural artifacts carrying the residues of colonial thought. Such decolonial and pluriversal design thinking around AI art-making will help include the diverse voices and visions of artists worldwide within the realm of such practices.
Tactical Control in 3D via Gesture Disambiguation, Speech, and Runtime Code Generation – Clare Heinbaugh
Shared control methods allow a user and a system to work in tandem to accomplish tasks. For 3D tasks, hand gestures are a natural choice for representing 3D primitives such as points, axes, angles, and objects in mixed reality applications, while large language models can infer user intent and enable runtime code generation to define novel interactions on the fly. In this talk, Clare will discuss GestCoGen, a novel method that uses gestures representing 3D primitives to augment speech commands, combined with runtime code generation to define application-agnostic commands on the fly. GestCoGen enables flexible commanding and promotes high-level tactical control. In a user study, she compared GestCoGen to a traditional menu-based system in a series of 3D modeling tasks. With its ability to define new commands at runtime, she will explain how GestCoGen can promote tactical control in a shared control system, which is vital to investigate as LLMs and other machine-learning-enabled systems become more ubiquitous.
Speakers
Abdullah Hassan Safir
Abdullah Safir is an AI ethics and critical design researcher interested in developing responsible, inclusive AI systems that respond to Majority World/Global South contexts. He is currently working towards his PhD in Interdisciplinary (AI) Design at the University of Cambridge. He is a member of the Cambridge Collective Intelligence and Design (CamCID) Group (led by Dr Ramit Debnath) and the Critical Design Studio at the Computer Lab (led by Professor Alan Blackwell). Safir also serves as an early career community member at the Centre for Human-Inspired AI (CHIA) and contributes to an AI@Cam project on ethically rooted AI for public value. Safir’s PhD research is generously funded by the Gates Cambridge Trust.
Clare Heinbaugh
Clare is entering her second year as a PhD student in Engineering at the University of Cambridge, supervised by Per Ola Kristensson in the Intelligent Interactive Systems lab, where she focuses on machine learning and human-computer interaction research and applications. Her research centers on virtual reality and gestures as input to interactive systems.