Working Groups

As an interdisciplinary and collaborative group, our working groups and reading groups are essential for expanding our knowledge, keeping up to date with recent publications, and revisiting key theory to understand our current position.

Details on the objectives of the Working Groups and their respective reading groups can be found below.

Large Language Model Reading Group – closed group

Led by Wendy Chun and Matt Canute (September 2023 – ongoing)

Through a multi-disciplinary list of readings, the group aims to compare the recent debate driven by the popularity of large language models with similar earlier discussions that took place in the humanities among critical and cultural theorists. By understanding these earlier structuralist and post-structuralist arguments about language and meaning, we might be better able to articulate the emerging issues and implications of producing and consuming opaque, data-driven language models.

History of Math Reading Group – closed group

Led by Stephanie Dick (Fall 2023)

The group will be reading on the history of machine learning and mathematics together to familiarize ourselves with some of the techniques that are at work, where they came from, and to look for opportunities to experiment and rethink. We will investigate, specifically, how historical understanding might inform alternative mathematical approaches. We will look to the history of ventilation and sentiment analysis to ask questions like “How has this problem domain been defined?” “What variables have been deemed relevant and irrelevant here, when, by whom, and why?” “How has this problem been understood differently in other contexts, and how would the mathematics look different in tandem?” Our hope is that historical research will guide us in imagining alternative ways of working mathematically within these problem domains.

Indigenous Epistemologies Reading Group – closed group

Led by Karrmen Crey (August 2023 – ongoing)

Discussions about Indigenous engagement with information technologies, algorithmic systems, and data science are concerned with how these systems and technologies can be shaped by and support Indigenous worldviews, values, languages and relations. Such discussions invoke a host of concepts associated with Indigenous epistemologies, such as “oral traditions,” “sovereignty,” “relationality,” and “kinship,” concepts that deserve and require closer attention. This reading group will examine texts that theorize and debate these concepts to support deeper understanding of the contexts and nuances of contemporary debates surrounding Indigenous epistemologies and machine learning.

Imaginative Methods Working Group – closed

Led by Gillian Russell & Frédérik Lesage

(September 2022 – closed group; expressions of interest should be directed to gillianr@sfu.ca)

This working group is a fortnightly gathering of graduate students from various schools who are employing, or aim to employ, imaginative methods as part of their research. Recent scholarship in the social sciences, humanities, art, and design has drawn attention to the politics of the imagination, arguing for the radical imagination as a key device for thinking and acting in times of crisis (Escobar, 2018; Haiven, 2014; Keeling, 2019; Khasnabish & Haiven, 2017). These scholars and practitioners define the radical imagination as a way to “reimagine the imagination,” and as a collective process that holds the potential to animate new ways of perceiving and thinking the world. By bringing together the radical and the imagination, they suggest a methodological commitment to the transformation of reality. However, while calls for the imagination abound, to date there are few resources to help researchers incorporate this collective process into their own research practices. This group sets out to explore, assess, and analyze various methodological approaches for animating the radical imagination through method.

Experimental Algorithmic Futures Working Group – closed

This working group will take on a fundamental flaw within current predictive algorithmic systems—their reduction of the future to the past—through cross-disciplinary collaboration. In particular, we will focus on redressing current limitations of content moderation algorithms for online abuse by: 1) determining the technical grounds by which these systems unfairly target the communities they are supposed to protect; 2) devising and imagining alternative socio-technical systems for countering abusive language and understanding online conflict in collaboration with the members of the Intersectional Technology Project and Night School for Data Fluencies and through the Data Fluencies Theatre Project; and 3) producing criteria for evaluating these systems that assess how they can be deployed to help usher in just futures, as imagined by Indigenous and Afro-futurisms. This will entail some innovative technical work, especially in terms of fine-tuning language models to better support the practices of targeted communities, but the most pressing and difficult task will be devising criteria for how, if, and when these systems should be used, to what end, and in combination with what other methods and actions.


Pollution: Elemental Media / Weapon of War – closed

Led by Svitlana Matviyenko with Rahul Mukherjee

This working group will bring together the notions of elemental media with the topic of pollution, and war pollution in particular. How does thinking media elementally help us understand pollution as a weapon of war?