Mastodon: Research Symposium and Tool Exploration Workshop
On 22 and 23 June, the Centre for Interdisciplinary Methodologies, in collaboration with the Centre for Digital Inquiry and the Sustainable Cities GRP, will hold the Mastodon: Research Symposium and Tool Exploration Workshop, a hybrid event featuring 18 talks from researchers and activists from more than 10 countries, and a guest lecture from Robert W. Gehl.
Key information:
- Registration link (online)
- Schedule
- Public folder with slides
- Pre-recorded talks
Organizing Committee
- Carlos Cámara-Menoyo (Senior Research Software Engineer)
- Nathaniel Tkacz (Reader)
- Fangzhou Zhang (PhD Candidate)
- Emellyne Forman (GRP Programme Administrator)
Day 1: Talks
The Road to AOIR.social: A critical genealogy
Robert W. Gehl
Just days after Elon Musk finalized his purchase of Twitter in 2022, the Association of Internet Researchers (AOIR) met in Dublin for their annual conference. One of the darlings of the conference was Mastodon, the Twitter alternative, which was attracting many new users in the wake of Musk's takeover of Twitter. The AOIR executives decided to take the plunge and host a Mastodon instance for members of the organization, and I was asked to be the admin of the new instance. Drawing on critical genealogical methods and interviews I'm conducting for a book about the fediverse, this presentation considers the historical events that made AOIR.social possible -- and that reveal the dangers this AOIR.social experiment faces. First, I will argue that AOIR.social was made possible by the struggles of queer, trans, and Black developers and admins, who developed hard-won knowledge about content moderation in covenantal federated systems. If AOIR.social does not learn from this history of struggle, it will erase the work of marginalized people and deny AOIR's
own stated goals of decolonizing the Internet. Second, AOIR.social also operates in the shadow of 20 years of corporate social media history. This includes 20 years of research methods developed in an often antagonistic relationship with corporate social media -- methods that may be inappropriate and unethical on the fediverse. In fact, many fediverse members are on the fediverse precisely because of concerns over research practices in corporate social media. Given that many of these methods were developed by AOIR members, if AOIR.social is seen as promoting unethical research on the fediverse, it runs the risk of being defederated from the rest of the fediverse and thus failing. Whether or not AOIR.social can succeed will depend upon our ability to learn from the fediverse's history.
Visit the Schedule for time slots and bios.
Pre-recorded talk (Video) (password: $1VsrEte)
Emerging forms of digital communication are further facilitating rapid data circulation, collection, storage and processing. Interactions can consequently be unpredictable. This creates a 'darker turn' in neo-communicative practices. The term 'dark' will be used in this presentation to refer to communication that has limited distribution, is not open to all users (closed groups, by way of example; Gehl 2019; 2018), and is veiled. I will present the conceptual framing of 'dark' I developed through an interpretation of intimacy, for the ways emerging dark social spaces might clarify particular communication that would otherwise be inaccessible online. A definition of dark can first be underpinned by a technical description that relates to access, whereby information is: (a) hidden by encryption; (b) hidden through regulatory constraints for select access; (c) hidden because information is closed off to select groups, accessible to individuals by invitation only; and (d) 'dark' through access to the required telecommunications services themselves. The provocation herein is that there are three key socio-technological concerns connected to the big social media services that have helped increase the tendency for users to actively seek alternative spaces to network. These are: digital privacy and (perceived loss of) agency; platforms monetizing user data by selling it to advertisers; and negative affects associated with mainstream social media engagement. Dark social spaces are, however, indistinct and require further study. This is giving rise to new work in what I call 'dark social studies'. To further explore the nature and use of dark social spaces, via post-social media, a digital (auto)ethnographic study of 'dark' social connection was undertaken. The analysis focused specifically on Mastodon over Tor (The Onion Router).
The presentation concludes that the interconnected and interactive capacity of dark social spaces facilitates user expectations for dark connection while exposing simultaneously the limitations of our intimate machines.
Key words: Mastodon, Dark Social Studies, Dark Web, Post-Social Media, Intimacy, Critical Studies in Data Culture
"
Pre-recorded talk (Video)
In a rapidly changing media ecology like the 'migration' of Twitter users to Mastodon in late 2022, it can be hard to capture people's attitudes and motivations in the moment. This paper draws on critical discourse studies (KhosraviNik & Unger 2016) to investigate the motivations and beliefs of Twitter and Mastodon users in relation to the platforms in question and to social media in general, and explores their attitudes around power, politics and practicalities of social media use in such a rapidly changing situation. We consider the extent to which, in the midst of this rapid change, the risk of losing long-established relations and communities within platform-specific networks (Hine 2017) is an important factor in users' choices around which platforms they use. We collected data from participants working in academia in November 2022 via an initial survey followed by an online focus group. Participants were asked about their own social media use, and power dynamics in relation to the Twitter migration. We analysed specific discursive strategies such as argumentation, to look at how these actors and phenomena are constructed, as well as what people think about dominance issues on social media. Initial findings suggest participants deliberately position themselves in relation to power/social media dynamics, attending both to technical details of platform affordances and wider political/ideological aspects of the 'migration' via a range of strategies. We argue that the growth of Mastodon can be seen as a grass-roots resistance practice in response to the new, over-centralised configuration of Twitter under Elon Musk.
References
Hine, C. (2017). "Ethnographies of online communities and social media: Modes, varieties, affordances". In Eynon, R., Fry, J., & Schroeder, R. (eds), The SAGE Handbook of Online Research Methods. Sage, 401-415.
KhosraviNik, M., & Unger, J. W. (2016). "Critical discourse studies and social media: Power, resistance and critique in changing media ecologies". In Methods of Critical Discourse Studies, 205-233.
The migration to decentralized platforms (that is, online platforms that are not subject to any central authority and form part of the ensemble of the so-called fediverse) has become especially relevant today. This is particularly so after the purchase of Twitter by Elon Musk and the ensuing concerns expressed publicly, typically pertaining to the right to free speech and data security. Although not the only one, the most obvious alternative to Twitter appears to have been Mastodon, generally praised for the opportunities for independent communication associated with its decentralized infrastructure. Evidence suggests that scientists might constitute an important group of Twitter users who have seen Mastodon as an alternative. Moreover, the current discussions among scientists about the (potential) benefits of decentralized platforms are not unprecedented. Rather, if we look at the history of the Internet and its platforms, we can discern discourses with similar sentiments. In this working paper, using examples from the history of the appropriation of the Internet by communities of environmental scientists, and informed by recent scholarly discussions about platform decentralization, my main aim is to offer some insight into the migration from Twitter to Mastodon. Through qualitative analysis of interview data, archives, and online material from blogs and email lists, I focus on Usenet, a decentralized and distributed communication system created in the late 1970s, and its sci.environment newsgroup. My central argument is that the migration to Mastodon, and platform migration in general, can best be understood as the result of the interaction among the different sociocultural and technological factors associated with each case, upon which different communication paradigms are based.
In 2017, Eugen Rochko - the founder of Mastodon - published a blog post in which he compares Mastodon to Twitter [1]. He writes: 'Mastodon aims to be a safer and more humane place'. This offers us valuable insight into why Mastodon is designed the way it is, and helps us situate its design within the larger context of 'toxic behavior' and 'harassment', problems that constantly plague the Internet today.
In this study, we attempt to identify design goals that drive the development of Mastodon, as well as specific features that it has incorporated in an attempt to fulfill these goals. We do this by attempting to map changes in Mastodon, _the software_, over time since its release in 2016, and what factors and stimuli prompted such changes.
We examine how effective these design features are in combating problems such as harassment, spam, trolling, and others as compared to competitors, since these are things that plague the Internet as a whole today.
Methodologically, we use primarily three strategies: 1) analysis of the Mastodon source code itself as well as associated artifacts (GitHub issues, the CHANGELOG file, etc.); 2) analysis of Mastodon blog posts and public posts made by developers; and 3) interviews with developers.
In a natural extension to this question, we also look at 'forks' and other variants of Mastodon (Glitch, Hometown, Pleroma, and others): what they do differently, and _why_. Why do instance owners choose to run these forks instead of the 'mainline' version of Mastodon?
Through this study, we hope to identify factors and decisions that might have contributed to Mastodon's relative success, which its eventual successors might wish to emulate in our move away from centrally consolidated social media (and a centrally consolidated web as a whole).
[1] https://blog.joinmastodon.org/2017/03/learning-from-twitters-mistakes/ (2017)
Recent changes in Twitter's ownership have coincided with users of the platform exploring alternatives to the microblogging site. One such alternative is the Fediverse and its most Twitter-like implementation, Mastodon. Although Mastodon and Twitter share superficial similarities, they are also remarkably different, for instance in terms of the searchability of content. This raises the question of how users perceive Mastodon's affordances, and how they relate these to those of Twitter. The present paper addresses this question through an empirical investigation of how Mastodon's features are perceived online, in particular with regard to its affordances for community-building. By means of NLP techniques, the paper compares discourse on Mastodon about Mastodon's affordances. Toots are thereby analysed from distinct vernacular communities, focusing on mentions of the fediverse, Mastodon and Twitter. These include: 1) communities that have likely been deplatformed from other social media; 2) communities seeking a safe space on Mastodon; and 3) generic mainstream communities. The deplatformed communities are identified from the federated timeline of an instance that regularly appeared on blocklists of other Mastodon instances. The safe space communities are sourced from the federated timeline of an instance that chose to defederate from the foundation instances mastodon.social and mastodon.online as a safety precaution. The generic communities are snowballed from the federated timeline of an instance on which Belgian Twitter users gathered. The source instances have been explicitly asked for permission to snowball from their federated timelines, and other precautions have been taken to operationalise anonymization of scraped messages. By thus comparing metanarratives about Mastodon's affordances across communities, we aim to contribute to our understanding of the medium's (perceived) sociality and community dynamics.
We also expect our approach to yield ethical digital methods for Mastodon research.
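The toot-filtering step described above could be sketched roughly as follows. This is a minimal illustration, not the authors' actual pipeline: the keyword list is an assumption, and the only API detail relied on is that Mastodon's public timeline endpoint (`GET /api/v1/timelines/public`) returns statuses whose text is HTML in a `content` field.

```python
import re
from html import unescape

# Platform mentions of interest (illustrative assumption, not the paper's list).
KEYWORDS = ("mastodon", "fediverse", "twitter")

def strip_html(content: str) -> str:
    """Reduce a toot's HTML `content` field to plain text."""
    return unescape(re.sub(r"<[^>]+>", " ", content))

def platform_mentions(statuses, keywords=KEYWORDS):
    """Return the subset of statuses whose text mentions any keyword.

    `statuses` is a list of dicts shaped like the Mastodon API's status
    entity, i.e. each has an HTML string under the "content" key.
    """
    hits = []
    for status in statuses:
        text = strip_html(status.get("content", "")).lower()
        if any(k in text for k in keywords):
            hits.append(status)
    return hits
```

In practice the statuses would be paged from each source instance's federated timeline (with permission, as the authors note), then passed through a filter like this before any further NLP analysis.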
360 Degrees of Proximities is a feminist artistic/social embodiment of federation as realised in the Fediverse: an ensemble of interconnected servers used for web publishing (such as social networking, microblogging and video publishing) which, while independently hosted, can communicate with each other.
Systerserver facilitates setting up two PeerTube instances, one at Ca la Dona [Barcelona] and one for Broken House [Berlin], and organizes a week-long programme together with these local communities in Berlin and Barcelona. The idea is for these PeerTube instances to be locally embedded and sustained, and federated with each other and beyond. The project is a logical step after Systerserver's 2022 efforts supported by Kunstenpunt, where they set up a PeerTube instance on their server, hosted 3 artist residencies, each one month long, and live-streamed several public events. Their PeerTube instance grew with feminist, artistic, social and accessibility-related video works and experimentation, raising the question of how to sustain and expand it.
We would like to contribute to the symposium with our experience from the first event of a feminist federation in Barcelona with Ca la Dona.
As a form of decentralization of power in on-line networks, federation has the curious distinction of being one of the most widespread existing forms of network decentralization while remaining under-appreciated in both academic and technical communities. On the one hand, there is very little academic research into on-line federation as such, with most of the focus being on blockchain. Communities of practice, on the other hand, consider it at best a compromise between ideal peer-to-peer architectures and undesirable centralized architectures ('Federation: Treading the Line Between Technical Compromise and Ideological Choice', 2022), or even as 'the Worst of all Worlds' (Lewis, 2018). At the same time, recent developments on corporate social media have rekindled interest in federated social media software specifically. As an evocative entry point that makes the politics of network technologies tangible, federated social media has the possibility to bring federation as an architecture into the spotlight. This raises the question of what on-line federation as a sociotechnical architecture actually is and allows for. To answer that, I draw from comparative software research to present a typology of different kinds of on-line federation. Additionally, I draw on trace ethnographic methods and participatory observation to discuss how, in federated social media specifically, different kinds of on-line federation are at work simultaneously. Moreover, I also show how the sociotechnical imaginaries (Jasanoff & Kim, 2015) behind federated social media are shifting through wider adoption.
References:
Federation: Treading the Line Between Technical Compromise and Ideological Choice. (2022). In K. Ermoshina & F. Musiani, Concealing for Freedom (1st ed., pp. 148-183). Mattering Press. https://doi.org/10.28938/9781912729227
Jasanoff, S., & Kim, S.-H. (Eds.). (2015). Dreamscapes of Modernity: Sociotechnical Imaginaries and the Fabrication of Power. University of Chicago Press. https://press.uchicago.edu/ucp/books/book/chicago/D/bo20836025.html
Lewis, S. J. (2018, July 10). Federation is the Worst of all Worlds. Field Notes. https://fieldnotes.resistant.tech/federation-is-the-worst-of-all-worlds/
My presentation explores the potential of Mastodon to address the proliferation of racism and whiteness on social media. Online racism is challenging to tackle because of its multi-modal, networked formations. It can be propagated by far-right groups, trolling cultures and the whiteness of 'ordinary' users. Racism circulates on corporate social media platforms determined by a frictionless attention economy, in which moderation remains weakly implemented.
Mastodon, operating independently and at a different scale, offers the promise of an alternative social media space able to challenge racism more actively. For example, Gab - the 'free speech' platform associated with far-right and 'extremist' actors - initially deployed Mastodon software. But it was soon blocked by many other instances, effectively isolating Gab from the wider fediverse.
However, people of colour have reported experiencing racism on Mastodon. And Jonathan Flowers (a philosopher of technology) further argues that Mastodon is a 'white space...the norms, the habits, the very structure of that space will take on a likeness to whiteness by virtue of how the majority of people participate in that space' (Technology Policy Press 2022).
In this presentation, I explore Flowers's provocation in relation to decentralization, scale and affordances of Mastodon. In particular, how whiteness (invisibly) organises space, and whether the enduring histories and sociotechnical complexities of online racism can be adequately addressed by Mastodon.
Digital platform monopolists have become unavoidable, global, and autocratic regimes (Cremer et al., 2019; Culpepper & Thelen, 2020; Guggenberger, 2021; Lehdonvirta, 2022). My passion for the internet's technological underpinnings made me realise that this digital imperialism is not a technical necessity: 'federated platforms' (e.g., ActivityPub/Mastodon, Matrix) are fully-operational ecosystems offering a technologically decentralised alternative. In these networks, competencies are distributed across three (vertically separated) levels of protocol definition, software development, and instance operation. Each layer holds different scopes of governance and power (DeNardis, 2012; Gorwa, 2019; Spagnoletti et al., 2015). All end-users can interact within the same (interoperable) protocol ecosystem, allowing within-market competition (Geroski, 2003) among software developers and instance operators. Only individual instance operators retain their respective users' data, and everyone can self-host an instance. Federated platforms are under-researched and offer a unique research site for the first empirical insight into the competition policy measures 'horizontal interoperability' and 'vertical separation' in a multi-sided market context. As such, for my current MSc at the Oxford Internet Institute and my upcoming DPhil at the Department of Computer Sciences, I am working on understanding the relationship between technical network architectures and their resulting economic power structures.
In recent years, communities of cryptography developers have renewed their efforts to create next-generation secure messaging protocols, with a core common objective of creating tools that 'conceal for freedom' while differing in their targeted user publics, the underlying values and business models, and, last but not least, their technical architectures (Ermoshina & Musiani, 2022). This experimentation with different technical architectures has a counterpart in many users' growing mistrust in centralized platforms, endowed, 'by design and by business model', with substantial power to filter content and block user profiles.
In this search for alternatives, so-called 'federated' architectures as the basis of secure messaging and networking are currently experiencing a phase of increased development and use. Federation is believed to help alleviate the very high degree of personal responsibility held by a centralized service provider, while at the same time distributing this responsibility and the material and logistical resources needed by the system, with different possible degrees of engagement, favoring the freedom of users to choose between different solutions and servers according to their particular needs and sets of values.
Rather than focusing on the more 'traditional' online content governance question of whether censoring some of those users is legitimate or not, our paper focuses on the role of informational architectures and infrastructures of federated social media platforms in content moderation processes. Alongside privacy by design (see Cavoukian, 2012), can we speak of online 'safe spaces by design'? We examine in what ways federation can pave the way for novel practices in content moderation governance, merging community organizing, information distribution and alternative techno-social instruments to deal with online harassment, hate speech or disinformation, proposing a model that relies on a multitude of 'safer spaces'. However, this alternative also presents a number of pitfalls and potential difficulties that need to be examined to provide a complete picture of the potential of federated models.
This paper analyses the Fediverse - a platform that proposes a federated infrastructure for microblogging and has been hailed as an example of a 'democratic digital commons' (Kwet, 2020) - as an alternative model for content distribution and moderation, describing briefly its founding principles and key projects. Understanding information architectures from an STS perspective (Star, 1999; Fuller, 2008), we analyze software as co-producing particular forms of participation, and we examine how protocol and interface properties of these federated platforms can diminish possibilities for disinformation, surveillance and online harassment. We will focus on content moderation practices embedded in the architecture of federated tools, but also show the limits of the 'safe space by design' approach and the decisive role of community. The empirical part of the paper is organized around two case studies, Mastodon and Matrix.org.
References
Cavoukian, A. (2012). Privacy by design. IEEE Technology and Society Magazine, 31(4), 18-19.
Ermoshina, K. & Musiani, F. (2022). Concealing for Freedom: The Making of Encryption, Secure Messaging, and Digital Liberties. Manchester, UK: Mattering Press.
Fuller, M. (Ed.). (2008). Software Studies: A Lexicon. Cambridge, MA: The MIT Press.
Kwet, M. (2020). Fixing Social Media: Toward a Democratic Digital Commons. Markets, Globalization & Development Review, 5(1).
Star, S. L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3).
The fediverse has flourished in the wake of techlash. These small networks have allowed new digital public spaces to emerge. But technology, policy, and culture are always closely intertwined. To fend off tech-bromancization, we look to the past to understand the fediverse's possible futures, with a focus on developing strategies to avoid the pitfalls that have plagued other potential digital public spaces. We draw on political economy research, platform moderation research, software studies, and software activist research.
Preliminary results show five arenas in which, given history, the fediverse is vulnerable:
Distributed governance failures. Tensions from distributed governance can result in an accountability and liability vacuum that leads to forking or security vulnerabilities. While co-opting distributed governance has its challenges, corporate actors can still exploit it for their own gain.
Commercial capture. The fediverse can be seen as competition to be squashed or acquired, or as a new frontier for profit-making. Value-added services, cross-platform integration, and proprietary protocols can all be used for the Googlization of the fediverse or the creation of walled gardens.
Moderation nightmares. Digital public spaces have been polluted and become uninhabitable when moderation goes bad. Mastodon's codes of conduct provide a human-scale response. But megaplatforms' moderation experiences, for better and worse, can benefit an evolving fediverse as well.
Reputational issues. Those threatened by public spaces find it easy to attack them reputationally, partly because antisocial subcommunities use them as well as others (P2P/piracy; Tor/criminality; alt-media/alt-right). Addressing the threat requires not only good management but strategic communication.
Techno-romanticism. Web 2.0 brought utopian visions of social media and user participation while obscuring the underlying market structure and commodification of social labor. Fediverse managers need to recognize and avoid simplistic utopianism.
The presentation will discuss a research project now underway with the University of Alberta and other international partners to introduce Mastodon and other Fediverse platforms to resource-constrained organizations in non-profit or public service roles. This project is one facet of a wider research project looking at the growing 'digital monoculture' and associated concerns about the diversity in the digital habitats of non-profits that have become over-reliant on commercial social media. The aim of the wider project is to better understand how to foster greater degrees of digital self-determination at the community level.
The presentation will describe the project's design and discuss our experience in establishing several Mastodon instances for participatory action research. It will then turn to discuss a new initiative intended to introduce Mastodon and other Fediverse platforms to agricultural communities of practice in the Global South (Sri Lanka and the Caribbean). The presentation will conclude by reflecting on lessons learned and suggesting some priorities for participatory research involving alternative social media and community organizations more generally.
The goal of decentralized social media platforms, such as Mastodon and Pleroma, is to distribute power and control away from a small group of large tech companies. However, this decentralization also creates new challenges, including content moderation. In centralized social networks, a team of human moderators typically oversees content moderation. However, this approach is not feasible in decentralized settings due to a lack of human, financial, and infrastructural resources. As a result, harmful content can spread more easily in decentralized social networks.
In this presentation, I will share the findings from our recent measurement study on the spread of toxicity on one of the largest decentralized micro-blogging platforms, Pleroma. Additionally, I will address the challenges of moderating toxic content in decentralized settings. Finally, I will discuss a collaborative moderation mechanism for (semi-)automated content moderation in decentralized contexts.
I hope that the findings presented in this presentation will help make the decentralized web a safer and more welcoming place for everyone.
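One way a collaborative moderation mechanism of the kind mentioned above might be sketched, very loosely, is as peer instances pooling their block decisions and acting only on some level of agreement. The vote-threshold rule below is an illustrative assumption, not the mechanism proposed in the talk:

```python
from collections import Counter

def consensus_blocklist(peer_blocklists, min_votes=2):
    """Flag a domain when at least `min_votes` trusted peer instances block it.

    `peer_blocklists` is a list of domain lists, one per peer instance.
    Each peer counts at most once per domain, so a single peer repeating
    an entry cannot push a domain over the threshold.
    """
    votes = Counter(domain for bl in peer_blocklists for domain in set(bl))
    return sorted(domain for domain, n in votes.items() if n >= min_votes)
```

An admin could review the resulting list rather than applying it automatically, keeping the mechanism semi-automated: the tally surfaces candidates, but the final moderation decision stays human.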
As an alternative to Twitter and other centralized social networks, the Fediverse is growing in popularity. The recent and polemical takeover of Twitter by Elon Musk has exacerbated this trend. The Fediverse includes a growing number of decentralized social networks, such as Pleroma or Mastodon, that share the same subscription protocol (ActivityPub). Each of these decentralized social networks is composed of independent instances that are run by different administrators. Users, however, can interact with other users across the Fediverse regardless of the instance they are signed up to. The growing user base of the Fediverse creates key challenges for administrators, who may experience a growing burden. In this presentation, I will discuss the overhead of moderation on administrators. I will also discuss the proposal of a tool to semi-automate the process of moderation to alleviate this overhead on administrators.
When Elon Musk completed his $44 billion acquisition of Twitter on October 27, 2022, journalists around the world looked on in alarm. The relationship between the media and Musk had been simmering for years. On December 15, the tension reached a boiling point when Twitter suspended and locked out half a dozen journalists who had criticized Musk (Isaac & Conger, 2022). In response, some journalists threatened to leave Twitter and migrated to Mastodon, an alternative social networking site. They added Mastodon usernames to their Twitter handles and profiles and trumpeted their migration. While some have entirely cut ties with Twitter, it remains unclear whether most journalists have permanently left or merely taken a break from the platform.
This 'platform exodus' raised questions about social media platforms and their alternatives, given that leaving a platform is a complex and ongoing process that involves various stages of disengagement, such as non-use, migration, and the possibility of returning. For journalists, migrating from Twitter can be challenging, given the platform's importance in establishing professional connections, constructing their identity, and promoting their news stories.
To better understand the factors influencing journalists' switching behavior, this study applied the push-pull-mooring model of migration theory. Analyzing 874 journalists' activities on Twitter and Mastodon from October 2022 to April 2023, we found little evidence that journalists had entirely abandoned Twitter for Mastodon. Instead, we observed that journalists posted different content on each platform and seemed to be exploring new approaches and audiences, creating distinct personas on each platform.
This study expands on migration patterns by examining user activities on multiple platforms and highlighting the intricate decision-making processes that individuals and organizations undergo when discontinuing a platform. Overall, it provides valuable insights into journalists' migration behavior and sheds light on the complex relationship between social media platforms and their alternatives.
As the limitations of big tech social media platforms have become clearer, users have begun to migrate to open-source and community-governed alternatives. These non-profit, federated, and open platforms are beginning to face the same challenges that faced the original social media platforms as their scale increased. However, the historical trajectory followed by those platforms is not suitable for open alternatives. Commercial platforms build, optimize and maintain their recommender systems in great secrecy, as the 'crown jewels' of their systems. We propose to re-envision social media recommendations for open and community-governed spaces such as Mastodon. We envision four areas of examination when considering how we re-envision social media recommendations for this type of online environment: (1) what would the governance features look like to collectively run a recommender system; (2) what would a recommender system for an open and community-governed space potentially look like; (3) how could we enable individual users to exercise governance over how these recommender systems are designed and implemented; and (4) how can we balance algorithmic transparency and avoid potential adversarial behaviors with content creators on an open platform?
In the face of accelerating climate crisis, the environmental impact of computation, in terms of energy use and emissions as well as e-waste, resource depletion, and pollution from hardware manufacturing, has become a focus of increased interest and concern (see for example Lannelongue et al 2021, Cubitt 2016, Roussilhe et al 2022). Within this context, the computation carried out in the operation of social media platforms has received particular focus (Batmunkh 2022), as has the impact of machine learning and other complex algorithmic computation (Henderson et al 2020). Mastodon advertises itself as a social media service "without algorithms" (joinmastodon.org), and this differentiating factor has been identified as a possible advantage over larger social media platforms in social terms (Kayser-Bril 2022), as well as potentially offering social media services with reduced climate impact due to lowered computational costs. However, this impact is challenging to quantify for a number of reasons. On the one hand, while Mastodon's open-source, self-hosted nature allows us to investigate the material impacts of individual servers more directly, its use of federation makes it challenging to produce representative total or average figures across the entire network: individual instances vary widely, and federation leads to the necessary duplication of material across the network. On the other hand, the closed, proprietary nature of commercial social media platforms offers its own challenges: we are reliant on publicly released information to estimate their energetic and climate impact, which requires different methods and makes comparison with Mastodon particularly difficult.
For this reason, we wish to initiate a conversation on the environmental impact of Mastodon servers. With it, we wish to highlight some of the ways in which Mastodon offers new opportunities for investigating the materiality of social media that are not afforded by proprietary social media platforms. At the same time, we want to use the opportunity to collaboratively inventory the ways this impact could be measured or estimated. We will use our own experiences of the Mastodon servers we administer (https://assemblag.es (Tim) and https://post.lurg.org (Roel)) as a point of departure.
Day 2: Tools and Methods
This session will provide an overview of several available tools to research Mastodon/The Fediverse, followed by a conversation with all attendees.
In this session, Iain Emsley (CIM's Research Software Engineer) will demonstrate the rationale and current status of the research tool for the Fediverse he's leading.
This will be a collaborative discussion aimed at drafting a Methods statement that addresses key questions such as:
- What have people researched so far?
- What should be avoided when researching Mastodon/The Fediverse?
- What kind of research relationship with the platform/community is to be established?
- What is it possible to do? What is not yet possible?
This will be a hands-on group session to explore new projects and discuss how to support and further develop research tools for the fediverse.
Equality Diversity and Inclusion
This workshop aims to promote equality, diversity, and inclusion. We are quite conscious that the programme is gender-imbalanced and that most of the speakers are from the global north. We invite everyone to reflect on this throughout the event, to be aware of their own positionality, and to ensure all speakers are given the opportunity to fully participate.
Funding
This event has been funded by the University of Warwick's Research Development Fund (RD22009), the Centre for Digital Inquiry, and the Sustainable Cities GRP.
Guest Speaker: Robert W. Gehl
Robert W. Gehl is the Ontario Research Chair of Digital Governance for Social Justice at York University, and an alumnus of the Fulbright Canada Research Chair program. He is also an Adjunct Associate Professor in the Department of Communication, Media, and Film at the University of Calgary.
His research focuses on network cultures and technologies, manipulative communication, alternative social media, and the Dark Web; as such, he has been one of the first scholars to research the Fediverse.