CIM Early Career Researchers (ECRs) are delighted to invite you to the first DigiMeth festival, a one-day fair of digital methodologies hosted by the Centre for Interdisciplinary Methodologies at the University of Warwick on December 9, 2022.
The DigiMeth festival is a free space for discussion and hands-on trial of cutting-edge digital methodologies in social science, open to interdisciplinary exchange between researchers from different backgrounds. Master’s and PhD students, and anyone else interested in learning digital methodologies for working with social media or digital platform data, are welcome to join.
Digital data generated by private platforms is a pervasive framework for human action, and research into digital data and platform influence is increasingly critical (Burrows and Savage, 2014; Savage and Burrows, 2007; Venturini et al. 2018; Zuboff 2019). Drawing on a long tradition of methodological discussion of digital research methods (Marres and Gerlitz, 2016; Marres, 2017; Rogers, 2013; Venturini et al. 2018), the Festival creates an opportunity for discussion and reflection on how methodologies developed within academic and activist contexts can help us better understand power dynamics in digital data, and the Internet and social media as dynamic spaces of research.
The event includes a keynote and three workshops on different methodologies for digital data collection and analysis.
In the morning:
- Keynote delivered by Claudio Agosti. “A taxonomy of analytical approaches for algorithmic analysis”. 10.00 - 11.00 AM. Duration: 45 minutes. You can find more details about it below.
- Workshop delivered by Claudio Agosti and Salvatore Romano. “TikTok and Shadowban mechanisms”. 11.15 AM - 12.45 PM. Duration: 1.5 hours. You can find more details about it below.
In the afternoon:
- Workshop delivered by Warren Pearce. “Web Data Research Assistant (WDRA): a versatile plug-in for Twitter scraping”. 14.00 - 15.30. Duration: 1.5 hours. You can find more details about it below.
- Workshop delivered by Janna Joceli Omena and Beatrice Gobbo: “Visual Network Narrations”. 15.45 - 17.15. Duration: 1.5 hours. You can find more details about it below.
Light refreshments will be provided throughout the day.
In this keynote, Claudio Agosti from Tracking Exposed (Trex) will illustrate some of the main findings generated by their infrastructure for the algorithmic analysis of social platforms, which has been running since 2016.
To understand and describe the actual power of algorithms, Claudio will present Trex results, comparing multiple forms of algorithmic influence and the different ends of the customization spectrum. The results include:
- Amplitude of information received. First identified in 2017 by Trex researchers on the Facebook platform, this parameter is also known as ‘frequency of variation’ and acts persistently on users’ profiles. We have observed that information width varies depending on the profile analysed: some profiles are persistently targeted by the same posts, while others are exposed to constantly changing content.
- News Quality Ecosystem. Present on the Facebook platform, this parameter identifies a group of mainstream media whose content, regardless of the number of likes and interactions associated with it, is prioritized in the News Feed over other media.
- Semiotic percentage. Our analysis of Facebook allowed us to discover how the social network arbitrarily sets a specific amount of text, videos, and images to be shown to each user. This composition of the timeline can be measured in percentages (e.g. 50% videos, 30% images, 20% text), and for the same number of accesses, displayed posts, and availability of content, Facebook keeps these proportions almost constant over time.
- Future exposure. User profiling is the essential element of the personalization techniques implemented by YouTube. The recommendations proposed to users are based on their past activities and linguistic characteristics. Search results are also personalized and are influenced by content consumption, which produces a divergence in perception between two otherwise identical profiles.
- Shadowban. This information control technique, implemented by each platform, is aimed at decreasing the visibility of content considered to be problematic. Trex enabled us to develop methods to measure the effects of shadowbanning on YouTube and TikTok: this research allowed us to understand which content was affected and how its perception was transformed.
- A/B Testing. Pornhub makes use of personalization in two different cases. The first is related to the homepage, where the user’s interests are assessed extremely quickly and durably, without any consent being granted by the user. The second concerns the longevity of a video in the platform’s database: we observed how Pornhub implements an A/B testing system on newly published videos to refine its recommendations to users.
- Regional pricing. Amazon officially does not change prices on the basis of individual profiling but, in fact, allows sellers to use extremely precise APIs to offer strategic discounts on a regional and behavioural basis.
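The ‘semiotic percentage’ described above is, at its core, a simple proportion over the posts observed in a monitored timeline. As a minimal sketch (using made-up observation data, not actual Trex collection output), the composition could be computed like this:

```python
from collections import Counter

def timeline_composition(post_types):
    """Share (in %) of each content type in an observed timeline."""
    counts = Counter(post_types)
    total = len(post_types)
    return {t: round(100 * c / total) for t, c in counts.items()}

# Made-up observations from one monitored profile
observed = ["video"] * 5 + ["image"] * 3 + ["text"] * 2
print(timeline_composition(observed))  # {'video': 50, 'image': 30, 'text': 20}
```

Tracking these proportions across repeated collections from the same profile is what makes it possible to claim they stay “almost constant over time”.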
Trex research findings have been documented through open data, are based on methodologies that can be replicated on different platforms, and rely on free software available on GitHub (together with academic publications and reports). The tools have been conceptualized, developed, fine-tuned, and tested, first in the laboratory and then in real-life scenarios (Europe, the United States, and Latin America). Because of their open and modular nature, they can be adapted to any existing platform controlled by artificial intelligence (such as Facebook, YouTube, PornHub, Amazon, and TikTok).
Facilitator: Salvatore Romano
The workshop will show participants how to collect and analyze data to investigate TikTok's algorithm.
TikTok Tracking Exposed is free and open-source software for monitoring TikTok's recommendation algorithm and its personalization patterns.
It enables researchers and journalists to investigate which content is promoted or demoted on the platform, including content regarding politically sensitive issues.
With the browser extension installed, every TikTok video watched in that browser is saved to a personal page, along with the suggested videos. Later on, the investigator can retrieve the evidence in CSV format through the public API to compare two different profiles.
The workshop will consist of a collective live experiment, in which each participant will have the chance to install the software and collect data for a cross-national analysis of the recommender system.
At the end, we'll analyze the collected data together, using a toolchain for data analysts based on Gephi and Python notebooks (or a more straightforward tool such as Excel) to get first results immediately.
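As a rough illustration of the kind of analysis this toolchain supports, the sketch below compares the videos recommended to two profiles and builds a profile-to-video edge list of the sort Gephi can import as a bipartite network. The CSV column names here are hypothetical stand-ins, not the actual Tracking Exposed export schema:

```python
import csv
from io import StringIO

def load_videos(csv_text, profile_col="profile", video_col="video_id"):
    """Read a CSV export and return {profile: set of video ids seen}."""
    seen = {}
    for row in csv.DictReader(StringIO(csv_text)):
        seen.setdefault(row[profile_col], set()).add(row[video_col])
    return seen

def overlap(seen, a, b):
    """Jaccard overlap between the videos shown to two profiles."""
    return len(seen[a] & seen[b]) / len(seen[a] | seen[b])

def to_edge_list(seen):
    """Profile -> video edges, importable into Gephi as a bipartite network."""
    return [(p, v) for p, videos in seen.items() for v in sorted(videos)]

# Toy data standing in for two profiles' exports
data = "profile,video_id\nIT,v1\nIT,v2\nUK,v2\nUK,v3\n"
seen = load_videos(data)
print(overlap(seen, "IT", "UK"))  # 1 shared video out of 3 distinct
```

A low overlap between profiles browsing from different countries is exactly the kind of cross-national personalization signal the experiment looks for.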
- Research on keyword geographic blocking (DMI22)
- Report no. 1 on the Russian war and TikTok
- Report no. 2 on the Russian war and TikTok
- Report no. 3 on the Russian war and TikTok
Please bring your own computer with the Chrome browser installed and a TikTok account (personal or new).
For the data analysis, we suggest installing Gephi (gephi.org) and the additional Circular Layout plugin (https://gephi.org/plugins/#/plugin/circularlayout).
Facilitator: Warren Pearce (University of Sheffield)
This tutorial introduces WDRA, a Chrome browser plug-in that quickly and easily collects Twitter data via the platform’s Advanced Search function. While WDRA is less powerful than tools that directly access the Twitter API (e.g. DMI-TCAT or 4CAT), it has some important benefits:
- WDRA can collect some historical data using Twitter’s Advanced Search function. The data collected is more basic than TCAT’s, but it will allow you to do exploratory work, perform basic analysis and, if you have some technical ability, recover additional details in the data with other tools.
- WDRA allows easy collection of data following users rather than keywords/hashtags.
- Using Advanced Search allows data collection via the ‘Top Tweets’ algorithm, which influences the visibility of content on Twitter.
- It does not require a server installation. Instead, WDRA runs as a plug-in to the Chrome browser, enabling basic tweet collection from your desktop.
By the end of this tutorial, you will be able to use WDRA to collect data and save it to your computer, be aware of the methodological implications of the tool in terms of data accessibility and quality, and know when WDRA should or should not be used for your research project.
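Once WDRA has saved tweets to your computer, even very basic scripting supports the exploratory work mentioned above. The sketch below counts tweets per author from a saved CSV; the column names are illustrative assumptions, not WDRA's actual export schema:

```python
import csv
from collections import Counter
from io import StringIO

def top_authors(csv_text, user_col="username", n=3):
    """Count tweets per author in a WDRA-style CSV export.
    (Column names are illustrative, not WDRA's actual schema.)"""
    rows = csv.DictReader(StringIO(csv_text))
    return Counter(row[user_col] for row in rows).most_common(n)

# Toy data standing in for a saved collection
sample = "username,text\nalice,hi\nbob,hey\nalice,again\n"
print(top_authors(sample))  # [('alice', 2), ('bob', 1)]
```

The same pattern (read the CSV, aggregate on one column) covers most first-pass questions: most active accounts, busiest days, most used hashtags.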
Facilitators: Janna Joceli Omena, Beatrice Gobbo
This workshop offers methodological guidance for narrating networks through visual network analysis (VNA) (Venturini et al. 2021) and a technicity perspective on the practice of digital methods (Omena 2021). It is divided into two parts. First, we will introduce the questions we should ask to make sense of network building, and the key principles of VNA. Second, students will work on digital and printed recommendation networks, aiming to narrate what they see.
- Students will be able to explore and identify the main components of a digital network
- Students will reflect on the distinction between network exploration (description tasks) and network narration (insights, findings)
- Students will develop the ability to tell a story about the topic under investigation and what constitutes the network.
Please bring your own computer and get familiar with:
Venturini, T., Jacomy, M., & Jensen, P. (2021). What do we see when we look at networks: Visual network analysis, relational ambiguity, and force-directed layouts. Big Data & Society, 8(1). https://doi.org/10.1177/20539517211018488
Omena, J.J.(2021). Digital Methods and Technicity-of-the-Mediums. From Regimes of Functioning to Digital Research. [Doctoral Dissertation, Nova University Lisbon]. Repositório da Universidade Nova de Lisboa. http://hdl.handle.net/10362/127961
Venturini, T., Bounegru, L., Jacomy, M., & Gray, J. (2017). How to Tell Stories with Networks: Exploring the Narrative Affordances of Graphs with the Iliad. In Studying Culture through Data. https://doi.org/10.1515/9789048531011-014
Sign up form
Claudio Agosti and Salvatore Romano:
Claudio and Salvatore are respectively founder/co-director and head of research of Tracking Exposed, a multidisciplinary group born in 2016 whose mission – you can read their manifesto here – is to create new research methodologies for investigating the power of algorithms, as well as disseminating tools and knowledge to face surveillance capitalism critically.
Since 2016, Trex has developed several tools to study and critically analyse platform algorithms' influence on user perceptions and to understand algorithmic personalization processes. They have collaborated with several academic institutions (such as the DATACTIVE research group at the University of Amsterdam, where they formalised the concept of data donation) and with international media outlets.
Warren Pearce:
Warren is Senior Lecturer within iHuman, leading the Institute’s "Knowing Humans" research theme. His academic interests began with a BA in Geography & Politics at Sheffield, where he developed an interest in environmental politics. From 2012 to 2016, Warren was a Research Fellow at the University of Nottingham’s Institute for Science and Society, working on Making Science Public, a wide-ranging five-year Leverhulme Trust programme focused on the relationship between science, politics and publics. He then joined the University of Sheffield as a Research Fellow in 2016, as Principal Investigator on the ESRC-funded Future Research Leaders project ‘Making Climate Social’, focusing on how climate change is represented and discussed on social media and other digital platforms.
Janna Joceli Omena:
Janna is a teaching fellow in digital methods at the Centre for Interdisciplinary Methodologies, University of Warwick. She is a member of the Public Data Lab and iNOVA Media Lab, where she founded and led the #SMARTDataSprint for six years. Her research focuses on the theory and practice of digital methods. Janna investigates the technicity of computational tools in digital research and their role in facilitating research and shaping the ways of knowing and thinking. She is currently working on methodologies for studying social media bots and network building and interpretation projects with platform data and computer vision.
Beatrice Gobbo:
Beatrice is an Assistant Professor in the Centre for Interdisciplinary Methodologies at the University of Warwick. She received her Ph.D. in Design from the Design Department of Politecnico di Milano, where she was a member of the DensityDesign Lab. Her work and academic interests are positioned at the intersection of information design and computer science. In her PhD thesis, “Embalming and Dissecting AI: Visual Explanations for the General Public”, she describes a mixed methodology for approaching the issue of explainable artificial intelligence from a communication design perspective.