Analysis & Publications

2021 — 2020 — 2019 — 2018 — 2017 — 2016


event Published in: May

E-privacy 2021, Italian conference on privacy and digital rights

Two examples of algorithmic bias: polarization on YouTube and heteronormativity on Pornhub. Algorithms are a technological solution to information overload: they are as powerful as they are necessary for managing the flood of data that reaches us. Unfortunately, they can also conceal evaluations and judgments based on biases that affect the spread of ideas and culture. For several years, Tracking Exposed has worked to make these black boxes independently analyzable, both for researchers and for ordinary users. In this talk we discuss two of our most recent studies on the recommendation algorithms of YouTube and Pornhub.

Authors: Salvatore Romano

paper Published in: May

YTTREX: Crowdsourced Analysis of YouTube’s Recommender System During COVID-19 Pandemic

The YouTube collaborative analysis we ran in March 2020, right at the beginning of the COVID-19 pandemic, was published by Springer. See further down this page for the free-to-download PDF.

Authors: Leonardo Sanna, Salvatore Romano, Giulia Corona, Claudio Agosti

See also: Project announcement and update logs

link External resource Authors: Davide Beraldo & Stefania Milan Published in: March

Political advertising exposed: tracking Facebook ads in the 2021 Dutch elections

A unique experiment bringing together academics, the challenges of passive platform scraping, and a newspaper. The authors analyze political advertising messages during the Dutch national campaign ahead of the elections.

link External resource Published in: February

Twitter thread summarizing the YouTube search query analysis

Regarding the January analysis of YouTube, a more accessible explanation that you can RT ;P

Authors: Tracking Exposed

event Published in: January

FilterTube: Investigating echo chambers, filter bubbles and polarization on YouTube — DMI UvA Winter School project pitch

Abstract: This paper studies the construction of filter bubbles and political polarization under YouTube's algorithmic personalization, at a time when political division runs deep in the US and the 2020 election reaffirms the polarization. Using artificially generated personalized user accounts, we find that search results differ according to users' political affiliations, both in the media type and in the political ideology of the channels suggested. This provides some empirical evidence of the existence of filter bubbles on YouTube, which may exacerbate echo chamber behavior and enhance political polarization in the US political debate. Project coordinated by Salvatore Romano and Davide Beraldo, with Giovanni Rossetti and Leonardo Sanna.

Authors: Salvatore Romano

See also: Final presentation slides


Amazon algorithm analysis for product and search results:

Tracking Gender Bias in Amazon Search Results

Is the Same Everywhere?

Choose Your Price: Windows 10 vs. macOS

Does Amazon know your Wealth?

Amazon’s Choice: An inquiry into Amazon

link Published in: September

(Italian) radio interview before Tracking Exposed Workshop

DisruptionLab in Berlin organized a three-day event on data tracking, with panel discussions, keynotes, and workshops. Tracking Exposed hosted a workshop named 'smash the filter bubble'. Salvatore was interviewed by a radio show in Cologne that also broadcasts in Italian.

See also: Workshop schedule, Facebook event

link External resource Author: Matthew Linears Published in: September

Designing data transparency – ideas from the community

YourData is openDemocracy's project to bring more transparency to data use on the web. Personalisation is where websites show you specific content depending on data they have about you. Like showing you information about floral dresses because they think you're a woman, or more articles about Bernie Sanders because you're viewing from the US. -- This is the opening of the YourData initiative from openDemocracy; as Tracking Exposed we participated with a few proposals.

See also: articles from Matthew Linears

paper External resource Author: Dimitri Koehorst (UvA master thesis) Published in: September

Warehouse of information: Amazon's data collection practices and their relation to GDPR

In recent times, data has become increasingly central to a variety of different companies. While the use of data has become widespread, there are some companies whose entire business model revolves around the use of data. One such company is Amazon. Initially it was merely an online bookstore, but as the company grew it incorporated multiple new branches, such as Amazon Web Services, which allow the company to collect data from a variety of different sources. Companies such as Amazon use this data to optimize their services, which allows them to gain certain advantages over their competitors. However, this usage of data is bound by international regulations, one of which is the GDPR, the new data protection legislation of the European Union. By using data collected from the webstore as a case study, this thesis investigates the shift of companies towards a data-oriented business model, and investigates certain problems that this shift brings. This is done through the research question: How can we conceptualize the data collection practices of Amazon in relation to the General Data Protection Regulation?

paper Opinion piece by: Leonardo Sanna Published in: June

Implementing Eco’s Model Reader with Word Embeddings. An Experiment on Facebook Ideological Bots

A first outline of a methodology for computational pragmatics using the FBTREX dataset. In the paper, we found that the algorithm creates different model readers for each relevant theme in the dataset, and that right-wing ideology was dominant across profiles.

link Published in: Summer

YouTube collaborative observation

We applied collaborative observation to YouTube, regarding the management and personalization of COVID-19 informational videos; we released open data, improved the technology, and wrote a paper with preliminary findings.

See also: preprint paper (14 pages), analysis updates

paper Opinion piece by: Urbano Reviglio, Claudio Agosti Published in: April

Thinking Outside the Black-Box: The Case for “Algorithmic Sovereignty” in Social Media

This article is an interdisciplinary critical analysis of personalization systems and the gatekeeping role of current mainstream social media. The first section presents a literature review of data-driven personalization and its challenges in social media. The second section sheds light on increasing concerns regarding algorithms’ ability to overtly persuade—and covertly manipulate—users for the sake of engagement, introducing the emergence of the exclusive ownership of behavioral modification through hyper-nudging techniques. The third section empirically analyzes users’ expectations and behaviors regarding such data-driven personalization to frame a conceptualization of users’ agency. The fourth section introduces the concept of “algorithmic sovereignty.” Current projects that aim to grant this algorithmic sovereignty highlight some potential applications. Together this novel theoretical framework and empirical applications suggest that, to preserve trust, social media should open their personalization algorithms to a social negotiation as the first step toward a more sustainable social media landscape. To decentralize the immense power of mainstream social media, guarantee a democratic oversight, and mitigate the unintended undesirable consequences of their algorithmic curation, public institutions and civil society could help in developing and researching public algorithms, fostering a collective awareness so as to eventually ensure a fair and accountable “algorithmic sovereignty.”

link Published in: March

Pornhub collaborative observation

Our team worked on a new concept for algorithm analysis: collaborative observation. A group of people perform the same operations for 24 hours; we began this experiment by observing Pornhub's personalization and recommendation algorithms.

See also: web slides

article Opinion piece by: Syver Petersen Published in: March

Exploring Facebook’s role in Ethiopia’s rising ethnic tensions

Africa’s second-most populous country is undergoing a political revolution forcing old and new ethnic grievances to the surface. A blog post on how Facebook is involved in Ethiopia’s political transition, and how the fbtrex tool supports exploring differences in information diets between the country’s largest ethnic groups.

See also: blogpost

event Opinion piece by: Salvatore Romano Published in: January

YouTube Tracking Exposed: Investigating polarization via YouTube’s Recommender Systems — DMI UvA Winter School project pitch

Collective group work on the polarization of the Brexit discussion as seen through YouTube's personalization algorithm. We found that: (1) there is evidence of progressive polarization of recommendations around Brexit on YouTube, especially for Leave-inclined users; (2) the Leave/Remain content bubbles, constituted respectively by The Sun/The Telegraph and The Guardian/The Mirror YouTube channels, rarely converge; (3) mainstream media are recommended more regularly than natively digital channels. Project coordinated by Salvatore Romano and Davide Beraldo.

See also: Final project report, Project Pitch slides (on Prezi), Final presentation slides

article Opinion piece by: Salvatore Romano Published in: January

Youtube Tracking Exposed — DMI UvA Winter School Tutorial

A tutorial explaining the possible uses of the ytTREX tool; try it at

Authors: Salvatore Romano

See also: Tutorial slides


event Opinion piece by: Salvatore Romano Published in: December

Assembly with the Amazon workers of the ADL Cobas grassroots trade union

An informal discussion with some Amazon workers enrolled in the grassroots trade union ADL Cobas Padova-Bassa Padovana and American activists from the Amazon Employees for Climate Justice group. After a brief introduction to the amTREX tool, we discussed how Amazon's app tracks employees, trying to identify strategies to reduce the amount of data extracted and reflecting on GDPR compliance as a potential tool for trade union negotiations. Salvatore Romano participated for Tracking Exposed.

event Published in: December

RAI national television (Italian documentary on Amazon)

A long documentary on the Amazon empire, including our original research on how algorithm accountability tools can help infer personal data usage in personalization algorithms. Our research is displayed and explained in the video on surveillance capitalism and tools for personal investigation. The report is quite basic and lacks the robustness of a large-scale tool. -- Featuring Claudio Agosti, Riccardo Coluccini, Giulia Corona, Salvatore Romano. To see the video you have to enter 'trex'.

See also: Our new supported platform

event Published in: November

Porno, Algoritmi e Tordimatti!

A special event to announce! In Italian. We tried a new format, and look forward to replicating it.

event Published in: October

KiKK - The resistance against algorithm monopoly

How much of your information comes from YouTube or Facebook? The Internet was born as a decentralized network of knowledge and technologies, but nowadays two corporations shape our cultural reality. This talk tries to convey the power exerted by online platforms. As a society, we are not following it, seeing it, or fearing it, and then regulating and adjudicating it. Claudio Agosti will talk about a free software project meant to enable people to understand, play with, and criticize how algorithms interfere with our perception of reality.

event Published in: October

World Forum Democracy

Social media are at the core of information nowadays. This lab will tackle the pressing issue of quality control of information shared on social media, mainly through monitoring and accountability mechanisms. How can we use social media as an ally for critically assessing topical subjects? How do we hold them accountable for the information that goes through them? Are social media moderation and freedom of expression compatible? -- Leonardo Sanna has been a contract doctoral student at the University of Modena and Reggio Emilia (Italy) since November 2018, where he has been working on the analysis of Big Data from a semiotic perspective. His research focuses on a combination of quantitative and qualitative methods for social media analysis. Currently, he is studying, on Facebook, the two phenomena known as 'filter bubble' and 'echo chamber'. In particular, he works on the data of the FBTREX group.

Authors: Leonardo Sanna

event Published in: September

Beyond Future design

Accountability and AI

article External resource Author: Alex Fanta Published in: July

Facebook's Algorithm Shapes Our Lives. This Hacker Wants to Find Out How.

Netzpolitik interview with Claudio Agosti on the Tracking Exposed project and plans.

link Published in: June

Algorithm exposed: YouTube — DMI UvA Summer School

A dozen scholars try to measure how YouTube's algorithm personalizes the 'related' video list.

See also: Final report

link External resource Author: DATACTIVE Published in: May

When corporations pretend to help: why we need data activism

The statement on the EU19 Tracking Exposed project website explains why academic research should not be limited by corporate conditions for research; we should engage in independent critical research into platforms that are important for our online public democratic spaces.

Authors: Claudio Agosti

See also: ALEX blogpost

article External resource Author: Alex Fanta Published in: May Language: German

Facebooks Algorithmus formt unser Leben. Dieser Hacker will herausfinden wie.

Netzpolitik interview with Claudio Agosti on the Tracking Exposed project and plans.

link External resource Author: Privacy International Published in: May

Data Exploitation in the Italian Elections

An inclusion of our analysis in a collection of tools to assess misinformation in electoral campaigns.

Authors: Fabio Chiusi, Claudio Agosti

link External resource Author: Berlin DSSG

Datathon organized with Berlin Data Science Social Good

Data scientists analyzed one year of fbtrex data. We performed a privacy assessment and defined a data minimization and confidentiality agreement, as collaborators in the project for the duration of the experiment.

See also: Final presentation (slides)

article External resource Author: Paola Pietrandrea Published in: April Language: French

Devoiler Les Algorithmes Pour Sortir De Nos Bulles

The personalization algorithms used by social networks induce a segregation effect. In this post, the journalist assembles and analyzes a few of our articles, integrating them with answers from Claudio, Umberto, Stefania and Federico.

article Published in: May

Popping the Bubble

Don't delete your Facebook profile: give it to science. An essay explaining our vision, results, and goals.

Authors: Umberto Boschi, Federico Sarchi

article Published in: February

Personalization algorithms and elections: breaking free of the filter bubble

Personalisation algorithms allow platforms to carefully target web content to the tastes and interests of their users. They are at the core of social media platforms, dating apps, shopping and news sites. In this Op-ed on Internet Policy Review we share the project vision.

Authors: Stefania Milan and Claudio Agosti

video Published in: February

Transmediale: Affects Ex-Machina: Unboxing Social Data Algorithms

Conventional media have long filtered information and influenced public opinion. In the age of social media, this process has become algorithmic and targeted, separating the whole of society into thousands of small filter bubbles that construct collective orientations and pilot viral phenomena. This panel examines how machine learning and obscure algorithms analyze and manipulate individual affects into political sentiments, eventually amplifying class, gender, and racial bias ― with Claudio Agosti, Ariana Dongus, Nayantara Ranganathan, Caroline Sinders. Organized by KIM | HfG Karlsruhe

See also: Video

event Published in: February

How to unmask and fight online manipulation

At the EDPS working group against misinformation. We highlighted how research can use our tools and assign proper responsibilities to the actors in the misinformation chain. Platforms are not neutral; we looked at how the algorithm affects information flows.

event Published in: January

CPDP - Safeguarding elections an international problem with no international solution

Coordinated by TacticalTech. ― There is a growing body of research into data-driven elections world-wide and the international nature of the data and elections industry has been highlighted: from international platforms, to strategists in one country advising political groups in another, to paid targeted ads across borders. ― Ailidh Callander, Claudio Agosti, Paul Bernal, Victoria Peuvrelle

See also: Video

event Published in: January

PrivacyCamp - Towards real safeguards: Data driven political campaigns and EU election

This panel aims to evaluate potential preventive mechanisms such as Facebook algorithmic transparency around online political targeting, EU Commission’s Action Plan against Disinformation, awareness raising on current and future campaigning practices, as well as efforts to protect media pluralism and freedom. ― With Fanny Hidvegi, Elda Brogi, Claudio Agosti, Josh Smith and Eleonora Nestola

link Published in: January

Facebook algorithm analysis during the European Election: a campaign

Our goal was to build a replicable campaign. Researchers and activists are invited to reach out to us; we can help replicate the campaign in times of conflict, during electoral campaigns, or for general observation of how forces distort the perception of public debate.

See also: Action plan written in November 2018


Facebook Algorithm Exposed, DMI UvA Winter School

An experiment with a dozen scholars: keeping bots alive, testing the algorithm, and seeing and playing with data.

Authors: Giovanni Rossetti, Bilel Benbouzid, Davide Beraldo, Giulia Corona, Leonardo Sanna, Iain Emsley, Fatma Yalgin, Hannah Vischer, Victor Pak, Mathilde Simon, Victor Bouwmeester, Yao Chen, Sophia Melanson, Hanna Jemmer, Patrick Kapsch, Claudio Agosti, Jeroen de Vos

See also: slides

article External resource Authors: Martin Untersinger and Pauline Croquet Published in: January Language: French

Réseaux sociaux, données personnelles, algorithmes… comment inventer un futur numérique plus radieux ?

A general-audience technology article for Le Monde, written by two journalists at the CCC (see below).


video Published in: December

CCC — Analyze the Facebook algorithm and reclaim algorithm sovereignty

Facebook's monopoly is an issue, but looking for replacements is not enough. We want to develop critical judgment on algorithms and on why data politics matter, and to educate and raise awareness among a broad audience.

See also: slides, video

paper Published in: November

Fairness in online social network timelines: Measurements, models and mechanism design

(PEVA) Performance Evaluation 2018. DOI:10.1016/j.peva.2018.09.009

Authors: Eduardo Hargreaves, Claudio Agosti, Daniel Menasche, Giovanni Neglia, Alexandre Reiffers-Masson, and Eitan Altman

paper Published in: October

Biases in the Facebook News Feed: a Case Study on the Italian Elections

Fosint-SI 2018, in conjunction with ASONAM 2018, Proceedings of the 2018 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining

Authors: Eduardo Hargreaves, Claudio Agosti, Daniel Menasche, Giovanni Neglia, Alexandre Reiffers-Masson, and Eitan Altman

See also: DOI

paper Published in: July Language: Portuguese (Brazil)

Visibilidade no Facebook: Modelos, Medições e Implicações

Proceedings of the Brazilian Workshop on Social Network Analysis and Mining

Authors: Eduardo Hargreaves, Daniel Sadoc Menasché, Giovanni Neglia, and Claudio Agosti

See also: Paper

paper Published in: July

Italian political election and digital propaganda

TacticalTech published a report written by Claudio Agosti and Fabio Chiusi.

See also: Open data

paper External resource Author: WebFoundation Published in: April Language: English, Spanish

The invisible curation of content | Facebook’s News Feed and our information diets

The Web Foundation released a report produced in a joint collaboration; we performed a test in Argentina, releasing open data and an analysis of six profiles run by us. The experiment was meant to measure the algorithm's influence on the perception of public debate.

Authors: Renata Ávila, Juan Ortiz Freuler and Craig Fagan. Claudio Agosti and the Facebook Tracking Exposed team

See also: Open data

link Published in: April Language: Italian

Italian election 2018, our research output

An original analysis with profiles under our control. A fascinating series of discoveries on how to measure the algorithm's space. This is the same analysis we talked about over the following 10 months, because it was iconic, insightful, and quite hard to coordinate.

Authors: Federico Sarchi, Claudio Agosti, Costantino Carugno, Barbara Gianessi, Riccardo Coluccini, Raffaele Angus, Laura Boschi, Gianluca Oldani, Umberto Boschi, Manuel d’Orso


event Published in: November

World Forum Democracy — Bursting social media echo chambers

The lab will examine the detrimental effects of social media filter bubbles and algorithms and will explore solutions to make readers more aware of their reading habits and help them to integrate different worldviews.

See also: Laboratory: two-page final report

article Published in: October

Could populism be a side effect of the Personalized Algorithm?

Rampant speculation in the title, and a more rational analysis of how to display the impact of algorithms to social media users.

article Opinion piece by: Luca Corsato Published in: October

video Published in: July

SHA2017 — Exposing what Facebook wants you to see

A talk about our early version of fbTREX, after one year of existence.

See also: Slides

article Published in: May

Facebook algorithm and impact on media: French election experiment #1

For the first time we used bots, i.e. dummy profiles under our control, to test the algorithm's discrimination.

Authors: Claudio Agosti, Raffaele Angus

article External resource Author: Andrea Gentili Published in: April

The algorithm medley: explaining

At the International Journalism Festival, their magazine covered the talk (see below)

event Published in: April

Panel discussion "exposing what Facebook wants you to see"

International Journalism Festival, Perugia

Authors: Renata Avila, Federico Sarchi, Claudio Agosti

See also: Video

article External resource Author: Sanne Terlingen Published in: March Language: Dutch

Deze tool checkt of Facebook écht de verkiezingen beïnvloedt

The Netherlands elections were our first public experiment. It was partly a failure, because we learned how different profiles across the social network are. This, and the language barrier, made the analysis not insightful enough to report.

event Published in: January

Torino Hack Night

Toolbox Coworking, Turin

Authors: Costantino Carugno, Gilberto Conti


event Published in: November project announcement

At c-base, Berlin, one of the first videos of fbTREX in the wild, when the beta version was getting started.

See also: Web slides


We got the first logo(s)

Luca Corsato built opensensorsdata (OSD) with Andrea Raimondi and Simone Cortesi. They were the very first sponsors of Tracking Exposed. Among other help, the day before the presentation Luca sent the first logo, already declined into variants for Twitter and YouTube.

event Published in: October (code show-off)

At the C-Base Hack'n'Tell, where Alberto won the monthly prize, our new web extension was released!

event Published in: October (project pitch)

At PyData, the very first presentation by Alberto, when he started to develop the new web extension.

Authors: Alberto Granzotto

video Published in: September

a GIF!!

An animated gif explaining our project, alpha stage (RARE! don't watch it too much)

Authors: Michele Invernizzi, Density Design of Politecnico Milan

event Published in: September Language: Italian

Cyber Resistance in 2016 consists in doing algorithm reversing!

This is the project's public inception! The original title continued with 'not encryption anymore', but that might sound misleading. Encryption is a fundamental element of protection; it is simply that the impact of social media on our perception of reality is unmeasured, subtle, and potentially scary. But this call is not out of fear: it is because, with centralization, we as individuals lose the ability to control our own algorithm. P.S. Although this is the first public appearance of the project, the very first birthday was here: !

Authors: Claudio Agosti