2022 From: Giulia Corona and Alessandro Polidoro Published in: July Language: English

A New HOPE (2022): Porn Platforms Hate Them for Exposing Their Mischief With These Two Weird Tricks

The non-profit organization Tracking Exposed, which fosters digital rights and algorithm accountability, has developed a set of free-software tools (Potrex and Guardoni) intended to shed light on the underlying mechanisms of one of the major porn platforms operating today. Thanks to these tools, Giulia and Alessandro have gained an unprecedented view of the biases and data processing malpractices that may affect these websites, collecting valuable evidence that has proven useful for academic research and even digital forensics investigations. Their goal is to empower users and help them reclaim the rights recognized by the European General Data Protection Regulation (GDPR), and more. During this talk, they will present their research on the abuses spotted on a porn platform whose algorithms appear to operate in a seriously biased way. They will then explore signs of possible data protection law violations and, together with the audience, imagine strategies and methodologies for future analyses of these platforms.

2022 From: Tracking Exposed at RightsCon 2022 Published in: May Language: English

YouChoose.AI: break free from YouTube’s algorithm monopoly with adversarial interoperability

YouChoose.AI is a browser extension that enables users to choose alternative recommendation feeds directly on YouTube. This is a critical step towards digital sovereignty: users should be empowered to choose and customize their recommendation system. By regaining agency over the content delivery infrastructure, we can take back control of our informational diets. The first alternative feed available on YouChoose is provided by the content creators themselves, who can select the recommendations shown on their own videos. This is a liberating change for content creators, who are currently at the mercy of YouTube’s algorithm for the distribution of their content. YouTube exploits its nearly monopolistic position in the video-sharing market to impose its recommendation algorithm. YouChoose challenges this de facto algorithmic monopoly by making third-party alternatives interoperable with the platform. Since regulation does not currently mandate interoperability, YouChoose plugs itself on top of YouTube in an adversarial way.

2022 From: Tracking Exposed at RightsCon 2022 Published in: May Language: English

Tracking Exposed: a tool for TikTok algorithmic audits and cross-national comparisons

We kicked off with our first talk on TikTok and its growing, unaccountable role in geopolitics. We demoed our TikTok browser extension, a free (and open) software that monitors TikTok’s recommendation algorithm behavior and personalization patterns. Part of our Mozilla Foundation-funded TikTok Observatory, the tool enables researchers and journalists to investigate which content is promoted or demoted on the platform, including content on politically sensitive issues.

2021 From: Salvatore Romano Published in: May Language: Italian

E-privacy 2021, Italian conference on privacy and digital rights

Two examples of algorithmic bias: polarization on YouTube and heteronormativity on Pornhub. Algorithms are a technological solution to information overload: they are as powerful as they are necessary for managing the flood of data that reaches us. Unfortunately, they can also conceal evaluations and judgments based on biases that affect the spread of ideas and culture. For several years, Tracking Exposed has worked to make these black boxes independently analyzable, both for researchers and for ordinary users. In this talk we will discuss two of our most recent studies on the recommendation algorithms of YouTube and Pornhub.

2019 From: Claudio Agosti Published in: February

2019 From: Claudio Agosti Published in: January

2018 From: Claudio Agosti Published in: December

CCC — Analyze the Facebook algorithm and reclaim algorithm sovereignty

Facebook’s monopoly is an issue, but looking for replacements is not enough. We want to develop critical judgment about algorithms, explain why data politics matter, and educate and raise awareness among a broad audience.

2017 From: Claudio Agosti Published in: July

2017 Published in: April

Panel discussion: “Exposing what Facebook wants you to see”.

International Journalism Festival, Perugia

2017 From: Claudio Agosti Published in: November

Project announcement: at c-base, Berlin, one of the first videos of fbTREX in the wild, recorded when the beta version was launching.

2016 From: Claudio Agosti Published in: September Language: Italian

Cyber resistance in 2016 consists in algorithm reversing!

This is the project’s introduction to the public! The original title went on to say “not encryption anymore”, but that might sound misleading. Encryption remains a fundamental element of protection; the point is that the impact of social media on our perception of reality is unmeasured, subtle, and potentially frightening. This call is not driven by fear: it is driven by the fact that, with centralization, we as individuals lose the ability to control our own algorithm. P.S. Although this is the project’s first public appearance, its very first birthday was here: !