A New HOPE (2022): Porn Platforms Hate Them for Exposing Their Mischief With These Two Weird Tricks
The non-profit organization Tracking Exposed (tracking.exposed/), which fosters digital rights and algorithm accountability, has developed a set of free-software tools (Potrex and Guardoni) intended to shed light on the inner mechanisms of one of today's major porn platforms. Thanks to these tools, Giulia and Alessandro have gained an unprecedented view of the biases and data-processing malpractices that may affect these websites, collecting valuable evidence that has proven useful for academic research and even digital forensics investigations. Their goal is to empower users and help them reclaim the rights recognized by the European General Data Protection Regulation (GDPR), and more. During this talk, they will present their research on abuses spotted on a porn platform whose algorithms appear to operate in a seriously biased way. They will then explore signs of possible data protection law violations and, together with the audience, outline strategies and methodologies for the upcoming analysis of these platforms.
YouChoose.AI: break free from YouTube’s algorithm monopoly with adversarial interoperability
YouChoose.AI is a browser extension that enables users to choose alternative recommendation feeds directly on YouTube.com. This is a critical step towards digital sovereignty: users should be empowered to choose and customize their own recommendation systems. By regaining agency over the content delivery infrastructure, we can regain control over our informational diets. The first alternative feed available on YouChoose is provided by the content creators themselves, who can select the recommendations shown on their own videos. This is a liberating change for content creators, who are currently at the mercy of YouTube's algorithm to distribute their content. YouTube exploits its near-monopolistic position in the video-sharing market to impose its recommendation algorithm. YouChoose challenges this de-facto algorithmic monopoly by making third-party alternatives interoperable with the platform. Since regulation does not currently mandate interoperability, YouChoose plugs itself on top of YouTube in an adversarial way.
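The core idea of such an adversarial feed swap can be sketched as a small pure function. This is a minimal illustration, not YouChoose's actual implementation: the function name, the `videoId` field, and the merging policy (creator picks first, platform suggestions as fallback) are all assumptions made for the example. In a real extension, a content script would fetch the creator's picks from a third-party server and rewrite the recommendation sidebar in the page's DOM.

```javascript
// Hypothetical sketch of an alternative-feed builder: prefer the
// creator's curated picks, fall back to the platform's suggestions,
// and drop duplicates, up to a fixed feed length.
function chooseRecommendations(platformRecs, creatorPicks, limit = 5) {
  const seen = new Set();
  const feed = [];
  // Creator-curated videos come first; platform suggestions fill the rest.
  for (const rec of [...creatorPicks, ...platformRecs]) {
    if (feed.length >= limit) break;
    if (seen.has(rec.videoId)) continue; // skip duplicate videos
    seen.add(rec.videoId);
    feed.push(rec);
  }
  return feed;
}

// Example: two creator picks displace the platform's top suggestions.
const feed = chooseRecommendations(
  [{ videoId: "yt1" }, { videoId: "yt2" }, { videoId: "yt3" }],
  [{ videoId: "cr1" }, { videoId: "cr2" }],
  4
);
console.log(feed.map(r => r.videoId)); // → ["cr1", "cr2", "yt1", "yt2"]
```

The point of the pure-function shape is that the recommendation policy stays independent of the platform's markup: only the thin DOM-rewriting layer has to chase YouTube's page structure.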
Tracking Exposed: a tool for TikTok algorithmic audits and cross-national comparisons
We kicked off with our first talk on TikTok and its growing, unaccountable role in geopolitics. We demo'd our TikTok browser extension, a free (and open) software that monitors TikTok's recommendation algorithm behavior and personalization patterns. Part of our Mozilla Foundation-funded TikTok Observatory, the tool enables researchers and journalists to investigate which content is promoted or demoted on the platform, including content on politically sensitive issues.
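One simple way such cross-national comparisons can be quantified is by measuring the overlap between the feeds observed by two profiles (say, one per country). The sketch below is a hypothetical illustration, not Tracking Exposed's actual methodology: it uses Jaccard similarity over the sets of video IDs each profile was shown.

```javascript
// Hypothetical metric for comparing two observed feeds: Jaccard
// similarity of their video-ID sets. A low score suggests strong
// personalization or regional filtering between the two profiles;
// it does not by itself prove promotion or demotion of content.
function feedOverlap(feedA, feedB) {
  const a = new Set(feedA);
  const b = new Set(feedB);
  let shared = 0;
  for (const id of a) if (b.has(id)) shared++;
  const unionSize = a.size + b.size - shared;
  return unionSize === 0 ? 1 : shared / unionSize;
}

// Example: two profiles share 2 of 4 distinct videos.
console.log(feedOverlap(["v1", "v2", "v3"], ["v2", "v3", "v4"])); // → 0.5
```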
E-privacy 2021, Italian conference on privacy and digital rights
Two examples of algorithmic bias: polarization on YouTube and heteronormativity on Pornhub. Algorithms are a technological answer to information overload: they are as powerful as they are necessary for managing the flood of data that reaches us. Unfortunately, they can also conceal biased evaluations and judgments that affect the spread of ideas and culture. For several years, Tracking Exposed has worked to make these black boxes independently analyzable, both by researchers and by ordinary users. In this talk we will discuss two of our most recent studies on the recommendation algorithms of YouTube and Pornhub.
Transmediale: Affects Ex-Machina: Unboxing Social Data Algorithms
Conventional media have long filtered information and influenced public opinion. In the age of social media, this process has become algorithmic and targeted, separating the whole of society into thousands of small filter bubbles that construct collective orientations and pilot viral phenomena. This panel examines how machine learning and obscure algorithms analyze and manipulate individual affects into political sentiments, eventually amplifying class, gender, and racial bias ― with Claudio Agosti, Ariana Dongus, Nayantara Ranganathan, Caroline Sinders. Organized by KIM | HfG Karlsruhe
CPDP — Safeguarding elections: an international problem with no international solution
Coordinated by TacticalTech. ― There is a growing body of research into data-driven elections world-wide and the international nature of the data and elections industry has been highlighted: from international platforms, to strategists in one country advising political groups in another, to paid targeted ads across borders. ― Ailidh Callander, Claudio Agosti, Paul Bernal, Victoria Peuvrelle
PrivacyCamp — Towards real safeguards: Data-driven political campaigns and EU elections
This panel aims to evaluate potential preventive mechanisms such as Facebook algorithmic transparency around online political targeting, EU Commission’s Action Plan against Disinformation, awareness raising on current and future campaigning practices, as well as efforts to protect media pluralism and freedom. ― With Fanny Hidvegi, Elda Brogi, Claudio Agosti, Josh Smith and Eleonora Nestola
CCC — Analyze the Facebook algorithm and reclaim algorithm sovereignty
Facebook's monopoly is an issue, but looking for replacements is not enough. We want to develop critical judgment about algorithms and about why data politics matter, and to educate and raise awareness among a broad audience.
SHA2017 — The quest for algorithm diversity
Panel discussion: “Exposing what Facebook wants you to see”.
facebook.tracking.exposed project announcement
At c-base, Berlin: one of the first videos of fbTREX in the wild, recorded when the beta version was just getting started.
Cyber Resistance in 2016 consists of algorithm reversing!
This is the project's public inception! The original title ended with 'not encryption anymore', but that might sound misleading. Encryption remains a fundamental element of protection; it is simply that the impact of social media on our perception of reality is unmeasured, subtle, and potentially frightening. But this call is not driven by fear: it is driven by the fact that, with centralization, we as individuals lose the ability to control our own algorithm. P.S. Although this was the project's first public appearance, its very first birthday was here: https://moca.olografix.org/en/moca-en/ !