E-privacy 2021, Italian conference on privacy and digital rights
Two examples of algorithmic bias: polarization on YouTube and heteronormativity on Pornhub. Algorithms are a technological solution to information overload: they are as powerful as they are necessary for managing the flood of data that reaches us. Unfortunately, they can also conceal evaluations and judgments based on biases that affect the spread of ideas and culture. For several years, Tracking Exposed has worked to make these black boxes independently analyzable, both for researchers and for ordinary users. In this talk we discuss two of our most recent studies on the recommendation algorithms of YouTube and Pornhub.
Transmediale: Affects Ex-Machina: Unboxing Social Data Algorithms
Conventional media have long filtered information and influenced public opinion. In the age of social media, this process has become algorithmic and targeted, separating the whole of society into thousands of small filter bubbles that construct collective orientations and pilot viral phenomena. This panel examines how machine learning and obscure algorithms analyze and manipulate individual affects into political sentiments, eventually amplifying class, gender, and racial bias ― with Claudio Agosti, Ariana Dongus, Nayantara Ranganathan, Caroline Sinders. Organized by KIM | HfG Karlsruhe
CPDP - Safeguarding elections an international problem with no international solution
Coordinated by TacticalTech. ― There is a growing body of research into data-driven elections world-wide and the international nature of the data and elections industry has been highlighted: from international platforms, to strategists in one country advising political groups in another, to paid targeted ads across borders. ― Ailidh Callander, Claudio Agosti, Paul Bernal, Victoria Peuvrelle
PrivacyCamp — Towards real safeguards: Data driven political campaigns and EU election
This panel aims to evaluate potential preventive mechanisms such as Facebook algorithmic transparency around online political targeting, EU Commission’s Action Plan against Disinformation, awareness raising on current and future campaigning practices, as well as efforts to protect media pluralism and freedom. ― With Fanny Hidvegi, Elda Brogi, Claudio Agosti, Josh Smith and Eleonora Nestola
CCC — Analyze the Facebook algorithm and reclaim algorithm sovereignty
Facebook's monopoly is an issue, but looking for replacements is not enough. We want to develop critical judgment about algorithms, explain why data politics matter, and educate and raise awareness among a broad audience.
SHA2017 — The quest for algorithm diversity
Panel discussion: “Exposing what Facebook wants you to see”.
facebook.tracking.exposed project announcement
At c-base, Berlin: one of the first videos of fbTREX in the wild, when the beta version was just getting started.
Cyber resistance in 2016 consists in doing algorithm reversing!
This is the project's public inception! The original title ended with 'not encryption anymore', but that might sound misleading. Encryption remains a fundamental element of protection; the point is that the impact of social media on our perception of reality is unmeasured, subtle, and potentially scary. But this call is not driven by fear: it is because, with centralization, we as individuals lose the ability to control our own algorithm. P.S. Although this is the project's first public appearance, its very first birthday was here: https://moca.olografix.org/en/moca-en/ !