event Published in: 2021 - October
Harvard University: Faculty working group on Social Media Recommendation Algorithms
Round table discussion on social media and recommendation algorithms
Authors: Marc Faddoul
See also: Slides
event Published in: 2021 - October
Forum for Public and Common Goods @ Pre-COP 2021, Milan.
“Climate Change and New Technologies: Aligning Public Administration and Social Activation around Decentralized Local and Global Solutions”. We participated in the round table, presenting the CO-Project: “Greenwashing on YouTube”
Authors: Salvatore Romano and Marc Faddoul
See also: Slides of the CO-Project
event Published in: 2021 - October
National Order of Journalists
Training for 50 Italian journalists of the National Order: “What is an algorithm and why we should care about it. Tools for data journalists”
Authors: Salvatore Romano, Ilir Rama, Daniele Salvini
See also: Slides
event Published in: 2021 - September
End Summer Camp ESC-21, Venice.
Talk about our latest research on YouTube and Pornhub. This was the occasion to present our latest study of Pornhub's recommendation algorithm and its heteronormativity.
Authors: Giulia Corona and Claudio Agosti
See also: heteronormativity and pornography slides
event Published in: 2021 - September
HackMeeting 2021, Italian hacking festival, Bologna.
Talk about our latest research on YouTube and Pornhub: “Smash the filter bubble!”. Workshop on YouTube: tracking climate disinformation and the ranking algorithm's suggestions. Ten-minute talk presenting the YouChoose project.
Authors: Salvatore Romano
See also: Workshop slides
event Published in: 2021 - May
E-privacy 2021, Italian conference on privacy and digital rights
Two examples of algorithmic bias: polarization on YouTube and heteronormativity on Pornhub. Algorithms are a technological solution to information overload: they are as powerful as they are necessary to manage the flood of data that reaches us. Unfortunately, they can also conceal evaluations and judgments based on biases that shape the spread of ideas and culture. For several years, Tracking Exposed has been working to make these black boxes independently analyzable, both for researchers and for ordinary users. In this talk we discuss two of our most recent studies of the recommendation algorithms of YouTube and Pornhub.
Authors: Salvatore Romano
event Published in: 2021 - April
High school seminar: “Digital education, social media and algorithms”, Milan.
Social media, algorithms, and psychology: raising awareness of the digital sphere among young adults
Authors: Salvatore Romano
See also: Associazione Psychè — Long documentary
event Published in: 2021 - January
FilterTube: Investigating echo chambers, filter bubbles and polarization on YouTube — DMI UvA Winter School project pitch
Abstract: This paper studies the construction of filter bubbles and political polarization under YouTube's algorithmic personalization, at a time when political division runs deep in the US and the 2020 election reaffirmed the polarization. Using artificially generated personalized user accounts, we find that search results differ according to users' political affiliations, both in the media type and in the political ideology of the channels suggested. This provides some empirical evidence of the existence of filter bubbles on YouTube, which possibly exacerbates echo chamber behavior and enhances political polarization in the US political debate. Project coordinated by Salvatore Romano with Davide Beraldo, Giovanni Rossetti, and Leonardo Sanna
Authors: Salvatore Romano
See also: Final presentation slides
event Published in: 2020 - January
YouTube Tracking Exposed: Investigating polarization via YouTube's Recommender Systems — DMI UvA Winter School project pitch
Collective group work on the polarization of the Brexit discussion as seen through YouTube's personalization algorithm. We found that: (1) there is evidence of progressive polarization of the recommendations around Brexit on YouTube, especially for Leave-inclined users; (2) the Leave/Remain content bubbles, constituted respectively by The Sun/The Telegraph and The Guardian/The Mirror YouTube channels, rarely converge; (3) mainstream media are recommended more regularly than natively digital channels. Project coordinated by Salvatore Romano and Davide Beraldo
Authors: Salvatore Romano
See also: Final project report — Project Pitch slides (on Prezi) — Final presentation slides
article Published in: 2020 - January
Youtube Tracking Exposed — DMI UvA Winter School Tutorial
Tutorial explaining the possible uses of the ytTREX tool; try it at https://youtube.tracking.exposed
Authors: Salvatore Romano
See also: Tutorial slides
event Published in: 2019 - December
Assembly with Amazon workers of the ADL Cobas grassroots trade union
An informal discussion with Amazon workers enrolled in the grassroots trade union ADL Cobas Padova-Bassa Padovana and American activists from the Amazon Employees for Climate Justice group. After a brief introduction to the amTREX tool, we discussed how Amazon's app tracks employees, trying to identify strategies to reduce the amount of data extracted and reflecting on how GDPR compliance could be used as a tool in trade-union negotiations. Salvatore Romano participated for Tracking Exposed.
Authors: Salvatore Romano
event Published in: 2019 - November
Porno, Algoritmi e Tordimatti!
A special event, in Italian, to announce pornhub.tracking.exposed! We tried a new format that we look forward to replicating.
event Published in: 2019 - October
KiKK - The resistance against algorithm monopoly
How much of your information comes from YouTube or Facebook? The Internet was born as a decentralized network of knowledge and technologies, but nowadays two corporations have become our cultural reality. This talk tries to convey the power exerted by online platforms: as a society, we are not following it, seeing it, and fearing it, and then regulating and adjudicating it. Claudio Agosti will talk about tracking.exposed, a free software project meant to enable people to understand, play with, and criticize how algorithms interfere with our perception of reality.
event Published in: 2019 - October
World Forum Democracy
Social media are at the core of information nowadays. This lab will tackle the pressing issue of quality control of information shared on social media, mainly through monitoring and accountability mechanisms. How can we use social media as an ally for critically assessing topical subjects? How do we hold platforms accountable for the information that goes through them? Are social media moderation and freedom of expression compatible? -- Leonardo Sanna has been a contract doctoral student at the University of Modena and Reggio Emilia (Italy) since November 2018, where he has been working on the analysis of Big Data from a semiotic perspective. His research focuses on a combination of quantitative and qualitative methods for social media analysis. He is currently studying, on Facebook, the two phenomena known as the 'filter bubble' and the 'echo chamber'. In particular, he works on the data of the FBTREX group.
Authors: Leonardo Sanna
event Published in: 2019 - September
Beyond Future design
Accountability and AI
link External resource Published in: 2019 - June
Datathon organized with Berlin Data Science for Social Good
Data scientists analyzed one year of fbTREX data. We carried out a privacy assessment and defined data-minimization and confidentiality agreements, as collaborators on the project for the duration of the experiment.
Authors: Berlin DSSG
See also: Final presentation (slides)
video Published in: 2019 - February
Transmediale: Affects Ex-Machina: Unboxing Social Data Algorithms
Conventional media have long filtered information and influenced public opinion. In the age of social media, this process has become algorithmic and targeted, separating the whole of society into thousands of small filter bubbles that construct collective orientations and pilot viral phenomena. This panel examines how machine learning and obscure algorithms analyze and manipulate individual affects into political sentiments, eventually amplifying class, gender, and racial bias ― with Claudio Agosti, Ariana Dongus, Nayantara Ranganathan, Caroline Sinders. Organized by KIM | HfG Karlsruhe
See also: Video
event Published in: 2019 - February
How to unmask and fight online manipulation
At the EDPS working group against misinformation. We highlighted how research can use our tools and assign proper responsibilities to the actors in the misinformation chain. Platforms are not neutral: we looked at how the algorithm affects information flows.
event Published in: 2019 - January
CPDP - Safeguarding elections an international problem with no international solution
Coordinated by TacticalTech. ― There is a growing body of research into data-driven elections world-wide and the international nature of the data and elections industry has been highlighted: from international platforms, to strategists in one country advising political groups in another, to paid targeted ads across borders. ― Ailidh Callander, Claudio Agosti, Paul Bernal, Victoria Peuvrelle
See also: Video
event Published in: 2019 - January
PrivacyCamp - Towards real safeguards: Data driven political campaigns and EU election
This panel aims to evaluate potential preventive mechanisms such as Facebook algorithmic transparency around online political targeting, EU Commission’s Action Plan against Disinformation, awareness raising on current and future campaigning practices, as well as efforts to protect media pluralism and freedom. ― With Fanny Hidvegi, Elda Brogi, Claudio Agosti, Josh Smith and Eleonora Nestola
event Published in: 2019 - January
Facebook Algorithm Exposed, DMI UvA Winter School
An experiment with a dozen scholars: keeping bots alive, testing the algorithm, and seeing and playing with the data
Authors: Giovanni Rossetti, Bilel Benbouzid, Davide Beraldo, Giulia Corona, Leonardo Sanna, Iain Emsley, Fatma Yalgin, Hannah Vischer, Victor Pak, Mathilde Simon, Victor Bouwmeester, Yao Chen, Sophia Melanson, Hanna Jemmer, Patrick Kapsch, Claudio Agosti, Jeroen de Vos
See also: slides
video Published in: 2018 - December
CCC — Analyze the Facebook algorithm and reclaim algorithm sovereignty
Facebook's monopoly is an issue, but looking for replacements is not enough. We want to develop critical judgment about algorithms and why data politics matter, and to educate and raise awareness among a broad audience.
event Published in: 2017 - November
World Forum Democracy — Bursting social media echo chambers
The lab will examine the detrimental effects of social media filter bubbles and algorithms and will explore solutions to make readers more aware of their reading habits and help them to integrate different worldviews.
See also: Laboratory: two pages final report
video Published in: 2017 - July
SHA2017 — Exposing what Facebook wants you to see
A talk about our early version of fbTREX, after one year of existence
See also: Slides
event Published in: 2017 - April
Panel discussion “Exposing what Facebook wants you to see”
International Journalism Festival, Perugia
Authors: Renata Avila, Federico Sarchi, Claudio Agosti
See also: Video
event Published in: 2016 - November
facebook.tracking.exposed project announcement
At c-base, Berlin: one of the first videos of fbTREX in the wild, when the beta version was getting started
See also: Web slides
event Published in: 2016 - October
facebook.tracking.exposed (code show-off)
At the C-Base Hack'n'Tell, where Alberto won the monthly prize; our new web extension was released!
event Published in: 2016 - October
facebook.tracking.exposed (project pitch)
At PyData, Alberto's very first presentation, when he started developing the new web extension
Authors: Alberto Granzotto
event Published in: 2016 - September Language: Italian
Cyber Resistance in 2016 consists of algorithm reversing!
This was the project's public inception! The original title ended with 'not encryption anymore', but that might sound misleading. Encryption remains a fundamental element of protection; it is simply that the impact of social media on our perception of reality is unmeasured, subtle, and potentially scary. This call is not driven by fear, but by the fact that, with centralization, we as individuals lose the ability to control our own algorithm. P.S. Although this was the project's first public appearance, its very first birthday was here: https://moca.olografix.org/en/moca-en/ !
Authors: Claudio Agosti