New Horizon Europe project

The University of Urbino, and more specifically the Mapping Italian News team, is a member of a consortium successfully funded by the Horizon Europe programme. The goal of the project is to develop novel AI- and network-science-based methods that assist verification professionals throughout the complete content verification workflow.

More specifically, the MINE team will be involved in activities aimed at the discovery, tracking, and impact measurement of disinformation narratives and campaigns across social platforms, modalities, and languages, through integrated AI and network science methods. The project is co-funded by the European Commission under grant agreement ID 101070093, and by the UK and Swiss authorities.

Italian snap election

Given the upcoming snap election, we updated our map and started monitoring the coordinated accounts in our lists. Election-related activities are described on a dedicated website:

Global Coordinated Link Sharing Behaviour Map (2022 edition)

An updated version of our CLSB global map is now available. We have also released the lists of detected coordinated accounts and networks on GitHub.

Using Meta's URL Shares Dataset, a collection of URLs shared on the platform between January 2017 and December 2021 with their aggregated engagement metrics, user feedback, and third-party fact-checker ratings, we obtained a global list of 25,870 news stories (original URLs) rated as problematic ("false", "mixture or false headline", or "missing context") by Facebook's third-party fact-checkers (see the "Third Party Fact-Checker Ratings and Precedence Rules Explained" section of the URLs Data Set documentation).
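As a rough illustration, the filtering step amounts to keeping only the URLs whose fact-checker rating falls in the problematic set. The sketch below is in Python and the record layout and field names are hypothetical, not the actual schema of the URL Shares Dataset:

```python
# Ratings treated as problematic, following the categories cited above.
PROBLEMATIC = {"false", "mixture or false headline", "missing context"}

# Hypothetical records mimicking fact-checker ratings attached to shared URLs.
urls = [
    {"url": "http://example.org/a", "tpfc_rating": "false"},
    {"url": "http://example.org/b", "tpfc_rating": "true"},
    {"url": "http://example.org/c", "tpfc_rating": "missing context"},
    {"url": "http://example.org/d", "tpfc_rating": None},  # never fact-checked
]

# Keep only URLs whose rating is in the problematic set.
problematic_urls = [r["url"] for r in urls if r["tpfc_rating"] in PROBLEMATIC]
print(problematic_urls)  # ['http://example.org/a', 'http://example.org/c']
```

On the real dataset the same selection would be applied per country before aggregating into a global list.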

CooRnet and NewsGuard

CooRnet, the R library that detects coordinated link sharing behaviour on Facebook and Instagram, now integrates NewsGuard scores. The new feature allows you to sort detected coordinated networks by the average reliability score of the domains they circulated. See the updated API key section at

Outputs of the "Mapping Coordinated Networks That Circulate Problematic Health Content in India and Nigeria" project are now available!


Despite widespread global concern about the potential detrimental effects of misinformation on democracy, the vast majority of studies still focus on Western countries. As a result, we know disproportionately more about wealthy countries characterized by lasting democratic traditions and pluralistic media systems than about contexts where these institutions are younger and more fragile. This work contributes to filling this gap by applying to the cases of Nigeria and India an approach to map and study networks of coordinated social media accounts that spread problematic health-related content on Facebook and Instagram.

The study is part of the Health Discourse Research Initiative and was funded by the grant GATES INV-004543-02 issued by the Media Ecosystems Analysis Group.

Nigeria working paper download | Full Nigeria report download | Full India report download

New report on coordinated link sharing networks in the lead-up to German elections


In this study, we analyze organic and paid social media communication from political candidates, parties, and other social media users, in the lead-up to the 2021 German federal election. We document the investment in Facebook advertising of the main parties and their targeting strategies, the engagement they reached organically, and the activity of coordinated networks of Facebook pages and groups dealing with the election.

The study was supported by the Media Authority of North Rhine-Westphalia.

Full report download

MINE-FACTS report is now out!


This report presents the outcomes of a project aimed at developing and testing a prototype tool that supports and speeds up the work of fact-checkers and debunkers by surfacing and ranking potentially problematic information circulated on social media with a content-agnostic approach. The tool itself is the result of a multi-year research activity carried out within the Mapping Italian News Research Program of the University of Urbino Carlo Bo to study the strategies, tactics and goals of influence operations aimed at manipulating Italian public opinion by exploiting the vulnerabilities of the contemporary media ecosystem. This research activity produced original studies, public reports, new methods, maps and tools employed to study the activity of nefarious Italian social media actors who amplify the reach and impact of problematic information by coordinating their efforts. Tracking these actors proved instrumental in observing the “infodemic” unraveling during the early days of the COVID-19 outbreak in Italy.
Combining this existing knowledge with a range of original tools and data sources provided by Meta’s Facebook Open Research and Transparency (FORT) initiative and by the International Fact-Checking Network (IFCN) at Poynter, the report: documents those early days by highlighting a list of widely viewed and interacted-with links circulated on Facebook; traces the establishment, growth and evolution of Italian covid-skeptic coordinated networks on Facebook; presents a comprehensive and updated map of the activities performed by these networks of nefarious social media actors; unveils a set of original tactics and strategies employed by these actors to adjust their operations to the mitigation efforts adopted by social media platforms to reduce the spread of problematic information; describes the circulation of three specific pieces of problematic information; and provides an overview of the outcomes of the testing phase of a prototype tool that surfaces and ranks potentially problematic information circulated on social media with a content-agnostic approach.

Full report download | Download the report summary in Italian

Two open post-doc positions in the team

We are seeking two young researchers (preferably with a PhD) to fill two one-year post-doc positions in the MINE team on the following projects:

Job descriptions

Analysis of the circulation of COVID-19 false news on Facebook

The spread of the coronavirus around the world has been accompanied by an increase in the circulation and dissemination of rumors and conspiracy theories. Using IFCN's CoronaVirusFacts dataset, which collects over 8,000 news items from 70 countries and in 40 different languages that have been fact-checked and found to be false or misleading, the research intends to study the circulation of this content on Facebook. Using CrowdTangle and the {CooRnet} R package, the project aims to build a detailed map of the accounts that have contributed to producing and disseminating problematic information on COVID-19, and of the relationships between them. In light of the characteristics of the research, the ideal candidate has prior skills in social media analysis, descriptive and inferential statistics, and social network analysis. Particular consideration will also be given to the candidate's ability to combine quantitative and qualitative methodological approaches.

Mapping Coordinated Networks That Circulate Problematic Health Content in India and Nigeria

As part of a broader project entitled "Research collaboration on mis/disinformation around global health", the research aims to identify and analyze the account networks that have disseminated problematic content on health issues in a coordinated manner in India and Nigeria. Using CrowdTangle and the {CooRnet} R package, the study aims to build a detailed map of the accounts and of the relationships between them. Knowledge about the context of the reference countries will be provided by local project partners. In light of the characteristics of the research, the ideal candidate has prior skills in social media analysis, descriptive and inferential statistics, and social network analysis. Particular consideration will also be given to the candidate's ability to combine quantitative and qualitative methodological approaches.

Type of appointment

These appointments will be offered on a fixed-term, full-time basis for one (1) year.


Location

University of Urbino, Urbino, Italy.

Selection Criteria

Completion of a PhD in a relevant discipline (not mandatory), and demonstrated alignment of research interests and experience with one of the following research areas: media and communication studies, political science, social media analytics, computational social science, computational discourse analysis, social network analysis, issue mapping.

Remuneration and Benefits

The gross salary will be €25,000 per year, inclusive of social security contributions but tax-free (the net salary will be around €20,000).

How to apply

First, create an account, if you don't already have one, to get your personal profile on the national website used in Italy to apply for open academic positions. You can then apply online at (click the "manage your application" button). The application allows you to apply to all the post-doc positions currently open. Please look for the correct project title to apply for the right position.

Application deadline: 23-12-2021 23:59.


Please address any questions to

Most widely viewed links on Facebook during 2020

To contribute to the ongoing debate on widely viewed external links on Facebook, we relied on the latest version of the URL Shares Dataset to address the following question: what were the most viewed URLs shared on Facebook during 2020 in the US, the UK, and the 27 EU countries?

CooRnet and the A-B-C Cycle

One year ago we pushed the first git commits. Happy 1st birthday, {CooRnet} 🎂.

To celebrate, we have just published a new document that illustrates how CooRnet implements the A-B-C Framework.

Third report in the mine_smd covid series

Mine_smd Covid is a series of reports that unveil and describe the networks and strategies used by Italian social media actors to exploit the loopholes of social media platforms and maximize the distribution of their problematic content during the pandemic. All the cases were detected in the context of a larger study on the spread of Covid-19-related mis/disinformation in Italy.

In this third report, we describe a network of 10 Facebook Pages that performed coordinated link sharing. The potential reach of the network is significant, with a cumulative subscriber count close to 2 million users. Each month, the network publishes more than 6,500 posts. The large majority of posts are links (83%), followed by statuses (8%) and photos (7%). However, 8 out of 10 photos also include links in the message/description of the post. The current goal of the network is to drive traffic to the domain, a news source that, according to NewsGuard, fails to meet all basic journalistic standards, is anonymous, and publishes false news about health and partisan right-wing stories without disclosing its editorial line.

The network has a long history of activity. It was spotted performing coordinated link sharing behaviour on highly polarized and false political content in the lead-up to the 2018 and 2019 Italian elections (see the elections report). The report reconstructs the full list of different domains used by this network from 2017 onward and points to a brand-new domain currently in use. Several of the domains shared by this network are featured in fact-checkers' blacklists. 35 news stories posted by this network have been rated as false or misleading by Facebook's Italian third-party fact-checker. Despite this, the posts linking to these 35 stories were cumulatively viewed over six million times and clicked more than five hundred thousand times between 2017 and 2019 on Facebook. The domain was also shared by this network. An analysis of the core Facebook network that shares this domain is available in the second report of this series.

The network also serves as a paradigmatic example of how meme pages can be repurposed to share highly problematic content, as the content posted shifted suddenly from photos to links in the months preceding the 2018 Italian general election. A diachronic analysis of the network's coordinated link sharing activity clearly highlights an escalation of the operation that started immediately after the fall of the “yellow-green” government supported by a coalition of the M5S and the League. The activity further intensified from the covid lockdown onward.

Second report in the mine_smd covid series

Mine_smd Covid is a series of reports that unveil and describe the networks and strategies used by Italian social media actors to exploit the loopholes of social media platforms and maximize the distribution of their problematic content during the pandemic. All the cases were detected in the context of a larger study on the spread of Covid-19-related mis/disinformation in Italy.

In this second report, we describe a network of 26 Facebook Pages that performed coordinated link sharing. The potential reach of the network is significant, with a cumulative subscriber count close to 6 million users. Each month, the network publishes more than 18,000 posts. Most posts are photos (64%), followed by statuses (22%) and links (14%). However, one third of the photos include links in the message/description of the post. The goal of the network is to drive traffic to the domain, a news source that, according to NewsGuard, fails to meet several basic journalistic standards and republishes articles from other media without mentioning the original source. The network is organized in different clusters that, besides various forms of links pointing to the main domain, also post the same image macros at approximately the same time. These types of posts tend to perform better in terms of the volume of interactions received. Besides the obvious economic driver, one specific cluster also appears to be ideologically motivated. Over the year, the network experimented with different strategies aimed at maximizing the exposure of its content and, possibly, sidestepping Facebook's community standards. Starting in March, a growing number of the links shared were posted in the first comment of click-bait posts (either photos or statuses). More recently, the network also started posting links to well-respected journalistic news sources such as La Repubblica and La Stampa.

First report in the mine_smd covid series

Mine_smd Covid is a series of reports that unveil and describe the networks and strategies used by Italian social media actors to exploit the loopholes of social media platforms and maximize the distribution of their problematic content during the pandemic. All the cases were detected in the context of a larger study on the spread of Covid-19-related mis/disinformation in Italy.

In this first report, we describe a network of 34 Facebook Pages that performed coordinated link sharing. The potential reach of the network is significant, with a cumulative subscriber count of over 6 million users. Each month, the network publishes more than 20,000 posts.

The large majority of the content posted consists of inspirational sentences conveyed as image macros. However, 10% of the photos include links in the message/description of the post. In mid-October, the network started posting links to well-respected journalistic news sources such as La Repubblica and La Stampa.

Presentation at #AoIR2020 - July 22 2020

Paper presented at #AoIR2020 - 21st Annual Conference of the Association of Internet Researchers (28-31 October 2020)


Social media, as many scholars have shown, can be used to influence political behavior through coordinated disinformation campaigns in which participants pretend to be ordinary citizens. With specific reference to Facebook, a recent study spotlighted patterns of coordinated activity aimed at fueling the online circulation of specific news stories before the 2018 and 2019 Italian elections, an activity the authors called “Coordinated Link Sharing Behavior” (CLSB). More precisely, CLSB refers to the coordinated sharing of the same news articles within a very short time by networks of entities composed of Facebook pages, groups and verified public profiles. The uncertainty related to the coronavirus outbreak is a unique chance for malicious actors to leverage the anxiety of online publics to reach their goals, filling the information void with problematic content. Given the association between coordination, media manipulation, and problematic information, the entities involved in coordinated online activities offer a privileged perspective on these phenomena. Thus, against the backdrop of the literature and the studies already conducted, this proposal discusses Coordinated Link Sharing Behavior in the context of the informational void of the coronavirus outbreak, analyzing network mutations over time and the content strategies used to exploit the ambiguity associated with the topic.
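The CLSB criterion described above (distinct entities sharing the same URL within a very short time window, repeatedly) can be operationalized quite directly. The following is a minimal Python sketch, not the actual {CooRnet} implementation; the entity names, URLs, and the 30-second window are illustrative assumptions:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical share records: (entity, url, unix_timestamp).
shares = [
    ("page_a", "http://example.org/story1", 1000),
    ("page_b", "http://example.org/story1", 1010),  # 10 s after page_a
    ("page_c", "http://example.org/story1", 5000),  # far outside the window
    ("page_a", "http://example.org/story2", 2000),
    ("page_b", "http://example.org/story2", 2015),  # 15 s after page_a
]

def coordinated_pairs(shares, window=30, min_urls=2):
    """Return entity pairs that shared at least `min_urls` distinct URLs
    within `window` seconds of each other (a toy CLSB criterion)."""
    by_url = defaultdict(list)
    for entity, url, ts in shares:
        by_url[url].append((entity, ts))
    pair_urls = defaultdict(set)
    # For every URL, flag every pair of distinct entities that shared it
    # nearly simultaneously.
    for url, events in by_url.items():
        for (e1, t1), (e2, t2) in combinations(events, 2):
            if e1 != e2 and abs(t1 - t2) <= window:
                pair_urls[tuple(sorted((e1, e2)))].add(url)
    # Keep only pairs that co-shared enough distinct URLs to look coordinated.
    return {pair: urls for pair, urls in pair_urls.items() if len(urls) >= min_urls}

print(coordinated_pairs(shares))  # page_a and page_b co-shared both stories quickly
```

In the published method the time threshold is derived from the empirical distribution of share delays rather than fixed in advance, and the flagged pairs are assembled into networks via graph analysis.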

Extended Abstract:

Presentation at #SMSociety - July 22 2020

Coordinated Link Sharing Behavior as a Signal to Surface Sources of Problematic Information on Facebook

by Fabio Giglietto, Nicola Righetti, Luca Rossi and Giada Marino

Despite widespread concern over the role played by disinformation during recent electoral processes, the intrinsic elusiveness of the subject hinders efforts aimed at estimating its prevalence and effects. While there has been a proliferation of attempts to define, understand and fight the spread of problematic information in contemporary media ecosystems, most of these attempts focus on detecting false content and/or bad actors. For instance, several existing studies rely on lists of problematic content or news media sources compiled by fact-checkers. However, these lists may quickly become obsolete, leading to unreliable estimates. Using media manipulation as a frame, along with a revised version of the original “coordinated inauthentic behavior” definition, in this paper we argue for a wider ecological focus. Leveraging a method designed to detect “coordinated link sharing behavior” (CLSB), we introduce and assess an approach aimed at creating, and keeping updated, lists of potentially problematic sources by analyzing the URLs shared on Facebook by public groups, pages, and verified profiles. We show that CLSB is consistently associated with a higher risk of encountering problematic news sources across three different datasets of news stories and can thus be used as a signal to support manual and automatic detection of problematic information.

Note: The video will be available until July 31, 2021

Full paper:

Introducing CooRnet: an R package to detect coordinated link sharing behaviour

Today we are releasing our R package to detect coordinated link sharing behavior (CLSB) on Facebook and Twitter. All you need is a list of URLs and a CrowdTangle API key, and you are ready to go! Discover CooRnet at

Today, we officially present it at the CrowdTangle Researchers x Journalists CT Meetup #1: The Covid Edition, a meetup involving a new working group where reporters, academics and fact-checkers from Europe and Africa share methodologies and tools for investigating social media.

Researchers x Journalists CT

New peer-reviewed publication: "It takes a village to manipulate the media"

Giglietto, F., Righetti, N., Rossi, L., & Marino, G. (2020). It takes a village to manipulate the media: coordinated link sharing behavior during 2018 and 2019 Italian elections. Information, Communication and Society, 1–25.


Over the last few years, a proliferation of attempts to define, understand and fight the spread of problematic information in contemporary media ecosystems emerged. Most of these attempts focus on false content and/or bad actors detection. In this paper, we argue for a wider ecological focus. Using the frame of media manipulation and a revised version of the ‘coordinated inauthentic behavior’ original definition, the paper presents a study based on an unprecedented combination of Facebook data, accessed through the CrowdTangle API, and two datasets of Italian political news stories published in the run-up to the 2018 Italian general election and 2019 European election. By focusing on actors’ collective behavior, we identified several networks of pages, groups, and verified public profiles (‘entities’), that shared the same political news articles on Facebook within a very short period of time. Some entities in our networks were openly political, while others, despite sharing political content too, deceptively presented themselves as entertainment venues. The proportion of inauthentic entities in a network affects the wideness of the range of news media sources they shared, thus pointing to different strategies and possible motivations. The paper has both theoretical and empirical implications: it frames the concept of ‘coordinated inauthentic behavior’ in existing literature, introduces a method to detect coordinated link sharing behavior and points out different strategies and methods employed by networks of actors willing to manipulate the media and public opinion.


2nd EXPLORING MEDIA ECOSYSTEMS CONFERENCE (Samberg Conference Center at MIT Cambridge, Massachusetts)

The second edition of the Exploring Media Ecosystems conference took place on 2 and 3 March 2020. We were invited to present the latest developments of our work. Below is the abstract of our talk, alongside the related presentation.

Exploring Media Ecosystems Through Partisan Media Attention: Lessons Learned from Two Elections Held in the European Laboratory of Social Media Populisms

During the last decade, Italy proved to be a fertile ground for the rise of digital parties (Gerbaudo, 2018) and social media driven populisms (Engesser et al., 2017). The Italian media system is characterized by a longstanding established media partisanship and political parallelism (Hallin & Mancini, 2004). Online news outlets rival TV as the main source of information and Facebook is reportedly used to get news by 54% of the population (Newman et al., 2019). Additionally, Bradshaw and Howard observed evidence of organized social media manipulation (2019).

Against this backdrop, the project Mapping Italian News Media Political Coverage in the Lead-up to 2018 General Election (MINE) was designed to explore the Italian media ecosystem by analyzing a set of election-related political news stories and their patterns of engagement on Facebook and Twitter. Leveraging a method originally developed for the 2016 US Presidential Election (Benkler et al., 2018), we estimated the political leaning and insularity (the degree to which they are exclusively shared by online actors affiliated with a specific party) of 634 news media sources (2019). We found that supporters of populist parties tend to rely on more insular media sources, and we detected a specific pattern of interaction on Facebook (comments/shares ratio) significantly affected by the insularity of the source and by the sentiment toward different political actors (as expressed in the title/blurb of the news items). The evidence suggested that this pattern of interaction could be associated with attempts to manipulate the information environment (2019).

To deepen the question, we designed a follow-up study. Our research, still in progress, is documenting further strategies used by partisan online communities to amplify the social media reach of certain political news stories and skew the public narrative around certain issues by exploiting the vulnerabilities of social media platforms. Popular content, in fact, tends to spread faster on social media due to algorithms that prioritize better-performing links, images, videos, and posts. These performances depend on an estimate of popularity based on the analysis of quantified attention metrics provided by each social media platform (likes, reactions, views, shares, etc.). The centrality of these metrics offers big rewards to those interested in increasing the visibility of certain content (Zhang et al., 2017). For these reasons, different actors may coordinate their efforts to give content the initial spin which, once detected by the algorithm, may ignite the propagation machine and even attract the attention of mainstream media (Phillips, 2018). This is not at all a new phenomenon: fans' attempts to coordinate their behavior to push certain hashtags into Twitter trending topics date back to at least 2011 (Boyd, 2017). Over the last few years, we have increasingly observed similar practices employed to enhance the spread of political news stories.

Using two datasets of political news stories collected in the six months preceding the 2018 general election and the 2019 European elections, we shed some light on these practices in the context of Italian politics. By looking at the news stories shared by Facebook and Instagram accounts, pages and public groups, we identified several networks performing what we call “Coordinated Link Sharing Behavior”, i.e., the practice of repeatedly sharing the same links within a very short period of time. Both in 2018 and 2019, news stories shared by these networks of coordinated actors received a higher volume of Facebook engagement when compared with other stories, and were consistently associated with a higher percentage of sources blacklisted by fact-checkers as problematic. Moreover, we found that these coordinated networks included many Facebook pages already flagged for spreading problematic information (2019).

Besides presenting these results and the methodology used to estimate partisan media attention in a multi-party context and to detect coordinated link sharing behavior, we discuss the implications of comparing different media ecosystems and outline paths for future studies.


Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford University Press.

Boyd, D. (2017). Hacking the attention economy. Data & Society: Points. Available at: https://points.datasociety.net/hacking-the-attention-economy-9fa1daca7a37.

Bradshaw, S., & Howard, P. (2019). The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation (Vol. 31). Oxford Internet Institute.

Engesser, S., Ernst, N., Esser, F., & Büchel, F. (2017). Populism and social media: how politicians spread a fragmented ideology. Information, Communication and Society, 20(8), 1109–1126.

Gerbaudo, P. (2018). The Digital Party: Political Organisation and Online Democracy (1 edition). Pluto Press.

Giglietto, F., Righetti, N., & Marino, G. (2019). Understanding Coordinated and Inauthentic Link Sharing Behavior on Facebook in the Run-up to 2018 General Election and 2019 European Election in Italy.

Giglietto, F., Righetti, N., Marino, G., & Rossi, L. (2019). Multi-party media partisanship attention score. Estimating partisan attention of news media sources using Twitter data in the lead-up to 2018 Italian election. Comunicazione politica, 20(1), 85–108.

Giglietto, F., Valeriani, A., Righetti, N., & Marino, G. (2019). Diverging patterns of interaction around news on social media: insularity and partisanship during the 2018 Italian election campaign. Information, Communication and Society, 22(11), 1610–1629.

Hallin, D. C., & Mancini, P. (2004). Comparing Media Systems: Three Models of Media and Politics. Cambridge University Press.

Newman, N., Fletcher, R., Kalogeropoulos, A., & Nielsen, R. (2019). Reuters institute digital news report 2019 (Vol. 2019). Reuters Institute for the Study of Journalism.

Phillips, W. (2018). The Oxygen of Amplification. Better Practices for Reporting on Far Right Extremists, Antagonists, and Manipulators. Data & Society Research Institute.

Zhang, Y., Wells, C., Wang, S., & Rohe, K. (2017). Attention and amplification in the hybrid media system: The composition and activity of Donald Trump’s Twitter following during the 2016 presidential election. New Media & Society, 1461444817744390.

Exploring the media ecosystem in Italy

Facebook Full URLs Shares Dataset now available to researchers thanks to Social Science One

After 20 months of work, Social Science One and Facebook have finalized and made available to selected research teams one of the largest data collections ever assembled and shared with social science researchers.

The dataset contains information on roughly 38 million URLs (links) publicly shared on Facebook from January 2017 through 31 July 2019. For each URL, it is possible to know how many times it was reported as false news, hate speech or spam, whether it underwent fact-checking and, if so, the rating it received. The dataset also contains aggregated data on the types of people who viewed it, shared it, interacted with it through a like or other reaction, or shared it without first clicking the link.

Thanks to this previously inaccessible set of information, the dataset allows social scientists to tackle some of the most pressing questions of our time concerning the effects of social media on democracy and elections.

URLs are labelled with the country in which they were predominantly shared. About 1.5 million of the URLs in the dataset were shared predominantly in Italy. The dataset also includes about 15 million records of aggregated data on actions taken on Facebook by Italian users.

User privacy is protected by multiple safeguards: besides providing the data in aggregate form and only for URLs publicly shared at least one hundred times, the identity of individual users is made virtually untraceable by applying "differential privacy" to the URL Shares Dataset. The label differential privacy describes a set of technologies that, by introducing statistical noise into the data, aim to prevent any possibility of re-identifying a single individual whose actions may be represented in the dataset. This technology is progressively establishing itself as a standard for data sharing, so much so that the U.S. Census Bureau, the main national statistical agency of the United States, recently announced that it will release data to the public through this technological solution.
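In its most common form, differential privacy releases each aggregate count after adding Laplace noise whose scale is calibrated to the desired privacy budget. The Python sketch below illustrates the idea only; the epsilon value and the count are illustrative, not the actual parameters used for the URL Shares Dataset:

```python
import math
import random

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with Laplace(0, sensitivity/epsilon) noise added:
    the standard epsilon-differentially-private mechanism for counts."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5          # uniform on [-0.5, 0.5)
    # Inverse-transform sampling of the Laplace distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

random.seed(0)
# A hypothetical aggregated view count, released with privacy budget epsilon = 1.
noisy = dp_count(1500, epsilon=1.0)
print(noisy)  # perturbed, but typically within a few units of 1500
```

With noise of scale 1, about 95% of released counts fall within roughly ±3 of the true value, so aggregate statistics remain usable while any single user's contribution is masked.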

In addition, Social Science One had already made available access to CrowdTangle and to the Ad Library API, tools the various research teams have begun to familiarize themselves with, producing some insights, reports and scientific articles.

Thanks to funding received from the Social Science Research Council within the Social Media & Democracy Initiative, a research team of the University of Urbino Carlo Bo led by Fabio Giglietto is the first in Italy to gain access to these data and the related tools.

Building on the results of the project Mapping Italian News Media Political Coverage in the Lead-up to 2018 General Election (MINE), which documented attempts to amplify certain political news stories in order to maximize their spread and polarize audiences around certain issues, the aim of the research project is to deepen our knowledge of these attempts by using the relationship between Facebook metrics and source insularity, i.e., the degree to which a source receives attention from the online community close to a certain political party.

Some of the research carried out using CrowdTangle is already available on the website dedicated to the Mapping Italian News - Social Media & Democracy project. This tool made it possible to observe the coordinated amplification of problematic sources on Facebook described in the report entitled "Understanding coordinated and inauthentic link sharing behavior on Facebook in the run-up to 2018 general election and 2019 European election in Italy".

The dataset released today makes it possible to observe, as never before, the dynamics of exposure to and interaction with news on Facebook, and their effects on political communication, propaganda, and consensus building.

In this context, over the coming months Italian policy makers and the public will be able to gauge the scope and effectiveness of the initiative Facebook launched, in partnership with external fact-checkers, to counter the spread of false news; to understand the intensity and dynamics driving user reports; and to learn how specific features of news stories and online news sources shape their spread and the ways Facebook users interact with them.

The dataset will also be made available to other research groups that submit a proposal to Social Science One.

Conference of the Associazione Italiana di Comunicazione Politica 2019: 12-14 December 2019, Università di Milano

On 13 December 2019 we took part in the conference of the Associazione Italiana di Comunicazione Politica. On this occasion we presented a paper titled Coordinated and Inauthentic Link Sharing on Facebook in the Run-up to the 2018 General Election and 2019 European Election in Italy, whose slides are available for consultation.

Compol 2019

Understanding coordinated behavior on Facebook between ideology and monetization

Studies on the spread of online disinformation are often based on identifying problematic sources, content, and actors, such as "fake news" sites, bots, and fake accounts. Many recent studies show, however, that tackling disinformation on the basis of these elements is particularly difficult from both a technical and an epistemological standpoint. For instance, the line between false and true is not always clear-cut, bots are hard to identify, and disinformation sites constantly appear, disappear, and change names.

To overcome these difficulties, we have for several months been experimenting with an approach centered on actions rather than on content or actors. Drawing on the concept of Coordinated Inauthentic Behavior that Facebook employs in its fight against networks spreading deceptive content under false pretenses, and on data from the CrowdTangle platform made available under the Social Media and Democracy Research Grant, we developed a methodology that automatically identifies networks of pages, public groups, and verified accounts (which we refer to as "entities") that acted in a coordinated way to share the same link within a very short time span.

"Coordinated Link Sharing Behavior": coordinated amplification of (problematic) sources on Facebook

We have defined this coordinated link-sharing activity on Facebook as "Coordinated Link Sharing Behavior". It can be strategically used to game Facebook's algorithm, artificially inflating the initial popularity of content in order to trigger the algorithmic amplification reserved for the most popular content.

Through a method we developed and released, we identified 24 networks (82 entities) active in the six months preceding the 2018 Italian general election and 92 networks (606 entities) active in the six months preceding the 2019 Italian election for the European Parliament. Details of the method and the results are available in this report.
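The core of this detection logic can be sketched as follows. This is a simplified illustration, not the released implementation (that is the CooRnet R package, which estimates the coordination interval empirically rather than fixing it); the field names and the 60-second threshold are assumptions:

```python
from collections import defaultdict
from itertools import combinations

def coordinated_pairs(shares, threshold_seconds=60):
    """Given (entity_id, url, timestamp) share records, return pairs of
    entities that repeatedly shared the same URL within the threshold.
    Simplified sketch of coordinated link sharing detection."""
    by_url = defaultdict(list)
    for entity, url, ts in shares:
        by_url[url].append((entity, ts))

    pair_counts = defaultdict(int)
    for url, posts in by_url.items():
        posts.sort(key=lambda p: p[1])
        for (e1, t1), (e2, t2) in combinations(posts, 2):
            if e1 != e2 and t2 - t1 <= threshold_seconds:
                pair_counts[frozenset((e1, e2))] += 1
    # Keep only pairs with repeated near-simultaneous co-shares,
    # filtering out one-off coincidences.
    return {pair: n for pair, n in pair_counts.items() if n >= 2}

shares = [
    ("pageA", "http://example.org/1", 0),
    ("pageB", "http://example.org/1", 30),
    ("pageA", "http://example.org/2", 500),
    ("pageB", "http://example.org/2", 510),
    ("pageC", "http://example.org/1", 5000),
]
networks = coordinated_pairs(shares)  # pageA and pageB co-shared twice
```

Connected components over the resulting pairs would then yield the coordinated networks of entities reported above; a page that happens to share a viral link hours later (pageC here) is excluded by the time threshold.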

Report - summary - IT - public

Uniurb in Menlo Park to study Facebook's role during the Italian elections

On 5 and 6 June, a research team from the Department of Communication Sciences, Humanities and International Studies of the University of Urbino, coordinated by Prof. Fabio Giglietto, took part, at Building 29 of Facebook's Menlo Park campus (California), in the first meeting of the 12 research groups selected by the Social Science Research Council to study the social media platform's role in electoral processes.

The meeting brought together some of the most prominent international scholars of political communication, media, and computer science (from Gary King and Robert Faris of Harvard University to Joshua Tucker of New York University and Duncan Watts of Microsoft Research, to name just a few), who had the opportunity to be the first in the world to test new tools designed by Facebook's researchers and engineers to let the selected research teams securely access datasets organized to yield reliable results while preserving users' privacy.

During an event that in many respects can be considered historic, also attended by Elliot Schrage, Facebook's vice president for Communications and Public Relations, a team of scientists, engineers, and lawyers presented an innovative system based on a methodology called "differential privacy". It will allow researchers to access and analyze the available data while protecting users' privacy in a scientifically rigorous way, both from the researchers themselves and from any malicious actors seeking to recover sensitive user information. The methodology applies increasing degrees of data masking the deeper the analysis drills down, removing any possibility of identifying a specific case within the dataset.

Beyond being an opportunity for the various research teams to meet and exchange ideas, the event was also a chance to engage with innovative data analysis tools that open up new opportunities and challenges for social research and for collaboration between researchers and private companies.

Building on the results of the "Mapping Italian News Media Political Coverage in the Lead-up to 2018 General Election" project, previously funded in 2017 by the Open Society Foundation, and thanks to the new Social Science Research Council grant and the data made available by Facebook, the Urbino team led by Prof. Giglietto will study the sharing dynamics of true and false news published during the 2018 Italian general election and the 2019 European election.

Read the press release on the UNIURB website.

First Grants Announced for Independent Research on Social Media’s Impact on Democracy Using Facebook Data