# Realistic Aspects of Simulation Models for Fake News Epidemics over Social Networks


## Abstract


## 1. Introduction

- We propose an agent-based simulation model where different agents act independently to share news over a social network. We consider two adversarial spreading processes: one for the fake news and one for its debunking.
- We differentiate among agents’ roles and characteristics by distinguishing among common users, influencers, and bots. Every agent is characterized by its own set of parameters, drawn from statistical distributions in order to create heterogeneity.
- We propose a network construction algorithm that favors the emergence both of geographical user clusters and of clusters of users with common interests; this improves the degree of realism of the simulated social network connections. Moreover, we account for the presence of influencers and bots and for unequal trust among agents.
- We propose ways to add time-dynamical effects to fake news epidemic models by modeling the connection time of each agent to the social network and the decay of the population’s interest in a news item over time.

## 2. Related Work

#### 2.1. Fake News and Bot Detection

#### 2.2. Epidemic Modeling

## 3. Model

#### 3.1. Agent Modeling

#### 3.2. Network Construction

- Geographical proximity: A user is likely to be connected with another user if they live nearby, e.g., in the same city. The coordinates of a node are assigned by sampling from a uniform distribution over the square ${[0,1]}^{2}$. This procedure follows the random geometric graph formalism [51] and ensures that geographical clusters appear. We evaluate the distance between two nodes using the Euclidean distance, normalized such that the maximum possible distance is 1. In our simulations, we created an edge between two nodes if their geographical distance was less than a threshold equal to $0.03$.
- Attributes’ proximity: Each node has a set of five “interest” attributes, distributed according to a truncated Gaussian distribution; we employ these parameters to model connections between agents based on their interests. This helps create connections and clusters in the attribute domain, rather than limiting connections to geographical proximity criteria. The distance between two sets of interest attributes is evaluated using the Euclidean distance, normalized such that the maximum possible distance is 1. In our simulations, an edge between two nodes is created if their attribute distance is less than a threshold equal to $0.03$.
- Randomness: To introduce some randomness in the network generation process, an edge satisfying the above geographical and attribute proximity criteria is removed from the graph with a given probability ${p}_{\mathit{rem}}=0.5$.
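The construction steps above can be sketched as follows. The thresholds ($0.03$), the removal probability ($0.5$), and the distributions come from the text; the pairwise loop and the OR-combination of the two proximity criteria are illustrative assumptions, not the paper's exact algorithm:

```python
import numpy as np

def build_network(n=1000, n_attrs=5, geo_thr=0.03, attr_thr=0.03,
                  p_rem=0.5, seed=0):
    """Sketch of the network construction: random geometric graph plus
    attribute proximity, with random edge removal."""
    rng = np.random.default_rng(seed)
    coords = rng.uniform(0.0, 1.0, size=(n, 2))   # positions in [0,1]^2
    # Truncated Gaussian interests in [-1, 1]: resample out-of-range values.
    attrs = rng.normal(0.0, 0.4, size=(n, n_attrs))
    while True:
        bad = (attrs < -1.0) | (attrs > 1.0)
        if not bad.any():
            break
        attrs[bad] = rng.normal(0.0, 0.4, size=int(bad.sum()))

    edges = set()
    max_geo = np.sqrt(2.0)              # max distance in the unit square
    max_attr = np.sqrt(n_attrs * 4.0)   # max distance in [-1,1]^n_attrs
    for i in range(n):
        # Normalized distances from node i to all higher-indexed nodes.
        d_geo = np.linalg.norm(coords[i + 1:] - coords[i], axis=1) / max_geo
        d_attr = np.linalg.norm(attrs[i + 1:] - attrs[i], axis=1) / max_attr
        for j in np.nonzero((d_geo < geo_thr) | (d_attr < attr_thr))[0]:
            if rng.random() >= p_rem:   # keep the edge with prob. 1 - p_rem
                edges.add((i, int(i + 1 + j)))
    return coords, attrs, edges
```

With the small thresholds used in the paper, most edges come from geographical proximity; attribute-based edges join nodes that may be geographically far apart, which is exactly what produces the interest-based clusters.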

#### 3.3. Trust among Agents

#### 3.4. Time Dynamics

In the baseline case without time dynamics, all nodes access the network at the same fixed rate of once every 16 minutes.
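A sketch of how heterogeneous connection times can be generated, contrasted with the fixed 16-minute baseline. Modeling each node's accesses as a Poisson process, and the particular per-node rate range below, are hypothetical choices for illustration; only the 16-minute baseline interval comes from the text:

```python
import numpy as np

def poisson_access_times(rate_per_min, horizon_min, rng):
    """Event times of a Poisson process with the given rate (in min^-1)."""
    t, out = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_per_min)
        if t > horizon_min:
            return out
        out.append(t)

def baseline_access_times(interval_min, horizon_min):
    """Deterministic baseline: one access every `interval_min` minutes."""
    return list(np.arange(interval_min, horizon_min + 1e-9, interval_min))

rng = np.random.default_rng(0)
# Hypothetical per-node rates between one access per hour and one per 5 min.
heterogeneous = [poisson_access_times(rng.uniform(1 / 60, 1 / 5), 24 * 60, rng)
                 for _ in range(100)]
baseline = baseline_access_times(16.0, 24 * 60)   # 90 accesses per day
```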

#### 3.5. Agent-Based Spreading Simulation

- Fake news spreading process: a node creates a fake news item and shares it over the network for the first time. The fake news starts to spread within a population of susceptible (S) agents. When a susceptible node gets “infected” in this process, it becomes a believer (B);
- Debunking process: a node performs fact-checking, becomes “immune” to the fake news item, and starts spreading debunking messages within the population instead. When a node gets “infected” in this process, it becomes a fact-checker (FC).

**Algorithm 1.** Spreading of fake news and debunking.
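A minimal synchronous sketch of one round of the two competing processes. The transition probabilities below (a neighbor's sharing rate multiplied by the target's vulnerability, and an independent recovery draw) are illustrative assumptions; Algorithm 1 in the paper may combine the agent parameters differently:

```python
import random

S, B, FC = "S", "B", "FC"   # susceptible, believer, fact-checker

def spreading_step(state, neighbors, vulnerability, sharing, recovery,
                   rng=random):
    """One synchronous step of fake news spreading and debunking (sketch)."""
    new_state = dict(state)
    for u in state:
        if state[u] == S:
            for v in neighbors[u]:
                if state[v] not in (B, FC):
                    continue
                # Neighbor v shares its message (fake news or debunking);
                # u adopts it with a vulnerability-driven probability.
                if rng.random() < sharing[v] * vulnerability[u]:
                    new_state[u] = state[v]
                    break
        elif state[u] == B and rng.random() < recovery[u]:
            new_state[u] = FC   # the believer fact-checks and recovers
    return new_state
```

Iterating this step until no state changes occur reproduces the qualitative S→B and S/B→FC dynamics described above.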

## 4. Results

#### 4.1. Roadmap

#### 4.2. Baseline Configuration

#### 4.3. Impact of Influencers and Bots

#### 4.4. Impact of Time Dynamics

#### 4.5. Impact of Weighted Networks Expressing Non-Uniform Node Trust

## 5. Discussion

## 6. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Perrin, A. Social Media Usage: 2005–2015. Pew Res. Cent. Int. Technol. **2015**, 125, 52–68.
2. Albright, J. Welcome to the Era of Fake News. Media Commun. **2017**, 5, 87–89.
3. Shrivastava, G.; Kumar, P.; Ojha, R.P.; Srivastava, P.K.; Mohan, S.; Srivastava, G. Using Data Sciences in Digital Marketing: Framework, methods, and performance metrics. J. Innov. Knowl. **2020**, in press.
4. Bovet, A.; Makse, H.A. Influence of fake news in Twitter during the 2016 US presidential election. Nat. Commun. **2019**, 10, 7.
5. Saura, J.R.; Ribeiro-Soriano, D.; Palacios-Marqués, D. From user-generated data to data-driven innovation: A research agenda to understand user privacy in digital markets. Int. J. Inf. Manag. **2021**, in press.
6. Ribeiro-Navarrete, S.; Saura, J.R.; Palacios-Marqués, D. Towards a new era of mass data collection: Assessing pandemic surveillance technologies to preserve user privacy. Technol. Forecast. Soc. Chang. **2021**, 167, in press.
7. Lewandowsky, S.; Ecker, U.K.; Cook, J. Beyond Misinformation: Understanding and Coping with the “Post-Truth” Era. J. Appl. Res. Mem. Cogn. **2017**, 6, 353–369.
8. Aiello, L.M.; Deplano, M.; Schifanella, R.; Ruffo, G. People are Strange when you’re a Stranger: Impact and Influence of Bots on Social Networks. arXiv **2014**, arXiv:1407.8134.
9. Tambuscio, M.; Ruffo, G.; Flammini, A.; Menczer, F. Fact-Checking Effect on Viral Hoaxes: A Model of Misinformation Spread in Social Networks. In Proceedings of the 24th International Conference on World Wide Web; Association for Computing Machinery: New York, NY, USA, 2015.
10. Newman, M.E.J. Spread of epidemic disease on networks. Phys. Rev. E **2002**, 66, 016128.
11. Furini, M.; Mirri, S.; Montangero, M.; Prandi, C. Untangling between fake-news and truth in social media to understand the COVID-19 Coronavirus. In Proceedings of the 2020 IEEE Symposium on Computers and Communications (ISCC), Rennes, France, 7–10 July 2020; pp. 1–6.
12. Fernandez, M.; Alani, H. Online Misinformation: Challenges and Future Directions. In Proceedings of the ACM WWW’18 Companion, Lyon, France, 23–27 April 2018.
13. Bondielli, A.; Marcelloni, F. A Survey on Fake News and Rumour Detection Techniques. Inf. Sci. **2019**, 497, 38–55.
14. Zhou, X.; Zafarani, R. Fake News: A Survey of Research, Detection Methods, and Opportunities. arXiv **2018**, arXiv:1812.00315.
15. Liu, Y.; Wu, Y.B. Early Detection of Fake News on Social Media Through Propagation Path Classification with Recurrent and Convolutional Networks. In Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018.
16. Ma, J.; Gao, W.; Mitra, P.; Kwon, S.; Jansen, B.J.; Wong, K.F.; Cha, M. Detecting Rumors from Microblogs with Recurrent Neural Networks. In Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence; AAAI Press: Palo Alto, CA, USA, 2016; pp. 3818–3824.
17. Socher, R.; Lin, C.C.Y.; Ng, A.Y.; Manning, C.D. Parsing Natural Scenes and Natural Language with Recursive Neural Networks. In Proceedings of the 28th International Conference on Machine Learning, Bellevue, WA, USA, 28 June–2 July 2011.
18. Kaliyar, R.K.; Goswami, A.; Narang, P.; Sinha, S. FNDNet: A deep convolutional neural network for fake news detection. Cogn. Syst. Res. **2020**, 61, 32–44.
19. Bronstein, M.M.; Bruna, J.; LeCun, Y.; Szlam, A.; Vandergheynst, P. Geometric Deep Learning: Going beyond Euclidean data. IEEE Signal Process. Mag. **2017**, 34, 18–42.
20. Monti, F.; Frasca, F.; Eynard, D.; Mannion, D.; Bronstein, M.M. Fake News Detection on Social Media using Geometric Deep Learning. arXiv **2019**, arXiv:1902.06673.
21. Zhou, X.; Jain, A.; Phoha, V.V.; Zafarani, R. Fake News Early Detection: A Theory-driven Model. Digit. Threat. Res. Pract. **2020**, 1, 1–25.
22. Krouska, A.; Troussas, C.; Virvou, M. Comparative Evaluation of Algorithms for Sentiment Analysis over Social Networking Services. J. Univers. Comput. Sci. **2017**, 23, 755–768.
23. Troussas, C.; Krouska, A.; Virvou, M. Evaluation of ensemble-based sentiment classifiers for Twitter data. In Proceedings of the 2016 7th International Conference on Information, Intelligence, Systems & Applications (IISA), Chalkidiki, Greece, 13–15 July 2016; pp. 1–6.
24. Krouska, A.; Troussas, C.; Virvou, M. The effect of preprocessing techniques on Twitter sentiment analysis. In Proceedings of the IISA, Chalkidiki, Greece, 13–15 July 2016; pp. 1–5.
25. Zuckerman, M.; DePaulo, B.M.; Rosenthal, R. Verbal and Nonverbal Communication of Deception. Adv. Exp. Soc. Psychol. **1981**, 14, 1–59.
26. Abdullah-All-Tanvir; Mahir, E.M.; Akhter, S.; Huq, M.R. Detecting Fake News using Machine Learning and Deep Learning Algorithms. In Proceedings of the 2019 7th International Conference on Smart Computing & Communications (ICSCC), Sarawak, Malaysia, 28–30 June 2019; pp. 1–5.
27. Stella, M.; Ferrara, E.; De Domenico, M. Bots increase exposure to negative and inflammatory content in online social systems. Proc. Natl. Acad. Sci. USA **2018**, 115, 12435–12440.
28. Varol, O.; Ferrara, E.; Davis, C.A.; Menczer, F.; Flammini, A. Online Human-Bot Interactions: Detection, Estimation, and Characterization. In Proceedings of the International AAAI Conference on Web and Social Media, Palo Alto, CA, USA, 25–28 June 2018.
29. Yang, K.; Varol, O.; Davis, C.A.; Ferrara, E.; Flammini, A.; Menczer, F. Arming the public with artificial intelligence to counter social bots. Hum. Behav. Emerg. Technol. **2019**, 1, 48–61.
30. Gilani, Z.; Kochmar, E.; Crowcroft, J. Classification of Twitter Accounts into Automated Agents and Human Users. In Proceedings of the 2017 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining; Association for Computing Machinery: New York, NY, USA, 2017; pp. 489–496.
31. Yang, K.C.; Varol, O.; Hui, P.M.; Menczer, F. Scalable and Generalizable Social Bot Detection through Data Selection. Proc. AAAI Conf. Artif. Intell. **2020**, 34, 1096–1103.
32. Kudugunta, S.; Ferrara, E. Deep neural networks for bot detection. Inf. Sci. **2018**, 467, 312–322.
33. Davis, C.A.; Varol, O.; Ferrara, E.; Flammini, A.; Menczer, F. BotOrNot. In Proceedings of the 25th International Conference Companion on World Wide Web (WWW ’16 Companion), Cambridge, UK, 11–14 December 2016.
34. Ferrara, E.; Varol, O.; Davis, C.; Menczer, F.; Flammini, A. The Rise of Social Bots. Commun. ACM **2016**, 59, 96–104.
35. des Mesnards, N.G.; Hunter, D.S.; El Hjouji, Z.; Zaman, T. Detecting Bots and Assessing Their Impact in Social Networks. arXiv **2018**, arXiv:1810.12398.
36. Shrivastava, G.; Kumar, P.; Ojha, R.P.; Srivastava, P.K.; Mohan, S.; Srivastava, G. Defensive Modeling of Fake News through Online Social Networks. IEEE Trans. Comput. Soc. Syst. **2020**, 7, 1159–1167.
37. Murayama, T.; Wakamiya, S.; Aramaki, E.; Kobayashi, R. Modeling and Predicting Fake News Spreading on Twitter. arXiv **2020**, arXiv:2007.14059.
38. Tambuscio, M.; Oliveira, D.F.M.; Ciampaglia, G.L.; Ruffo, G. Network segregation in a model of misinformation and fact-checking. J. Comput. Soc. Sci. **2018**, 1, 261–275.
39. Tambuscio, M.; Ruffo, G. Fact-checking strategies to limit urban legends spreading in a segregated society. Appl. Netw. Sci. **2019**, 4, 1–9.
40. Burbach, L.; Halbach, P.; Ziefle, M.; Calero Valdez, A. Who Shares Fake News in Online Social Networks? In Proceedings of the ACM UMAP, Larnaca, Cyprus, 9–12 June 2019.
41. Ross, B.; Pilz, L.; Cabrera, B.; Brachten, F.; Neubaum, G.; Stieglitz, S. Are social bots a real threat? An agent-based model of the spiral of silence to analyse the impact of manipulative actors in social networks. Eur. J. Inf. Syst. **2019**, 28, 394–412.
42. Törnberg, P. Echo chambers and viral misinformation: Modeling fake news as complex contagion. PLoS ONE **2018**, 13, 1–21.
43. Brainard, J.; Hunter, P.; Hall, I. An agent-based model about the effects of fake news on a norovirus outbreak. Rev. d’Épidémiologie Santé Publique **2020**, 68, 99–107.
44. Cisneros-Velarde, P.; Oliveira, D.F.M.; Chan, K.S. Spread and Control of Misinformation with Heterogeneous Agents. In Proceedings of Complex Networks, Lisbon, Portugal, 10–12 December 2019.
45. Norman, B.; Ann, M.J. Mapping and leveraging influencers in social media to shape corporate brand perceptions. Corp. Commun. Int. J. **2011**, 16, 184–191.
46. Caldarelli, G.; De Nicola, R.; Del Vigna, F.; Petrocchi, M.; Saracco, F. The role of bot squads in the political propaganda on Twitter. CoRR **2019**, 3, 1–5.
47. Erdős, P.; Rényi, A. On Random Graphs I. Publ. Math. Debr. **1959**, 6, 290.
48. Barabási, A.L.; Albert, R. Emergence of Scaling in Random Networks. Science **1999**, 286, 509–512.
49. Watts, D.J.; Strogatz, S.H. Collective dynamics of ’small-world’ networks. Nature **1998**, 393, 440–442.
50. Wahid-Ul-Ashraf, A.; Budka, M.; Musial, K. Simulation and Augmentation of Social Networks for Building Deep Learning Models. arXiv **2019**, arXiv:1905.09087.
51. Dall, J.; Christensen, M. Random geometric graphs. Phys. Rev. E **2002**, 66, 016121.
52. Spricer, K.; Britton, T. An SIR epidemic on a weighted network. Netw. Sci. **2019**, 7, 556–580.
53. Zhou, X.; Zafarani, R. A Survey of Fake News: Fundamental Theories, Detection Methods, and Opportunities. ACM Comput. Surv. **2020**, 53, 1–40.
54. Liu, S.Y.; Baronchelli, A.; Perra, N. Contagion dynamics in time-varying metapopulation networks. Phys. Rev. E **2013**, 87, 032805.

**Figure 1.** Example of a synthetic network generated through our algorithm. We depict common agents in blue, influencers in yellow, and bots in red. We observe the appearance both of geographical clusters involving nearby nodes and of interest-based clusters, joining groups of nodes located far from one another.

**Figure 2.** In-degree (**left**) and out-degree (**right**) distributions for the networks considered in the experiments. The presence of bots and influencers with large out-degrees yields the second out-degree peak between degree values of 30 and 60.

**Figure 5.** SBFC plots (**top row**), network infection graphs (**middle row**), and network recovery graphs (**bottom row**) for fake news spread in simulated online social networks (OSNs). (**Left**) Baseline network. (**Right**) Baseline network plus 30 influencers and 10 bots. In the middle row, the darker a node’s color is, the earlier that node starts believing in the fake news item. In the bottom row, the darker a node’s color is, the earlier that node performs fact-checking.

**Figure 6.** All-time maximum number of believers and fact-checkers over time with and without time dynamics.

**Figure 8.** SBFC plot, network infection, and recovery graphs for the same networks as in Figure 5, but with the addition of eternal fact-checkers and weights on the connections.

**Table 1.** Summary of the contributions and missing aspects of the works in the literature that are most related to our agent-based fake news spreading simulation approach.

Ref. | Summary of the Contributions | Missing Aspects
---|---|---
[9] | Proposes a model for the competitive spreading of a fake news item and its debunking. | No bots, homogeneous agents, no time dynamics.
[21] | Uses supervised learning techniques to detect fake news at an early stage based on its content. | Diffusion dynamics are not among the targets of the study.
[27] | Studies the role of bots in social manipulation and in facilitating the emergence of polarization between communities, so as to exacerbate social conflict online. | Diffusion dynamics are not among the targets of the study.
[38] | Studies the role of network structure in the spreading of fake news, assessing the emergence of segregation. | Same as [9].
[39] | Tests fact-checking strategies on different network topologies. | Same as [9].
[40] | Mixed-method study: proposes a questionnaire to capture the personality of users on a social network. In simulations, agents’ parameters are chosen according to the questionnaires. | No spreading of debunking messages, no time dynamics, no bots.
[41] | Studies the influence of online bots on opinion formation on a social network and shows that bots control the formation of consensus. | Opinion dynamics setting, no time dynamics.
[43] | Uses an agent-based model for the spreading of a physical disease and of information (both online and offline) related to it. The two processes are separate but interact, showing that fake news spreading affects the diffusion of the disease. | No bots.
[44] | Assesses the effects of heterogeneous agents on the competitive spreading of low- and high-quality information. Studies methods to mitigate the spreading of false information. | Simple agent interaction model, no time dynamics.
[36] | Provides a mathematical formalization of a fake news spreading model and studies its dynamical properties. | Homogeneous agents, no competitive spread of the debunking, no bots.
[37] | Proposes a model of the temporal evolution of the engagement of fake news items on Twitter. | Diffusion dynamics are not among the targets of the study.
[42] | Models the formation of clustered communities echoing an opinion, and analyzes the community segregation that results. | Simple agent interaction model, no time dynamics, no bots.

Name | Distribution | Description
---|---|---
State | $\{S, B, FC\}$ | State of the node
Vulnerability | $\mathcal{N}_{0}^{1}(0.5,\,0.2^{2})$ | Tendency to follow opinions
Sharing rate | $\mathcal{N}_{0}^{1}(0.5,\,0.2^{2})$ | Tendency to share the news
Recovery rate | $\mathcal{N}_{0}^{1}(0.2,\,0.2^{2})$ | Tendency to do fact-checking
Interest attributes | $\mathcal{N}_{-1}^{1}(0,\,0.4^{2})$ | Connect nodes based on their interests
Coordinates | $\mathcal{U}(0,\,1)$ | Geographical position of the node
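The truncated Gaussians $\mathcal{N}_{a}^{b}(\mu, \sigma^{2})$ in the table can be sampled with a short rejection-sampling routine (a sketch; the paper does not prescribe a particular implementation):

```python
import numpy as np

def trunc_gauss(mean, sd, low, high, size, rng):
    """Rejection-sample from N(mean, sd^2) truncated to [low, high]."""
    out = rng.normal(mean, sd, size=size)
    while True:
        bad = (out < low) | (out > high)
        if not bad.any():
            return out
        out[bad] = rng.normal(mean, sd, size=int(bad.sum()))

rng = np.random.default_rng(42)
# Agent parameters drawn per the distributions in the table (1000 agents).
vulnerability = trunc_gauss(0.5, 0.2, 0.0, 1.0, 1000, rng)
sharing_rate  = trunc_gauss(0.5, 0.2, 0.0, 1.0, 1000, rng)
recovery_rate = trunc_gauss(0.2, 0.2, 0.0, 1.0, 1000, rng)
interests     = trunc_gauss(0.0, 0.4, -1.0, 1.0, (1000, 5), rng)
coordinates   = rng.uniform(0.0, 1.0, size=(1000, 2))
```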

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Lotito, Q.F.; Zanella, D.; Casari, P.
Realistic Aspects of Simulation Models for Fake News Epidemics over Social Networks. *Future Internet* **2021**, *13*, 76.
https://doi.org/10.3390/fi13030076
