Article Open Access

What makes audiences resilient to disinformation? Integrating micro, meso, and macro factors based on a systematic literature review

  • Jülide Kont

    Jülide Kont is a Ph.D. candidate and lecturer at the Hanze University of Applied Sciences and the University of Groningen. Her research focuses on disinformation from a cross-national and comparative perspective.

  • Wim Elving

    Wim J.L. Elving is a professor of applied sciences at EnTranCe, Centre of Expertise, Energy at Hanze University of Applied Sciences in Groningen. His research interests include CSR communication, branding, change and transitions, and sustainability communications.

  • Marcel Broersma

    Marcel Broersma is a professor of Media and Journalism Studies in The Centre for Media and Journalism Studies at the University of Groningen. His research interests include the interface between the digital transformation of journalism, social media, changing media use, and digital literacy and inclusion.

  • Çiğdem Bozdağ

    Çiğdem Bozdağ is an assistant professor at the Centre for Media and Journalism Studies at the University of Groningen. Her research interests include digital media use, digital literacy, digital inclusion, media education in schools, and media and migration.

Published/Copyright: May 16, 2024

Abstract

Despite increased attention since 2015, there is little consensus on why audiences believe or share disinformation. In our study, we propose a shift in analytical perspective by applying the concept of resilience. Through a systematic literature review (n = 95), we identify factors that have been linked to individuals’ resilience and vulnerability to disinformation thus far. Our analysis reveals twelve factors: thinking styles, political ideology, worldview and beliefs, pathologies, knowledge, emotions, (social) media use, demographics, perceived control, trust, culture, and environment. By applying the results to the socio-ecological model (SEM), we provide a comprehensive view on what constitutes resilience to disinformation, delineate between different levels of influence, and identify relevant gaps in research. Our conceptualization contributes to an under-theorized field, in which the term resilience is much used yet rarely sufficiently defined.

1 Introduction

If disinformation had PR agents, they would be delighted. After all, the topic has been continuously sparking attention, controversy, and action in the past decade. The discourse often uses strong rhetoric—blaming social media, and referring to democracies at stake, free speech endangered, an unfolding infodemic, or information wars (Alvarez-Galvez et al., 2021; Bojic et al., 2023; Miller and Vaccari, 2020). Disinformation is considered harmful, demanding prevention and mitigation, although the degree of harmfulness is the subject of an ongoing debate (Bennett and Livingston, 2018; Meese et al., 2020). Fueled by the perceived threat and its pivotal role during Brexit, two U.S. elections and the storming of the U.S. Capitol, the Covid-19 pandemic, and, most recently, the war in Ukraine, the disinformation debate remains continuously high on the agenda of the press, politics, and public (Corbu et al., 2023; Marwick and Lewis, 2017).

The scientific community is no exception, as continuously rising numbers of publications on the topic indicate (Humprecht et al., 2020). Popular themes are political and health-related disinformation, fact-checking, disinformation correction or interventions, and introspective works focused on terminology (Abu Arqoub et al., 2022; Janmohamed et al., 2021; Kapantai et al., 2021).

Hovering above it all is the question of why people believe or share disinformation. Focusing on the ever-changing political, psychological, technological, sociological, and even neurological drivers of disinformation, previous studies have found a multitude of chains of causes and effects. They point to a decline of trust in media and politics, low levels of media literacy, different cognitive styles, motivated reasoning, and the deceiving characteristics of disinformation (Bryanov and Vziatysheva, 2021; Klebba and Winter, 2023; Sindermann et al., 2020). In most cases, the focus lies on singular factors. However, the complexity of the issue at hand demands more encompassing approaches.

We argue that applying the concept of resilience to disinformation offers a pathway to a more comprehensive understanding. The concept is applied in many disciplines, yet in the context of disinformation, research on resilience is practically still in its infancy. Thus, we ask:

RQ: Which factors are connected to resilience and vulnerability to disinformation in previous research?

We conducted a systematic literature review to analyze the increasingly convoluted landscape of disinformation research. Our standardized, protocol-driven methodology and rigorous search for all relevant literature allow for a reliable overview of research results on the topic. We first identify factors that have been linked to individuals’ resilience and vulnerability to disinformation. Unlike other (systematic) reviews in the field, we do not focus on one type or content of disinformation and draw our insights from a multidisciplinary pool of sources, including but not limited to psychology, journalism, political science, and business studies. Our analysis goes beyond mapping the field by applying the results to the socio-ecological model (SEM). The SEM is a holistic framework that is used to explain behavior by considering influences from different levels, such as the interpersonal and institutional level (Ma et al., 2017). Our conceptual framework contributes to an under-theorized field of research and allows for an integrated rather than microscopic perspective, uncovering potential interdependencies of influences as well as identifying gaps in research.

2 Theoretical framework

The concept of resilience is often associated with psychology. However, given its broad meaning of empowering people against risks, it has been applied in many fields, such as economics, psychology, sports, and political sciences, to name only a few (Den Hartigh et al., 2022; Masten, 2011). As Bracke (2016, p. 57) puts it, “in precarious times, resilience is the new security.” As a result, definitions and ascribed meanings differ, depending on the context and scope of use. After all, resilience may relate to anything spanning from cells to persons, organizations, nations, and nature (Southwick et al., 2014).

Despite this heterogeneity, the fundamental idea of resilience remains the same across disciplines. For one, it is always linked to a dawning challenge or threat. Without this component, there is simply no need for resilience (Bracke, 2016). Once confronted with adversity, resilience refers to the ability to withstand, adapt and recover, mitigating potential negative effects (Masten et al., 1990; Sapienza and Masten, 2011). The term is also used to describe the process through which resources are harnessed and obstacles overcome, as well as the outcomes of coping (Liu et al., 2020). As these dynamic understandings demonstrate, resilience as a concept defies binary approaches. The fact that the state of resilience within a system or person is prone to change over contexts and time further exemplifies this (Panter-Brick, 2014; Southwick et al., 2014). All these aspects must be considered when conceptualizing resilience in a new context.

Similarly, disinformation comes in many forms, potentially manifesting as ad-revenue-driven clickbait, memes, manipulated visuals, decontextualized information, “alternative” facts, conspiracy theories, deepfakes, and professionally executed disinformation campaigns, including bots (Kapantai et al., 2021; Marwick, 2018). Previous research defined the term using criteria such as the degree of veracity of content, content format, and distributor’s intent (Bennett and Livingston, 2018; Wardle, 2018). The most widely used definition characterizes disinformation by an intention to mislead, to create (ad) revenue or division, or to push a (political) agenda (Nielsen and Graves 2017; Wardle and Derakhshan, 2017). However, intentionality of the distributor, apart from proving difficult to determine in retrospect, is of lesser relevance for this research as the focus lies on the receiver and factors influencing the process of discarding, internalizing, or sharing. Thus, all the above-named forms of disinformation are of relevance and included in our operationalization.

Resilience and vulnerability to disinformation

Only a few authors have applied the concept of resilience to disinformation (Roozenbeek et al., 2022). Hansen (2017), for example, differentiates between cognitive and physical resilience when describing means to counter information warfare. Cognitive resilience can be understood as the capability to process disinformation in a manner that prevents internalization, comparable to a cognitive “firewall.” Physical resilience aims at obstructing the distribution of disinformation so that it does not reach the user in the first place. Shadow banning and removing content from social media platforms belong to this category. These two types of resilience go hand in hand. The less physical resilience someone has, the more cognitive resilience is needed, and vice versa (Bjola and Papadakis, 2020). Humprecht and colleagues (2020, p. 497) view resilience in the context of disinformation from a more structural perspective, defining it as “a collective characteristic that transcends the individual level.” This is based on Hall and Lamont’s (2013) understanding of resilience as the capability of groups to overcome adversity. Here, the focus lies on social, political, and informational structures within a country, such as its media system or level of polarization, as crucial influences on levels of resilience (Humprecht, 2018).

Models in resilience research generally differentiate between (1) risks, such as challenges and adversity; (2) competence criteria; and (3) variables of influence, such as protective or inhibiting factors (Masten, 2011). These three components build the foundation for our definition of resilience to disinformation, which focuses on the individual. The adversity component, in this case, is easily identified: disinformation. The competence criteria refer to markers for positive adaptation, in our case, indicators of resilience to disinformation. The variables of influence will be identified through our systematic review of the literature.

We define resilience to disinformation as a capability that manifests in the process of encountering disinformation and results in either questioning or recognizing disinformation and consequently dismissing it. This process is influenced by internal and external factors as well as resources available to the individual. Dismissal can take many forms, including visible signs of objection, both on- and offline, or it can occur tacitly, when disinformation is recognized and, as a consequence, neither believed nor shared, though not actively rejected. As falsifying or determining the context of (dis-)information is not always possible, we view questioning (dis-)information, which results in non-internalizing and non-sharing (unless for verification purposes), as an indicator of resilience as well. Simply put, resilient persons are not deceived and neither internalize nor distribute encountered disinformation. Consequently, resilience protects the individual and their environment from the potentially harmful effects of disinformation.

Attempts to understand resilience will always lead to questions about vulnerability. After all, what sets the process of developing resilience in motion are different forms of threats, which lead to vulnerability (Bracke, 2016). Thus, if resilience exists on a continuum, vulnerability is at the opposite end. In disinformation research, the term vulnerability, just as resilience, is used frequently yet loosely, mostly without terminological or conceptual discussion. There seems to be an unspoken understanding of what vulnerability to disinformation entails, as most scholars refer to it in an almost identical manner: not being able to discern, accepting and believing dis- or misinformation, fake news, rumors, or conspiracy theories. Another popular term is susceptibility, which is often used interchangeably with vulnerability (Nisbet and Kamenchuk, 2021; Pennycook and Rand, 2019; Traberg and van der Linden, 2022).

In our understanding, vulnerability to disinformation leads to an implicit or explicit acceptance of disinformation. Like dismissal, acceptance can take many forms, such as sharing false information due to a lack of deliberation, being deceived by it, or internalizing it subconsciously. Thus, vulnerable or susceptible individuals are exposed to potentially harmful consequences of disinformation.

The socio-ecological model

Resilience does not develop in isolation. It emerges through and is shaped by a combination of external and internal factors, events, and circumstances (Garcia-Dia et al., 2013). Therefore, to truly grasp resilience to disinformation, it is necessary to consider the influence of technological, political, social, cultural, economic, legislative, or educational environmental factors (Liu et al., 2017; Masten, 2011). We therefore introduce a framework that allows for delineating the different layers and connected factors of influence. Originating from health sciences, the socio-ecological model (SEM, see Figure 1) is used to explain individuals’ behavior by examining a variety of influences, ranging from micro- to macro-level factors (Bronfenbrenner, 1979; Ma et al., 2017; McLeroy et al., 1988).

Figure 1: The socio-ecological model (adapted from Ma et al., 2017).

Most models applied in disinformation research focus on the individual and on cognition, neglecting the environments that shape both. The SEM highlights the importance of contextual understanding to explain behavior and acknowledges that effective interventions must consider and target individuals as well as their environment (Robinson, 2008). It emphasizes the interplay rather than the influence of single factors when studying a subject (Upreti et al., 2021). This viewpoint could explain why, for instance, campaigns against disinformation that focus on transferring knowledge have limited effects. Previous research on health campaigns, for example, has shown that education needs a supportive environment to bear fruit (Sallis et al., 2015). The SEM’s ability to depict such complexities suits our multidisciplinary and comprehensive approach.

3 Method

To ensure transparent and replicable research, we conducted our literature review according to the PRISMA guidelines (PRISMA, 2021), the most adopted guideline for systematic reviews (Batten and Brackett, 2022). We started with a detailed review protocol and pre-test phase, which included an evaluation of the selected search strategy, terms, and databases by three researchers and two librarians. The data was then collected from three databases: Web of Science, due to its large coverage of interdisciplinary journals, Communication and Mass Media Complete, and APA PsycInfo as the leading databases for the two most important fields of disinformation research. Google Scholar was excluded due to its non-transparent algorithm, which impedes replication studies. The search strategy was tailored to each database regarding aspects such as truncations, wildcards, and Boolean operators. We searched for the following terms: (disinformation OR misinformation OR “fake news” OR “conspiracy theor*”) AND (vulnerab* OR resilien* OR susceptib* OR belie* OR *trust* OR shar* OR decepti* OR deceiv* OR endors*). The search terms are based on our definition of disinformation and resilience, as outlined previously, and include synonyms of key terms. We queried articles published from 2011 onwards, as the topic mainly gained traction within the past decade (Kapantai et al., 2021). Figure 2 shows the increase in publications mentioning disinformation during this period, using Web of Science as an example.

Figure 2: Results for “disinformation” on Web of Science, 1999–2023.

To be included, articles needed to be written in English, published in peer-reviewed journals from 2011 to 2021, and present qualitative or quantitative empirical studies. Articles dedicated to research instrument development were left out due to a lack of explanatory value for the research question. Research related to just one phenomenon, such as misinformation in general, or related topics, such as misinformation correction, was excluded. To ground our framework in empirical studies, we excluded non-empirical work. Studies relying on student or convenience samples were excluded due to a lack of representativeness, as were works that did not disclose their sampling method.

The data collection was conducted in mid-December 2021 and yielded a total of n = 1586 results. After five deduplication rounds, a title and abstract screening (n = 1451 articles) was performed independently by two researchers for enhanced reliability using the software Rayyan. After the screening, full texts for n = 208 articles were retrieved and assessed for eligibility, out of which n = 94 met the criteria and were included. An updated search in September 2022 included one additional article, leading to a total sample size of n = 95. Out of the excluded works, most either did not meet the sample criteria or did not address the topic. Throughout the whole selection process, each exclusion decision was documented, and discrepancies were jointly resolved.

Figure 3: PRISMA Flow Diagram – study selection.

Data were extracted using a standardized coding protocol, which was pilot-tested and peer-reviewed by three researchers. The data extraction sheet (see supplementary material) comprised 13 categories encompassing bibliometric information and empirical data. It was developed by synthesizing standard coding categories for systematic reviews, such as author or study outcomes, with tailored categories for more fine-grained analysis, such as research instruments and key measures. To draw insights from the rich dataset, we conducted a qualitative thematic analysis and relied on data visualization tools, such as word clouds, to map regions or keywords. After grouping the data from within our categories, we used an inductive approach, relying on first open and then thematic coding to identify themes and systematically document research results. We decided against a traditional quantitative meta-analysis of results for two main reasons. First, the 95 analyzed studies exhibited large differences in their operationalization of concepts, sample origins, sizes, and levels of representativeness. Second, many studies relied on country-specific and thematically diverse disinformation stimuli, ranging from political false news to AIDS conspiracy theories, further impeding comparison.

4 Results

To fully comprehend and connect study results, the following section gives an overview of the sample by examining the research fields and regions that investigate the topic, popular themes, and applied theories and methods.

Our geospatial analysis confirms the findings of previous studies, identifying the United States as the main source of research, followed by Europe and Australia (Abu Arqoub et al., 2022). Contributing to the U.S.-centeredness of the debate are a considerable number of studies located outside of North America that nevertheless rely on American samples. We found publications from most EU member countries and increased interest from Eastern European countries. Clearly underrepresented regions, on the other hand, are South America, Africa, and large parts of Asia, with Singapore and Malaysia as exceptions.

The topic is often approached from a psychological perspective, with scholars and journals from the field accounting for more than half of all publications. The other half consists of communication, political science, business, and journalism studies. Three main areas of interest emerge from the reviewed literature. A total of 51 % of the articles in our sample investigate belief in conspiracy theories, around 27 % focus on political disinformation, and 19 % revolve around Covid-19 disinformation, including related conspiracy theories. Across all disciplines, the topic is commonly approached by investigating predictors of disinformation belief and sharing.

Figure 4: Factors connected to resilience and vulnerability to disinformation.

Only one third of all studies within our sample use established pre-existing theoretical constructs as the basis for their empirical research. Within this group, theories from cognitive psychology prevail. These mainly rely upon the dual process theory, fluency, motivated reasoning, and cognitive dissonance and share one basic assumption: that disinformation is believed, shared, or rejected as a result of cognitive processes. We did not categorize studies as theory-based when they merely define predictors of disinformation belief and argue for their influence based on prior literature.

A closer look at study designs reveals a homogenous picture, as all except for one study in our sample are of quantitative nature, one third of them using experimental designs. In 75 % of cases, the samples are recruited from crowdsourcing platforms such as Amazon Mechanical Turk. Sample sizes differ, ranging from 100 to 18,000 participants.

Having gained an understanding of the research field, we move on to our main research question, investigating factors that are connected to individuals’ vulnerability or resilience to disinformation. As outlined in the methodology section, direct comparisons of study results are impeded by large differences within our sample. Instead, this section lists and critically examines the factors that have been researched in connection to resilience and vulnerability to disinformation. A detailed account of corresponding studies per factor of influence and study designs can be found in the supplementary material.

Micro level

Deliberation and cognitive styles. Research on deliberation and cognitive styles traces disinformation beliefs back to thinking processes, for example, by applying the dual process theory. The assumption here is that individuals with a more intuitive thinking style are more prone to believe and share disinformation compared to more analytical or reflective thinkers. Indeed, almost all results confirm this, consistently linking lower scores on the Cognitive Reflection Test (CRT), which are interpreted as reliance on intuition instead of deliberation, to lower resilience to disinformation (e. g., Marques et al., 2022; Nurse et al., 2022). However, regarding disinformation sharing, several studies report no correlation with CRT scores, limiting the explanatory potential of cognitive styles (e. g., Buchanan and Kempley, 2021; Nurse et al., 2022). But the field of cognition has more to offer to understand resilience to disinformation, such as cognitive biases. One example is the tendency to accept seemingly meaningful claims uncritically, also referred to as “bullshit receptivity,” which has been repeatedly linked to disinformation beliefs (e. g., Hart and Graether, 2018; Pennycook and Rand, 2020). Similarly, illusory pattern perception, that is, seeing patterns in random formations, has been linked to conspiracy theory belief (e. g., van Prooijen et al., 2018).

Pathology. In conspiracy theory studies, pathological traits, such as schizophrenia, are commonly hypothesized as predictors for conspiracy beliefs. Fifty-seven percent of research results from the 12 studies investigating pathologies within our sample confirm these assumptions, linking schizotypy and its lower order facets, such as odd beliefs and magical thinking, delusion proneness, and paranoia, to conspiracy theory belief (e. g., Barron et al., 2018; Georgiou et al., 2019). Mediating factors such as different information processing or levels of self-certainty hint towards possible explanations for these effects. As is to be expected in the general population, individuals with these psychopathological traits account for small minorities within the samples, significantly limiting the explanatory value of the results for most parts of society.

Political ideology. The fact that political ideology is examined in most studies, independent of the overall topic, points to the shared assumption of scholars that it plays an important role for disinformation beliefs. Studies control for alignment of disinformation beliefs along ideological party lines and differentiate between liberal and conservative participants. Most of them indeed find a correlation between participants’ political ideology and politically congruent disinformation, indicating motivated reasoning (e. g., Anthony and Moulding, 2019; Lawson and Kakkar, 2021). There is, however, another possible explanation for these findings, since several studies point to source credibility as a decisive factor in truth assessments. More specifically, information from politically congruent sources, such as, for example, the New York Times for liberals, was rated as more accurate regardless of its actual veracity (Traberg and van der Linden, 2022). This points to partisan-motivated processing of information in general, including disinformation, which in turn is related to the individual’s degree of emotional investment and identification with the respective party. Put simply, political ideology matters if it causes motivated reasoning or influences deliberation.

Worldview, beliefs, and personality. The included studies survey various factors relating to attitudes, worldviews, and pre-existing beliefs, ranging from traditionalism, anti-intellectualism, and general suspicions to epistemic beliefs and religiosity (e. g., Garrett and Weeks, 2017). Comparisons and generalizations in this domain are futile, as the influence of individual factors is highly context- and content-dependent. For example, attitudes towards vaccines might prove to be a reliable predictor of Covid-19 disinformation beliefs in some cases but completely unrelated to the endorsement of conspiracy theories about 9/11. The only factor that has been consistently linked to conspiracy beliefs is general conspiracist worldviews, an insight that bears little surprise (e. g., Šrol et al., 2021). Religious beliefs, on the other hand, are generally found to have no correlation with disinformation endorsement, as studies find no significant difference between believers and non-believers (e. g., Jasinskaja-Lahti and Jetten, 2019). Personality traits such as the desire to cause chaos, overconfidence, and avoidance coping are found to be linked to higher disinformation belief and sharing, whereas high conscientiousness and discussion heterogeneity preference are linked to lower vulnerability (e. g., Marchlewska et al., 2019; Su, 2021). However, there are no replication or comparable studies to corroborate these findings. It is safe to assume that there is not one personality profile of a person at risk, but rather individual personality traits that have the potential to reinforce tendencies in interaction with external factors, such as the socio-political environment (as seen during Covid-19).

Knowledge. Fourteen studies within our sample examine the connection between pre-existing knowledge and disinformation belief. Of the many forms of knowledge, their focus lies on political, scientific, and health or Covid-19 knowledge, as well as digital and media literacy (e. g., Vegetti and Mancosu, 2020; Zimmermann and Kohring, 2020). Results on the latter differ greatly, in line with previous research (Jones-Jang, 2021; Marwick, 2018). Regarding political disinformation, individuals with higher political knowledge performed better at discerning between real and false news (Rossini et al., 2021). Within the remaining studies, knowledge is assessed with few variables only, partly relying on self-assessment and thus limiting our capacity to draw valid conclusions.

Emotions. Of the studies in our sample, 15 % investigate the influence of emotions on disinformation beliefs or sharing. Most scholars connect conventionally labeled “negative” emotions to vulnerability, investigating anger, anxiety, stress, or feelings of exclusion, with varying results. The findings around anxiety are a good example of these discrepancies. In the context of political disinformation, anxiety has been shown to decrease partisan processing of (dis-)information (Weeks, 2015). At the same time, health anxiety has been found to increase message importance and thus sharing intentions of health information in general, including disinformation (Oh and Lee, 2019). Thus, context matters, as do interacting factors, such as emotional coping mechanisms, which in turn are shaped by previous personal experiences. The only exception to this ambiguity are studies investigating the effects of (societal) threat and exclusion on conspiracy theory belief. Here, all results link higher perceived threat and feelings of exclusion to higher conspiracy theory belief (e. g., Jolley et al., 2018). Only a few studies investigate the influence of positive emotions, such as entertainment seeking, on susceptibility to disinformation (e. g., van Prooijen et al., 2022).

Media use and exposure to disinformation. Social media use is quickly blamed for its presumed negative influence on disinformation belief and sharing, even though research results on this are highly mixed. As disinformation is commonly distributed via social media, presence on these platforms naturally increases the risk of encountering it. By the same logic, highly active social media users who frequently share information on their channels have a higher risk of (accidentally) posting false information (Buchanan and Kempley, 2021). Adding to the potential danger are algorithms that present the user with similar information, activating cognitive processes of fluency, where the repetition of information, factual or false, increases believability. Out of the 12 studies that investigate social media use, 50 % find a positive correlation between social media use and disinformation belief or sharing (e. g., Bae, 2020). However, the effects are often explained through mediating factors such as worry or conspiratorial thinking, pointing to the interplay with emotions and beliefs (e. g., Su, 2021). We interpret these findings as evidence of social media’s potential to weaken resilience to disinformation due to its amplifying nature. The use of so-called “alternative media,” on the other hand, has been found to decrease resilience to disinformation significantly (Humprecht et al., 2021).

Demographics. When identifying vulnerable audiences, relying on factors such as education, income, or age is tempting and probably pointless. According to our analysis, demographic variables have little to no explanatory value for understanding resilience to disinformation. Only three studies within our sample report correlations with age or level of education. However, in all cases, these connections are explained by more influential mediating factors such as knowledge or epistemic beliefs (e. g., Douglas et al., 2016).

Perceived control. A common attempt within media and public discourse to make sense of conspiracy theory beliefs is by referring to the concept of personal control. In this context, conspiracy theories are viewed as a tool to regain a sense of control in an increasingly uncertain and uncontrollable world. Indeed, all five studies investigating the role of control find a correlation between perceived lack of control, feelings of powerlessness, and conspiracy theory belief (e. g., Hart and Graether, 2018). We did not find any results on the role of control for other forms of disinformation.

Meso level

Trust and social environment. Declining trust in Western societies is a popular argument to explain the increased proliferation of disinformation. To avoid inaccurate generalizations, it is essential to distinguish between different forms of trust. The studies within our sample investigate institutional trust, political trust, trust in news sources, interpersonal trust, trust in mainstream media, trust in science, and trust in food safety, with most attention paid to the first two. Most of the evidence confirms the initial assumption that decreased trust is linked to disinformation belief or sharing, although effect sizes differ (e. g., Hollander, 2018). The mechanisms behind interpersonal and (news) source trust prove more intricate and produce less homogeneous results. High interpersonal trust, such as trust towards a specific person, can increase susceptibility to a specific conspiracy theory (Green and Douglas, 2018). At the opposite end of the scale, low social trust toward strangers has also been found to increase susceptibility (Hopp et al., 2020). The influence of the social environment on resilience to disinformation receives little attention but bears useful insights: Social media experiments found a significant influence of user comments on readers’ evaluation of (dis-)information (Anspach and Carlson, 2020; Colliander, 2019). These findings illustrate how meso-level influences interact with, or can even overrule, intrapersonal factors such as partisan beliefs.

Macro level

Culture and collective narcissism. Only five studies explore the potential influence of culture on disinformation belief, and even fewer produce valid results. The exception is collective narcissism, which is presumed to increase vulnerability to outgroup conspiracy theories. As a form of ingroup identity, it is characterized by a perceived greatness of one’s own group and increased hostility towards others. The results here are consistent, linking collective narcissism to ideologically aligned conspiracy theories (e. g., Cichocka et al., 2016; Marchlewska et al., 2019).

Socio-political and informational environment. In general, macro-level factors receive much less scholarly attention within our sample. Two cross-national studies focus on the influence of structural factors and find high levels of polarization and populism to be connected to higher vulnerability to disinformation. Trust in and use of mainstream and public service media, on the other hand, were not connected to levels of resilience (Humprecht et al., 2020; Humprecht et al., 2021).

5 Conclusion

Our study presents a conceptual framework on resilience to disinformation, provides an overview of the research field, and identifies key factors associated with resilience and vulnerability to disinformation through a systematic review. Our findings enable us to identify gaps and provide direction for future research.

The studies within our sample mainly originate from North America or Western Europe, thereby inherently reflecting the political context, priorities, and implicit assumptions prevailing within these regions. This could partly be an outcome of our methodological choices; the journals indexed in our chosen databases, for example, might be less accessible or common in other regions. Future research explicitly focused on South American, African, Asian, and Middle Eastern perspectives is needed to complement this.

Another indicator of homogeneity within the research field is the choice of methodology and sampling methods. The overwhelming majority of studies adopt a quantitative approach and mainly sample from crowdsourcing platforms, which raises questions regarding representativeness. It also points to a lack of insight into the underlying motivations and circumstances that lead to disinformation endorsement. Mixed-methods or qualitative approaches could lend meaning to the correlates discovered thus far and uncover more latent factors and processes. Qualitative interviews, for instance, would allow researchers to explore processes of resilience inductively and provide more in-depth insight into how individuals interpret and navigate (dis)information.

In our framework and analysis, we differentiate between micro-, meso-, and macro-level factors influencing resilience to disinformation. At the micro level, political ideology, cognitive processes, and pathologies are considered the prime drivers of vulnerability to disinformation. As a substantial part of disinformation is of a political nature, it is unsurprising to find related correlates embedded in empirical research. The prevalence of cognitive and pathological measures can partly be attributed to the availability of existing research instruments. However, if cognitive abilities and mental illnesses are the first factors to be explored in relation to vulnerability to disinformation, this also points to implicit biases. We are not the first to notice this, and our evidence emphasizes the importance of being conscious of implicit value judgments and of understanding rather than labeling subjects (Harambam, 2020).

Findings that reflective thinking has a positive influence on resilience to disinformation are promising but come with limitations. We find inconsistencies in the understanding of what the Cognitive Reflection Test (CRT), the most widely used instrument for measuring cognitive styles, intends to measure, ranging from “analytical thinking” to “cognitive reflection” or “cognitive ability” (Buchanan and Kempley, 2021; Marques et al., 2022; Tandoc et al., 2021). Some authors acknowledge its constraints, pointing out that it is unclear whether the CRT assesses analytic thinking or simply reflects numeracy or general cognitive ability (Nurse et al., 2022). These inconsistencies limit the overall validity and generalizability of results. In addition, improved truth discernment resulting from deliberation should not be equated with resilience to disinformation. After all, research shows that higher CRT scores do not necessarily lead to better discernment when sharing disinformation (Nurse et al., 2022). Individual abilities to discern disinformation could be overruled, for example, when content is shared solely for entertainment, for connecting with peers, or for expressing belonging to a certain group. Based on our framework, we conclude that meso-level factors, for example those relating to the social environment, might help explain the gap between abilities and behavior and should be explored in future research.

Overall, the studies within our sample pay little attention to meso- and macro-level influences on resilience to disinformation, leaving questions regarding the role of social, educational, or political environments unanswered. This points to a significant gap, as experiences and environments undeniably shape the development of resilience and thus need to be considered. Our results further emphasize the importance of context. Believing Covid-19 disinformation during the pandemic might be motivated by different factors than believing 9/11 conspiracy theories. Moreover, while different forms of disinformation can be believed for the same reasons, our evidence shows that the type of content, in addition to the context, matters. Lastly, our research shows that concepts related to disinformation are undertheorized, which is exemplified by the fact that only one third of studies use established theories and concepts. Our framework aims to contribute to the conceptual groundwork in the field.

Our analysis does not allow a quantitative comparison of study results or effect sizes within our sample, which limits our ability to make statements on the exact influence of different factors on resilience to disinformation. Additionally, since research based on convenience and student samples was not included in the systematic review, our list of factors connected to resilience to disinformation is likely not exhaustive.

As a multi-layered issue, disinformation research benefits from approaches that allow for complexity. By integrating different levels, our conceptual framework of resilience to disinformation provides a meta-level perspective, which can aid researchers in identifying mediating factors, explaining unexpected effects, and contextualizing their results. It also provides a possible roadmap for more extensive, multi-factor research, which would greatly contribute to a research field that currently focuses mainly on the influence of single, micro-level factors. Research results show that influencing factors act additively, further emphasizing the importance of more comprehensive approaches. Some scholars already acknowledge this and attempt to provide a more holistic view of disinformation (Manuvie and van Dorssen, 2021; Chadwick and Stanyer, 2022). In the end, grasping resilience to disinformation is like assembling a complex mosaic, in which both the individual pieces and the bigger picture need to be kept in view simultaneously.


Acknowledgments

We want to thank Joost Driesens from the University of Groningen and Marleen Poot from the Hanze University of Applied Sciences for their support and guidance in developing the search strategy for our systematic review. Their expertise helped to build a solid foundation for the whole research project.

Funding acknowledgment

The author(s) received no financial support for the research, authorship, and/or publication of this article.

Declaration of conflicting interests

The author(s) declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

Abu Arqoub, O., Abdulateef Elega, A., Efe Özad, B., Dwikat, H., & Adedamola Oloyede, F. (2022). Mapping the scholarship of fake news research: A systematic review. Journalism Practice, 16(1), 56–86. https://doi.org/10.1080/17512786.2020.1805791

Alvarez-Galvez, J., Suarez-Lledo, V., & Rojas-Garcia, A. (2021). Determinants of infodemics during disease outbreaks: A systematic review. Frontiers in Public Health, 9, 603603. https://doi.org/10.3389/fpubh.2021.603603

Anderson, C. W. (2021). Fake news is not a virus: On platforms and their effects. Communication Theory, 31(1), 42–61. https://doi.org/10.1093/ct/qtaa008

Anspach, N. M., & Carlson, T. N. (2020). What to believe? Social media commentary and belief in misinformation. Political Behavior, 42(3), 697–718. https://doi.org/10.1007/s11109-018-9515-z

Anthony, A., & Moulding, R. (2019). Breaking the news: Belief in fake news and conspiracist beliefs. Australian Journal of Psychology, 71(2), 154–162. https://doi.org/10.1111/ajpy.12233

Bae, S. Y. (2020). The social mediation of political rumors: Examining the dynamics in social media and belief in political rumors. Journalism, 21(10), 1522–1538. https://doi.org/10.1177/1464884917722657

Barron, D., Furnham, A., Weis, L., Morgan, K. D., Towell, T., & Swami, V. (2018). The relationship between schizotypal facets and conspiracist beliefs via cognitive processes. Psychiatry Research, 259, 15–20. https://doi.org/10.1016/j.psychres.2017.10.001

Batten, J., & Brackett, A. (2022). Ensuring rigor in systematic reviews: Part 6, reporting guidelines. Heart & Lung: The Journal of Cardiopulmonary and Acute Care, 52, 22–25. https://doi.org/10.1016/j.hrtlng.2021.11.002

Bennett, W. L., & Livingston, S. (2018). The disinformation order: Disruptive communication and the decline of democratic institutions. European Journal of Communication, 33(2), 122–139. https://doi.org/10.1177/0267323118760317

Bjola, C., & Papadakis, K. (2020). Digital propaganda, counterpublics and the disruption of the public sphere: The Finnish approach to building digital resilience. Cambridge Review of International Affairs, 33(5), 638–666. https://doi.org/10.1080/09557571.2019.1704221

Bojic, L., Nikolic, N., & Tucakovic, L. (2023). State vs. anti-vaxxers: Analysis of Covid-19 echo chambers in Serbia. Communications: The European Journal of Communication Research, 48(2), 273–291. https://doi.org/10.1515/commun-2021-0104

Bracke, S. (2016). Bouncing back: Vulnerability and resistance in times of resilience. In J. Butler, Z. Gambetti, & L. Sabsay (Eds.), Vulnerability in resistance. Duke University Press. https://doi.org/10.1215/9780822373490-004

Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design. Harvard University Press. https://doi.org/10.4159/9780674028845

Bryanov, K., & Vziatysheva, V. (2021). Determinants of individuals’ belief in fake news: A scoping review. PLoS One, 16(6), e0253717. https://doi.org/10.1371/journal.pone.0253717

Buchanan, T., & Kempley, J. (2021). Individual differences in sharing false political information on social media: Direct and indirect effects of cognitive-perceptual schizotypy and psychopathy. Personality and Individual Differences, 182, 1–11. https://doi.org/10.1016/j.paid.2021.111071

Chadwick, A., & Stanyer, J. (2022). Deception as a bridging concept in the study of disinformation, misinformation, and misperceptions: Toward a holistic framework. Communication Theory, 32(1), 1–24. https://doi.org/10.1093/ct/qtab019

Cichocka, A., Marchlewska, M., Golec de Zavala, A., & Olechowski, M. (2016). ‘They will not control us’: Ingroup positivity and belief in intergroup conspiracies. British Journal of Psychology, 107(3), 556–576. https://doi.org/10.1111/bjop.12158

Colliander, J. (2019). ‘This is fake news’: Investigating the role of conformity to other users’ views when commenting on and spreading disinformation in social media. Computers in Human Behavior, 97, 202–215. https://doi.org/10.1016/j.chb.2019.03.032

Corbu, N., Buturoiu, R., Frunzaru, V., & Guiu, G. (2023). Vaccine-related conspiracy and counter-conspiracy narratives. Silencing effects. Communications: The European Journal of Communication Research, 0(0), 1–22. https://doi.org/10.1515/commun-2022-0022

Den Hartigh, R. J. R., Meerhoff, L. R. A., Van Yperen, N. W., Neumann, N. D., Brauers, J. J., Frencken, W. G. P., Emerencia, A., Hill, Y., Platvoet, S., Atzmueller, M., Lemmink, K. A. P. M., & Brink, M. S. (2022). Resilience in sports: A multidisciplinary, dynamic, personalized perspective. International Review of Sport and Exercise Psychology, 0(0), 1–23. https://doi.org/10.1080/1750984X.2022.2039749

Douglas, K. M., Sutton, R. M., Callan, M. J., Dawtry, R. J., & Harvey, A. J. (2016). Someone is pulling the strings: Hypersensitive agency detection and belief in conspiracy theories. Thinking & Reasoning, 22(1), 57–77. https://doi.org/10.1080/13546783.2015.1051586

Garcia-Dia, M. J., DiNapoli, J. M., Garcia-Ona, L., Jakubowski, R., & O’Flaherty, D. (2013). Concept analysis: Resilience. Archives of Psychiatric Nursing, 27(6), 264–270. https://doi.org/10.1016/j.apnu.2013.07.003

Garrett, R. K., & Weeks, B. E. (2017). Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation. PLoS One, 12(9). https://doi.org/10.1371/journal.pone.0184733

Georgiou, N., Delfabbro, P., & Balzan, R. (2019). Conspiracy beliefs in the general population: The importance of psychopathology, cognitive style and educational attainment. Personality and Individual Differences, 151, 1–7. https://doi.org/10.1016/j.paid.2019.109521

Green, R., & Douglas, K. M. (2018). Anxious attachment and belief in conspiracy theories. Personality and Individual Differences, 125, 30–37. https://doi.org/10.1016/j.paid.2017.12.023

Hall, P. A., & Lamont, M. (Eds.). (2013). Social resilience in the neoliberal era. Cambridge University Press. https://doi.org/10.1017/CBO9781139542425

Hansen, F. S. (2017). Russian hybrid warfare: A study of disinformation (No. 2017: 06). DIIS Report. http://pure.diis.dk/ws/files/950041/DIIS_RP_2017_6_web.pdf

Harambam, J. (2020). Contemporary conspiracy culture: Truth and knowledge in an era of epistemic instability. Routledge. https://doi.org/10.4324/9780429327605

Hart, J., & Graether, M. (2018). Something’s going on here: Psychological predictors of belief in conspiracy theories. Journal of Individual Differences, 39(4), 229–237. https://doi.org/10.1027/1614-0001/a000268

Hollander, B. A. (2018). Partisanship, individual differences, and news media exposure as predictors of conspiracy beliefs. Journalism and Mass Communication Quarterly, 95(3), 691–713. https://doi.org/10.1177/1077699017728919

Hopp, T., Ferrucci, P., & Vargo, C. J. (2020). Why do people share ideologically extreme, false, and misleading content on social media? A self-report and trace data-based analysis of countermedia content dissemination on Facebook and Twitter. Human Communication Research, 46(4), 357–384. https://doi.org/10.1093/hcr/hqz022

Humprecht, E. (2018). Where ‘fake news’ flourishes: A comparison across four western democracies. Information, Communication & Society, 22(13), 1973–1988. https://doi.org/10.1080/1369118X.2018.1474241

Humprecht, E., Esser, F., & Aelst, P. V. (2020). Resilience to online disinformation: A framework for cross-national comparative research. The International Journal of Press/Politics, 25(3), 493–516. https://doi.org/10.1177/1940161219900126

Humprecht, E., Esser, F., Aelst, P. V., Staender, A., & Morosoli, S. (2023). The sharing of disinformation in cross-national comparison: Analyzing patterns of resilience. Information, Communication & Society, 26(7), 1342–1362. https://doi.org/10.1080/1369118X.2021.2006744

Janmohamed, K., Walter, N., Nyhan, K., Khoshnood, K., Tucker, J. D., Sangngam, N., Altice, F. L., Ding, Q., Wong, A., Schwitzky, Z. M., Bauch, C. T., De Choudhury, M., Papakyriakopoulos, O., & Kumar, N. (2021). Interventions to mitigate Covid-19 misinformation: A systematic review and meta-analysis. Journal of Health Communication, 26(12), 846–857. https://doi.org/10.1080/10810730.2021.2021460

Jasinskaja‐Lahti, I., & Jetten, J. (2019). Unpacking the relationship between religiosity and conspiracy beliefs in Australia. British Journal of Social Psychology, 58(4), 938–954. https://doi.org/10.1111/bjso.12314

Jolley, D., Douglas, K. M., & Sutton, R. M. (2018). Blaming a few bad apples to save a threatened barrel: The system‐justifying function of conspiracy theories. Political Psychology, 39(2), 465–478. https://doi.org/10.1111/pops.12404

Jones-Jang, S. M., Mortensen, T., & Liu, J. (2021). Does media literacy help identification of fake news? Information literacy helps, but other literacies don’t. American Behavioral Scientist, 65(2), 371–388. https://doi.org/10.1177/0002764219869406

Kapantai, E., Christopoulou, A., Berberidis, C., & Peristeras, V. (2021). A systematic literature review on disinformation: Toward a unified taxonomical framework. New Media & Society, 23(5), 1301–1326. https://doi.org/10.1177/1461444820959296

Klebba, L., & Winter, S. (2023). Crisis alert: (Dis)information selection and sharing in the COVID-19 pandemic. Communications: The European Journal of Communication Research, 0(0), 1–21. https://doi.org/10.1515/commun-2022-0020

Lawson, M. A., & Kakkar, H. (2021). Of pandemics, politics, and personality: The role of conscientiousness and political ideology in the sharing of fake news. Journal of Experimental Psychology: General, 151(5), 1154–1177. https://doi.org/10.1037/xge0001120

Liu, J. J. W., Reed, M., & Fung, K. P. (2020). Advancements to the multi-system model of resilience: Updates from empirical evidence. Heliyon, 6(9), 1–7. https://doi.org/10.1016/j.heliyon.2020.e04831

Liu, J. J. W., Reed, M., & Girard, T. A. (2017). Advancing resilience: An integrative, multi-system model of resilience. Personality and Individual Differences, 111, 111–118. https://doi.org/10.1016/j.paid.2017.02.007

Ma, P. H. X., Chan, Z. C. Y., & Loke, A. Y. (2017). The socio-ecological model approach to understanding barriers and facilitators to the accessing of health services by sex workers: A systematic review. AIDS and Behavior, 21(8), 2412–2438. https://doi.org/10.1007/s10461-017-1818-2

Manuvie, R., & van Dorssen, M. (2021). Digital wildfire of disinformation in the Netherlands. The London Story.

Marchlewska, M., Cichocka, A., Łozowski, F., Górska, P., & Winiewski, M. (2019). In search of an imaginary enemy: Catholic collective narcissism and the endorsement of gender conspiracy beliefs. The Journal of Social Psychology, 159(6), 766–779. https://doi.org/10.1080/00224545.2019.1586637

Marques, M. D., Ling, M., Williams, M. N., Kerr, J. R., & McLennan, J. (2022). Australasian public awareness and belief in conspiracy theories: Motivational correlates. Political Psychology, 43(1), 177–198. https://doi.org/10.1111/pops.12746

Marwick, A. E. (2018). Why do people share fake news? A sociotechnical model of media effects. Georgetown Law Technology Review, 2(2), 474–512.

Marwick, A. E., & Lewis, B. (2017). Media manipulation and disinformation online. Data & Society Research Institute. https://datasociety.net/library/media-manipulation-and-disinfo-online/

Masten, A. S., Best, K. M., & Garmezy, N. (1990). Resilience and development: Contributions from the study of children who overcome adversity. Development and Psychopathology, 2(4), 425–444. https://doi.org/10.1017/S0954579400005812

Masten, A. S. (2011). Resilience in children threatened by extreme adversity: Frameworks for research, practice, and translational synergy. Development and Psychopathology, 23(2), 493–506. https://doi.org/10.1017/S0954579411000198

McLeroy, K. R., Bibeau, D., Steckler, A., & Glanz, K. (1988). An ecological perspective on health promotion programs. Health Education Quarterly, 15(4), 351–377. https://doi.org/10.1177/109019818801500401

Meese, J., Frith, J., & Wilken, R. (2020). Covid-19, 5G conspiracies and infrastructural futures. Media International Australia, 177(1), 30–46. https://doi.org/10.1177/1329878X20952165

Miller, M. L., & Vaccari, C. (2020). Digital threats to democracy: Comparative lessons and possible remedies. International Journal of Press/Politics, 25(3), 333–356. https://doi.org/10.1177/1940161220922323

Nielsen, R., & Graves, L. (2017). “News you don’t believe”: Audience perspectives on fake news. Reuters Institute for the Study of Journalism. https://ora.ox.ac.uk/objects/uuid:6eff4d14-bc72-404d-b78a-4c2573459ab8

Nisbet, E. C., & Kamenchuk, O. (2021). Russian news media, digital media, informational learned helplessness, and belief in Covid-19 misinformation. International Journal of Public Opinion Research, 33(3), 571–590. https://doi.org/10.1093/ijpor/edab011

Nurse, M. S., Ross, R. M., Isler, O., & Van Rooy, D. (2022). Analytic thinking predicts accuracy ratings and willingness to share Covid-19 misinformation in Australia. Memory & Cognition, 50(2), 1–10. https://doi.org/10.3758/s13421-021-01219-5

Oh, H. J., & Lee, H. (2019). When do people verify and share health rumors on social media? The effects of message importance, health anxiety, and health literacy. Journal of Health Communication, 24(11), 837–847. https://doi.org/10.1080/10810730.2019.1677824

Panter-Brick, C. (2014). Health, risk, and resilience: Interdisciplinary concepts and applications. Annual Review of Anthropology, 43(1), 431–448. https://doi.org/10.1146/annurev-anthro-102313-025944

Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188, 39–50. https://doi.org/10.1016/j.cognition.2018.06.011

Pennycook, G., & Rand, D. G. (2020). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality, 88(2), 185–200. https://doi.org/10.1111/jopy.12476

PRISMA. (2021, November 8). PRISMA Statement. Retrieved November 8, 2021 from http://www.prisma-statement.org/PRISMAStatement/

Robinson, T. (2008). Applying the socio-ecological model to improving fruit and vegetable intake among low-income African Americans. Journal of Community Health, 33(6), 395–406. https://doi.org/10.1007/s10900-008-9109-5

Roozenbeek, J., van der Linden, S., Goldberg, B., Rathje, S., & Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Science Advances, 8(34), 1–11. https://doi.org/10.1126/sciadv.abo6254

Rossini, P., Stromer-Galley, J., Baptista, E. A., & Veiga de Oliveira, V. (2021). Dysfunctional information sharing on WhatsApp and Facebook: The role of political talk, cross-cutting exposure and social corrections. New Media & Society, 23(8), 2430–2451. https://doi.org/10.1177/1461444820928059

Sallis, J. F., & Owen, N. (2015). Ecological models of health behavior. In Health behavior: Theory, research, and practice (5th ed., pp. 43–64). Jossey-Bass.

Sapienza, J. K., & Masten, A. S. (2011). Understanding and promoting resilience in children and youth. Current Opinion in Psychiatry, 24(4), 267–273. https://doi.org/10.1097/YCO.0b013e32834776a8

Sindermann, C., Cooper, A., & Montag, C. (2020). A short review on susceptibility to falling for fake political news. Current Opinion in Psychology, 36, 44–48. https://doi.org/10.1016/j.copsyc.2020.03.014

Southwick, S. M., Bonanno, G. A., Masten, A. S., Panter-Brick, C., & Yehuda, R. (2014). Resilience definitions, theory, and challenges: Interdisciplinary perspectives. European Journal of Psychotraumatology, 5(1), 25338. https://doi.org/10.3402/ejpt.v5.25338

Šrol, J., Ballová Mikušková, E., & Čavojová, V. (2021). When we are worried, what are we thinking? Anxiety, lack of control, and conspiracy beliefs amidst the COVID‐19 pandemic. Applied Cognitive Psychology, 35(3), 720–729. https://doi.org/10.1002/acp.3798

Su, Y. (2021). It doesn’t take a village to fall for misinformation: Social media use, discussion heterogeneity preference, worry of the virus, faith in scientists, and COVID-19-related misinformation beliefs. Telematics & Informatics, 58. https://doi.org/10.1016/j.tele.2020.101547

Tandoc, E. C., Lee, J., Chew, M., Tan, F. X., & Goh, Z. H. (2021). Falling for fake news: The role of political bias and cognitive ability. Asian Journal of Communication, 31(4), 237–253. https://doi.org/10.1080/01292986.2021.1941149

Traberg, C. S., & van der Linden, S. (2022). Birds of a feather are persuaded together: Perceived source credibility mediates the effect of political bias on misinformation susceptibility. Personality and Individual Differences, 185, 2–15. https://doi.org/10.1016/j.paid.2021.111269

Upreti, Y. R., Bastien, S., Bjønness, B., & Devkota, B. (2021). The socio-ecological model as a framework for understanding junk food consumption among schoolchildren in Nepal. Nutrition and Health, 27(3), 337–346. https://doi.org/10.1177/02601060211000169

van Prooijen, J. W., Douglas, K. M., & De Inocencio, C. (2018). Connecting the dots: Illusory pattern perception predicts belief in conspiracies and the supernatural. European Journal of Social Psychology, 48(3), 320–335. https://doi.org/10.1002/ejsp.2331

van Prooijen, J. W., Ligthart, J., Rosema, S., & Xu, Y. (2022). The entertainment value of conspiracy theories. British Journal of Psychology, 113(1), 25–48. https://doi.org/10.1111/bjop.12522

Vegetti, F., & Mancosu, M. (2020). The impact of political sophistication and motivated reasoning on misinformation. Political Communication, 37(5), 678–695. https://doi.org/10.1080/10584609.2020.1744778

Wardle, C., & Derakhshan, H. (2017). Information disorder: Toward an interdisciplinary framework for research and policy making (pp. 1–107) [Council of Europe report DGI]. Council of Europe. https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html

Wardle, C. (2018). The need for smarter definitions and practical, timely empirical research on information disorder. Digital Journalism, 6(8), 951–963. https://doi.org/10.1080/21670811.2018.1502047

Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65(4), 699–719. https://doi.org/10.1111/jcom.12164

Zimmermann, F., & Kohring, M. (2020). Mistrust, disinforming news, and vote choice: A panel survey on the origins and consequences of believing disinformation in the 2017 German parliamentary election. Political Communication, 37(2), 215–237. https://doi.org/10.1080/10584609.2019.1686095

Published Online: 2024-05-16
Published in Print: 2025-05-28

© 2024 the author(s), published by De Gruyter.

This work is licensed under the Creative Commons Attribution 4.0 International License.
