It Is Not Facebook Access but Partisan Bias That Predicts Belief in Misinformation: The Case of the 2019 Indonesian Presidential Election

This study examines the roles of Facebook access and partisan bias in belief in misinformation in the political context of the 2019 Indonesian Presidential Election. Frequent use of Facebook and partisan bias toward the presidential candidates were predicted to influence belief in misinformation about illegal migrant workers from China in Indonesia. Using a structured questionnaire, 1,818 participants representative of the Indonesian voter population were interviewed about their frequency of Facebook use, political support, and awareness of and belief in misinformation about thousands of illegal migrant workers from China, along with demographic variables, as part of a national survey. Of these, 804 participants who were aware of the misinformation were included in the analysis. Binomial logistic regression showed that partisan bias significantly affected belief in misinformation: Subianto's supporters believed the misinformation significantly more than Widodo's supporters did, whereas the frequency of Facebook use and its interactions with political support were not significant. This finding shows the strength of the influence of political support on belief in misinformation and the need for further study of the influence of social media in Indonesia's political context.


Introduction
Makara Hubs-Asia, December 2020, Vol. 24, No. 2

After the 2016 US Presidential Election, misinformation in the context of electoral politics has attracted research interest (Allcott & Gentzkow, 2017; Lippman, Samuelsohn, & Arnsdorf, 2016). Lippman et al. (2016) documented an average of one misstatement every five minutes in Donald Trump's speeches and press conferences. Voters are thus exposed to misinformation and vulnerable to making wrong decisions as a result. Research on misinformation and the factors that influence it is therefore crucial as part of efforts to minimize its negative effects.
Social media is often cited as a factor that helps spread misinformation to many more individuals. In the context of the 2016 US Presidential Election, false news stories were shared about 380 million times and clicked through and read about 760 million times, equivalent to roughly three pieces of misinformation read per American adult (Allcott & Gentzkow, 2017). This wide sharing has been attributed to the content of the misinformation, which uses personally and emotionally targeted news (Bakir & McStay, 2017). One social media application that has drawn public criticism for the spread of misinformation is Facebook. Studies have found that Facebook was one of the primary sources of fake news; for example, Fourney, Racz, Ranade, Mobius, and Horvitz (2017) found that 68% of visits to fake news domains came from social media, and of these, 99% of referrals were from Facebook. This raises new concerns about the influence of social media (e.g., Facebook) on the political process and democracy. Terms such as echo chamber and filter bubble (Pariser, 2011; Sunstein, 2001) have been used to express concern about the negative effect of social media on democracy. Because of the personalized and homogeneous environment it provides to users, social media has been considered a source of information bias, increasing polarization and reinforcing people's belief in misinformation they had previously received.
Misinformation cannot be easily corrected by providing evidence against it. Studies have found that even after retraction, people still rely on misinformation they had already believed, a phenomenon known as the continued-influence effect (Ecker et al., 2011; Ecker, Hogan, & Lewandowsky, 2017; Johnson & Seifert, 1994; Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012). This reliance on retracted misinformation is reflected in people's memory of the misinformation and its effect on later inferences (Johnson & Seifert, 1994). The phenomenon can be stronger when the misinformation relates to strongly held beliefs. Individuals may engage in motivated cognition that leads them to exclude evidence contrary to their attitudes and accept evidence consistent with their views (Kunda, 1990; Lord, Ross, & Lepper, 1979). As a result, people can hold an even stronger belief in misinformation and misconceptions after its retraction, a phenomenon called the worldview backfire effect. However, another stream of research has found that people can also develop ways to cope with a media environment perceived to be filled with fake news or misinformation, for example, by being critical of opinionated news, consuming cross-ideological sources, and fact-checking (Wagner & Boczkowski, 2019). This inconsistency of findings needs to be explained by further research addressing the gap.
During the 2019 Presidential Election in Indonesia, various information about the two candidates, Joko Widodo and Prabowo Subianto, circulated. Much misinformation spread and openly attacked both of them, concerning either their personalities or their programs and policies. One piece of misinformation that circulated was the rumor that hundreds of thousands of illegal migrant workers from China had seized the domestic labor market. These rumors circulated widely and caused enough unrest that the Ministry of Manpower under the Widodo administration issued a rebuttal. This rumor was considered one of the attacks on the incumbent, Widodo.
Misinformation in the form of negative issues circulated on social media has been thought to reduce the likelihood that voters will choose the candidates framed negatively. However, as shown by the results of a number of public opinion survey institutions, support for the two candidates did not change much in the months before the election (e.g., Indikator Politik Indonesia, 2019). This raised questions about the relationships among social media access, political support, and belief in misinformation.
This study aims (1) to determine the relationship between the frequency of accessing political news on social media and belief in misinformation during the 2019 Presidential Election, (2) to reveal the relationship between partisan bias and belief in misinformation, and (3) to understand the interaction between social media access and partisan bias in predicting belief in misinformation.

The term misinformation refers to information that is initially accepted as valid by individuals but is later rectified because it is inaccurate, erroneous, or even false (Lewandowsky et al., 2012). The term is closely related to fake news, defined as "news articles that are intentionally and verifiably false, and could mislead readers" (Allcott & Gentzkow, 2017, p. 213).
There are six types of fake news, namely, news satire, news parody, fabrication, manipulation, advertising, and propaganda (Tandoc Jr, Lim, & Ling, 2018). All of them can lead to bias and false judgment, much as misinformation does.
Being misinformed is different from being uninformed. Uninformed means lacking confidence in the correct answer to a factual question, whereas misinformed means holding beliefs that are wrong or not supported by factual evidence (Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000). Misinformed individuals treat what they know as definite and consider themselves to know a fact even when they are wrong (Kuklinski et al., 2000; Nyhan & Reifler, 2010; Pasek, Sood, & Krosnick, 2015). Retraction of misinformation is not effective when it runs contrary to one's beliefs (Ecker & Ang, 2018). This distinguishes misinformation from ignorance, which is basically a lack of knowledge or information. Because it leads people to decide on the basis of false information, misinformation is seen as a problem for democracy and public policy: the decisions taken do not give the best results for those who make them (Kuklinski et al., 2000).
Misinformation can be disseminated without the intention to mislead, for example, during a disaster when information is still being updated by the authorities (Lewandowsky et al., 2012; Wu et al., 2017). Because of its emotional content, misinformation can also take the form of vague rumors spread by ordinary citizens (Berger, 2011). However, misinformation can also be deliberately disseminated by interested sources, such as governments and politicians, as in the misinformation about Iraq's ownership of WMD disseminated by the US government (Arsenault & Castells, 2006), or interest groups, as in environmental and health information disseminated by business groups (Oreskes & Conway, 2010; Jacques, Dunlap, & Freeman, 2008).
The relationship between partisan attitudes and the tendency to show motivated cognition, and thus the reluctance to change beliefs after receiving retractions of misinformation, has been the focus of recent studies. Among them was the meta-analysis by Jost et al. (2003), which confirmed that several psychological variables, such as death anxiety, dogmatism-intolerance of ambiguity, and need for order, predicted political conservatism. More recent studies, however, showed that both conservatives and liberals can display motivated cognition. Kahan (2013) found that conservatives did no better or worse than liberals on an information-processing test associated with cognitive bias. Moreover, both conservatives and liberals showed bias when faced with scientific information that did not match their prior beliefs (Nisbet, Cooper, & Garret, 2015). Motivated cognition has also been studied as a source of bias in the context of electoral politics. Support for parties (i.e., Republicans and Democrats) or for a presidential candidate, for example, has been found to influence perceptual bias (Bartels, 2002) and information-processing bias (Taber & Lodge, 2006), or to become a shortcut for evaluating information (Swire, Berinsky, Lewandowsky, & Ecker, 2017).
The Internet and social media are related to belief in misinformation. Belief in misinformation occurs because social media gives users a broader opportunity to access content that accords with their attitudes and beliefs, creating their own echo chambers. This is confirmed by the results of Garrett, Weeks, and Neo (2016), which show that both Republicans and Democrats who often access information from ideologically partisan sources tend to be more convinced of misinformation about President Obama's birthplace and the existence of WMD in Iraq, even after receiving relevant corrective evidence. This is possible because the Internet and social media allow individuals to access only news that accords with their beliefs, creating their own filter bubbles (Pariser, 2011). Particularly since the 2016 elections in the United States, the spread of misinformation or fake news on social media has raised concerns. In the 2016 US Presidential Election, massive amounts of misinformation or fake news were suspected of being spread (Allcott & Gentzkow, 2017) and accessed through social media (Gottfried & Shearer, 2016). The spread of misinformation was possible because content can be forwarded directly by users without going through third-party fact-checking (Allcott & Gentzkow, 2017). Beyond its accessibility, misinformation or fake news can circulate widely on social media because of its defining character of personally and emotionally targeted news, referred to as empathic media (Bakir & McStay, 2017). Trump's subsequent victory in the Presidential Election sparked concern among researchers from various backgrounds, such as psychology, economics, political science, communication, and computer science, to examine misinformation further and recommend interventions (Allcott & Gentzkow, 2017; Thorson, 2016).
Although misinformation circulates on many social media platforms, Facebook was the most widely used platform for spreading misinformation in the 2016 US Election (Silverman, 2016). Since that election, research findings on Facebook's role in spreading misinformation have encouraged Facebook itself to work to reduce the spread of misinformation (Allcott, Gentzkow, & Yu, 2019).

One of the big cases in Indonesia that showed the role of social media in spreading misinformation and influencing political choices was the 2017 Election of the Governor of Jakarta. Ahead of the election, an edited video of a speech by the incumbent, Basuki Tjahaja Purnama (Ahok), went viral and was described as insulting Islam. Many people still believed the edited version of the video and did not vote for Ahok, even though the video had been corrected and Ahok had apologized to Muslims. This shows that political misinformation circulated on social media can be impactful during an election.
Facebook is one of the social network sites (SNS) widely used by Indonesians. Partisan bias, or the difference in belief in the same information between supporters of different political camps, is one of the factors found to influence belief in misinformation. In the United States, partisan bias is evident in the differences between Democrats and Republicans in responding to various facts and misinformation, for example, the split between Democratic and Republican supporters regarding Iraq's ownership of WMD (Bullock, 2009) and in perceptions of candidates and political events in 1990, 1991, and 1992 (Bartels, 2002). These differences indicate the role of partisan bias in the perception of candidates and political events. Research on the influence of partisan attitudes on belief in misinformation makes it important to know the extent to which this bias occurs in various contexts, especially the political context of Indonesia during the 2019 Presidential Election, when voters were polarized by their support for the presidential candidates.

In the 2014 presidential election in Indonesia in particular, misinformation, or what is popularly known as a hoax, began to flourish. It began with the publication of the Obor Rakyat tabloid in May 2014 under the title "Puppet President," with caricatures of presidential candidate Widodo kissing the hand of PDIP chair Megawati Soekarnoputri; in June 2014, the second edition was published under the title "1001 Widodo Imaging Mask." The tabloid is known to have been distributed to Islamic boarding schools and Islamic schools in Central and East Java. In the tabloid, which was later found to carry a fake address, Widodo was reported to be Chinese, non-Muslim, a foreign agent, and a PKI activist (Albanna, 2019; tempo.co, 2018; Sufa & Anam, 2014). An attack on Subianto using misinformation also occurred with the circulation of the Indonesia Barokah tabloid ahead of the 2019 Presidential Election, under the title "Reunion 212: Interest of the People or Political Interest?" In that tabloid, distributed in areas from West Java to East Java, Subianto was attacked in an article entitled "Prabowo Angry Media Divided" in the Main Report and in "Deceiving the Public for Political Victory" as Special Coverage (Azanella, 2019; Nathaniel, 2019).
Quite a number of people believe that Widodo is of Chinese descent and non-Muslim, as the false news first disseminated through Obor Rakyat claimed, even though both tabloid cases have been reported and the head of Obor Rakyat has been found guilty of spreading false news. This was reported in the release of a national survey held by Indikator Politik Indonesia on January 8, 2019. The release found that 20% of respondents had heard the news that Widodo's parents were Christians, and among those who knew of it, as many as 20% believed it. Likewise, with the misinformation that Widodo is of Chinese ethnicity, 23% of respondents knew of the news, and of these, 24% believed it.
Misinformation ahead of the presidential election also contained policy content that cornered the government.

Methods
This research used nationally representative data on Indonesian voters to test the three hypotheses. A total of 1,818 participants were interviewed.
The survey population was all Indonesian citizens from 34 provinces who had voting rights, namely, those aged 17 years or above, or already married, when the survey was conducted. Participants were selected by multi-stage random sampling, with a margin of error of ±2.34%, assuming simple random sampling at a 95% confidence level. Quality control was carried out by revisiting 20% of the total respondents in each survey, and no significant errors were found.

Access to political news on social media focused on Facebook, measured by asking, "In the past month, how often did you get news related to social, political, and government issues at the regional or national level through the following application?" on a 6-point Likert-type scale (1 = don't have an account, 6 = every day/almost every day). This variable was mean-centered for analysis. Although the survey asked about several SNS, this research focuses on Facebook consumption because of its extensive use compared with other SNS, both globally and nationally.
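As a quick sketch, the margin of error quoted for the sampling design can be checked against the standard simple-random-sampling formula 1.96·√(p(1−p)/n) at the worst case p = 0.5; only the sample size comes from the text, and the reported ±2.34% may additionally reflect rounding or design adjustments.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1818)  # sample size reported in the text
print(f"±{moe:.2%}")         # roughly ±2.3%, in the neighborhood of the reported ±2.34%
```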
Partisan attitudes toward the presidential candidates were measured with two separate questions. On a scale of 0-10 (0 = will not vote for, 10 = will vote for), participants were asked how likely they were to vote for candidate pair number 01, Joko Widodo (with KH Ma'ruf Amin), and candidate pair number 02, Prabowo Subianto (with Sandiaga Uno).
The scores obtained were then analyzed using a binomial logistic regression model because the dependent variable, belief in the misinformation, is binary.
A number of variables were controlled, namely, the demographic variables of age, gender, urban-rural area, and education. Three of these were recoded as dummy variables because they are nominal: urban-rural area (1 = rural, 0 = urban), gender (1 = female, 0 = male), and education (1 = junior high school and below, 0 = senior high school and above).
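As an illustration of the dummy coding described above (the field names and answer categories are hypothetical, not taken from the survey instrument), the recoding can be sketched as:

```python
def code_controls(record):
    """Recode raw survey answers into the control variables described in the text.

    `record` is a hypothetical dict of raw answers; the 0/1 coding follows the
    scheme above: 1 = rural, 1 = female, 1 = junior high school and below.
    """
    return {
        "age": record["age"],  # kept as a continuous predictor
        "rural": 1 if record["area"] == "rural" else 0,
        "female": 1 if record["gender"] == "female" else 0,
        "low_education": 1 if record["education"] in (
            "no schooling", "elementary", "junior high") else 0,
    }

coded = code_controls(
    {"age": 34, "area": "urban", "gender": "female", "education": "senior high"})
# coded == {"age": 34, "rural": 0, "female": 1, "low_education": 0}
```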

Results
Of the 1,818 cases entered, 804 were processed further, namely, data from participants who answered that they knew of the information about illegal migrant workers from China and had no missing answer on the question about belief in the misinformation. Four models were analyzed: Model (1), the control variables; Model (2), the frequency of political news access through Facebook; Model (3), support for Widodo and support for Subianto; and Model (4), the interactions between the frequency of political news access through Facebook and support for each candidate. The summary of the analysis is shown in Table 2.

In Model (1), the age variable significantly predicted belief in misinformation about illegal migrant workers from China in a negative direction (B = −0.02, SE = 0.01, Exp(B) = 0.98, p = 0.00). This means that younger participants tended to have a stronger belief in the misinformation about illegal migrant workers from China: the odds of belief in the misinformation decrease by a factor of 0.98 for every one-unit increase in age.
Model (1) can be used to explain belief in the misinformation about illegal migrant workers from China, χ²(4, N = 804) = 15.57, p = 0.00, but the pseudo R² value of 0.03 indicates that it explains only 3% of the variance in belief in misinformation.
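To make the odds-ratio interpretation concrete: in a logistic regression the odds multiply by exp(B) for each one-unit increase in a predictor, so the reported B = −0.02 for age implies the compounding below. This is a worked illustration of the reported coefficient, not a reanalysis of the data.

```python
import math

B_age = -0.02                  # coefficient for age reported in the text
odds_ratio = math.exp(B_age)   # per-year odds multiplier, ≈ 0.98 (the reported Exp(B))
decade = math.exp(B_age * 10)  # a voter 10 years older has ≈ 0.82 times the odds
print(round(odds_ratio, 3), round(decade, 3))
```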
The results of the analysis in Model (2) showed that the frequency of accessing political news through Facebook did not significantly predict belief in the misinformation.

Discussion
This research shows that political support significantly affects belief in misinformation. In the 2019 Presidential Election, the context of this study, voters who supported Widodo were significantly less likely to believe the misinformation about the thousands of illegal migrant workers from China in Indonesia. Conversely, voters who supported Subianto were significantly more likely to believe it.
This is in line with research on the influence of partisan bias on belief in misinformation in the context of US politics, which categorizes voters by their party identification, namely, Republicans versus Democrats. Republicans and Democrats hold different beliefs about misinformation on various issues, such as WMD ownership in Iraq, the birthplace of Barack Obama, and the evaluation of government policies. The categorization of voters in Indonesia here is based not on party identification but on support for the presidential candidates in the 2019 Presidential Election. The finding of this research shows that partisan bias, whether based on political parties or on support for candidates, can have the same effect on belief in misinformation. Political support can bias the perceptions that form belief in misinformation. This is also in line with findings on motivated cognition, which show that individuals tend to believe (or not believe) information that supports (or does not support) their opinions, even after the information has been retracted. Both camps, Widodo's and Subianto's supporters, were motivated to believe or disbelieve the misinformation: Widodo's supporters did not believe the misinformation about thousands of illegal migrant workers from China in Indonesia, whereas Subianto's supporters did. This finding shows that belief in misinformation also occurs in the context of political support based on the choice of presidential candidates, just as it does with party-based polarization.
This study, however, shows that the frequency of accessing political news on Facebook does not significantly influence belief in misinformation. Likewise, the interaction between political support and the frequency of accessing political news on Facebook does not significantly influence belief in misinformation. This finding differs from that of Garrett, Weeks, and Neo (2016), which confirmed the influence of social media usage and partisan bias on belief in misinformation. This contradiction raises questions about the role of social media (e.g., Facebook) in belief in misinformation and about its measurement. In a study of selective exposure using pre-election survey responses and web traffic data, Guess, Nyhan, and Reifler (2018) found that Facebook was the most important factor facilitating the spread of fake news, but that those who actually visited and heavily consumed fake news websites came from only a small proportion of people: the 10% of Americans with the most conservative information consumption patterns. This means that social media consumption per se cannot be used to predict belief in misinformation. Rather, people may selectively choose fake news media because of their partisan attitudes, which makes partisan bias the central point in the study of belief in misinformation and media.
Further research should elaborate the concepts and methods for studying the attitudinal and behavioral mechanisms by which partisan bias and media shape belief in misinformation. Guess et al. (2018) stated that research on social media needs to measure real behavior to capture people's media use and its effects. Moreover, further research should consider analyzing how the use of multiple media might relate to people's belief in misinformation. This research focused only on support for candidates as an indicator of partisan bias in the Indonesian election. Further research should also study not only support for a candidate but also political ideology and personality as variables that could affect belief in misinformation. Defining political ideology in Indonesia will not be easy because there are many political parties, but studies of Indonesian political ideology usually capture the attitudinal difference between secularist and religious voters (e.g., Mujani, Liddle, & Ambardi, 2018; Pepinsky, Liddle, & Mujani, 2018). This categorization could also be applied in studies of belief in misinformation.
The scope of the misinformation should also be considered. This study focused on only one piece of misinformation that circulated heavily among the public. Further research needs to study many kinds of misinformation to understand which issues are most believed as a function of partisan bias and social media consumption. The kind of social media consumed could also influence the relationship between partisan bias and belief in misinformation, because each social media platform has its own features and serves different purposes. Further research should consider studying the influence of diverse media consumption (e.g., different social media platforms, messaging applications such as WhatsApp and Line, or social versus mass media) in this context. This suggestion is in line with Dubois and Blank (2018), who found that echo chamber effects are lessened among people who are interested in politics and those with a diverse media diet. These variables will be valuable in further research for understanding the interplay among partisan bias, social media behavior, and belief in misinformation.

Conclusion
This study shows that partisan bias significantly influences belief in misinformation, whereas the use of social media does not. This finding opens opportunities for further research on belief in misinformation and the extent of social media's influence on it, especially in the Indonesian political context.