Legendary tech journalist John Markoff (CASBS fellow, 2017-18) chats with 2023-24 CASBS fellow Young Mie Kim on her groundbreaking efforts to identify how shadowy groups use algorithms and targeted disinformation campaigns during presidential election cycles; measure their real-world distorting effects on voter mobilization or suppression; and illuminate our understanding of resulting political inequalities and their implications for American democracy.
YOUNG MIE KIM: CASBS bio | Univ. of Wisconsin faculty page | "The Disinformation Detective" (On Wisconsin magazine) |
Kim leads Project DATA (Digital Ad Tracking & Analysis) at UW. | Project DATA on X |
Kim is lead author of the article "The Stealth Media? Groups and Targets Behind Divisive Issue Campaigns on Facebook," Political Communication, v35 n4 (2018). The article won the Kaid-Sanders Award for the Best Political Communication Article of the Year by the International Communication Association.
Coverage of findings: The New York Times here and here | Wired |
Kim's testimony delivered to the Federal Election Commission |
Kim is a founding member of the International Panel on the Information Environment. Coverage of IPIE in The New York Times |
Kim among the authors of "The effects of Facebook and Instagram on the 2020 election: A deactivation experiment," Proceedings of the National Academy of Sciences, v121 n2 (2024) |
Kim a coauthor of several articles appearing in a special issue of Science on Social Media and Elections (2023) |
At the beginning of the episode, Kim discusses the influence of Phil Converse. Converse was a CASBS fellow in 1979-80 and later served as CASBS director (1989-94). Learn more about Converse's work.
---------
Read John Markoff's latest book, Whole Earth: The Many Lives of Stewart Brand (Penguin Random House, 2022)
Narrator: From the Center for Advanced Study in the Behavioral Sciences at Stanford University, this is Human Centered.
Divisive, digital, micro-targeted disinformation efforts, especially those that are undisclosed or dark, exert a measurable effect on voter turnout and political participation. But how do we uncover this if the alleged activities are hidden? In part, it's through innovative tools and methods devised by scholar Young Mie Kim and her collaborators that reverse-engineer the algorithms behind disinformation ads to reveal the evidence.
Today on Human Centered, A Conversation with Young Mie Kim, a 2023-24 CASBS Fellow and Professor of Mass Communication at the University of Wisconsin. Kim led a project that used a novel real-time ad exposure tracking tool to empirically investigate the sources and sponsors behind targeted dark ad practices during presidential election campaigns. She then mapped these findings onto the real-world political behavior of users to estimate the extent of voter mobilization or suppression.
Her work won an award and garnered extensive media coverage that we'll link to in the episode notes. And she'll walk us through that work in conversation with renowned former New York Times tech journalist and Pulitzer Prize winner John Markoff, a 2017-18 CASBS Fellow and a familiar voice to us here on Human Centered. As you'll hear, John also interrogates Young Mie's work during her CASBS year and beyond, which seeks to push the frontiers of her previous groundbreaking findings.
The goal is to elucidate how algorithms and the data underlying them influence the distribution of political information and representation, how they intensify political inequalities by disproportionately impacting vulnerable or underserved populations, how they amplify subtle yet extremist voices, and how they exploit fundamental structural problems of our information environment and thus degrade the integrity of democratic systems and the media. Needless to say, that's a lot, even without quantifying any Russian involvement. But yeah, you'll hear about that too.
So let's listen.
John Markoff: You know, there's this tradition at CASBS that I think that you're familiar with about the ghosts in your study and the people who occupied your study before. And you mentioned in a note that Phil Converse had been in your study some years before you were. And I was wondering if you could talk just a little bit about his work, how it's related to your work.
Young Mie Kim: Sure, sure. So Phil Converse is a political scientist who studied public opinion. He was a survey researcher.
And his argument is that mass attitudes are not stable. People answer survey questions off the top of their heads, so their attitudes and opinions are not consistent over time, not stable, not so strong.
That's about the entire book. But then in the last chapter, there's just one paragraph: by the way, there are certain segments of the population who seem to have strong, consistent, and stable attitudes.
That's probably because they care about certain issues because of their personal importance. Let's call them issue publics. So this concept of issue publics is sort of an auxiliary concept, meaning that he found it accidentally and then used it to explain the puzzles he couldn't explain based on his original theory.
So that was a long time ago, the 60s, and then he stopped there. Some people, like Jon Krosnick, also a former CASBS fellow, tried to identify issue publics by using a survey method: how to measure personal importance.
I read these works by Jon Krosnick and Philip Converse when I was in graduate school, and I was just fascinated by the concepts. In the digital media information environment, which is very pluralistic, there is no amorphous mass public that reacts to media messages or the information environment in a uniform way. So I assumed that maybe this is the time when issue publics are more activated and facilitated.
So I tried to revisit Phil Converse's notion of issue publics and develop this concept further for the data-driven information environment.
John Markoff: Speak about that a little bit in the context of an audience or a mass, which is of that era. It seems like even before the internet (the internet makes micro-targeting and precision possible), you could walk door to door to people's homes and contact individuals. Where are things similar and where are they different?
Is the difference that you were no longer in this era of treating the audience as a mass?
Young Mie Kim: Yeah, so it's not that issue publics were suddenly created by the information environment. Yes, like you said, issue publics always existed. But what I emphasize is that instead of conceptualizing the public or the mass as a single entity, we should see society as one that consists of pluralistic groups of people who have a strong interest in particular types of issues because of their values, their identities, or their self-interest.
For example, gun owners are interested in gun rights issues because of self-interest and its implications for their daily lives. Some people care about abortion because of religious values. Some people care about racial issues or racial conflict because of their racial identities.
In the past, in the broadcast era when broadcast media was the dominant source of information, these people couldn't determine their own agenda. Their information-seeking behavior was very limited. Their ability to organize people who share the same interests was also limited, probably to a local community.
But now, with the internet, they can find the information they care about even if it isn't on the national agenda that journalists or politicians emphasize. And they can self-organize on the internet. But the big shift, the more fundamental shift, I personally think, is micro-targeting.
The ability to identify, target, and mobilize or even demobilize certain groups of people. That's a dramatic shift: from targeting based on arbitrary groups to targeting, customized messaging, and media placement based on the individual. It is now essentially possible to identify and target each and every individual in the United States.
John Markoff: So let's talk a little bit about the players in this new information environment that you've defined here. And clearly, you can see the political parties, the candidates themselves and the candidates' organizations. There are foreign actors who now can access this.
And they all now have new levers, you're arguing, to get at issue publics with great precision and perhaps effectiveness.
Young Mie Kim: That's right, that's exactly my argument.
John Markoff: And how democratic or anti-democratic is that?
Young Mie Kim: I mean, micro-targeting per se, it's hard to say whether it is democratic or anti-democratic in a binary, dichotomous way.
I just see it as sort of a design. It is a fundamental technological attribute that explains the characteristics of the data-driven information environment. Everything is based on data, and strategies and technologies like micro-targeting enable anyone to identify, target, and mobilize or demobilize people.
But then, who has the data and who has the micro-targeting capability? That's the issue we need to think about. The core idea of micro-targeting is that everyone is exposed to different messages.
So some people argue it could be very democratic or very anti-democratic, depending on who is using it and how.
John Markoff: So, transparency is important in terms of this question of democratic versus anti-democratic?
Young Mie Kim: Right, but even before that, I want to emphasize: who has the data? There is definitely information asymmetry.
We don't have a lot of data about other people. But certain campaigns have a lot of data about the entire population. So we have to take that into account.
To make it more democratic, I think transparency is important. Who is using micro-targeting technologies? Who is being targeted, and with what kind of messages?
Who is trying to influence whom? I think that is really important for understanding the implications of this data-driven information environment.
John Markoff: Before you came to CASBS, you had two election cycles to look at and see the behavior that was going on in this internet world. Let me start by asking what charter you set for yourself when you came here. What questions did you want to address in your research during your CASBS year?
Young Mie Kim: Yeah, so viewing micro-targeting as a fundamental and distinctive attribute of the data-driven information environment, the current information environment, I wanted to study basically who is using micro-targeting, who is targeted with what kind of messages, and what the effect of micro-targeting is: how effective it is in mobilizing people in terms of elections and voter turnout, and whether it actually influences election outcomes. Because I've been studying this concept of issue publics, I've known a lot of advocacy groups, and micro-targeting is available on all the social media platforms with a really convenient menu. For example, Facebook provides all these ad-targeting tools, so even people who don't understand marketing or targeting strategies can just put in keywords like gun owners or Second Amendment, and they can identify people who are interested in guns, things like that. And they can narrow down to the neighborhood level, like a community.
So it's not like broadcast media, where your campaign messages are tethered to these arbitrary media markets. In the United States, we have 210 media markets.
But now, you can technically identify and target each and every individual. So what are the implications for democracy? That was my original question.
But then I ran into a big challenge that I hadn't thought about. First of all, I thought I was going to capture all the digital campaign messages, basically get the ads.
But then I realized that all these targeted ads are only shown to the people who are targeted. If you're not targeted, you don't see them, and there is no public archive. So it's nearly impossible to observe who is receiving digital ads and what kinds of messages these ads contain.
I was literally banging my head against the wall, and then I had an aha moment: well, there are ad blockers. I could use the same code, but instead of blocking the ads, I could capture them, along with meta-information such as the landing page URL and timestamp, and transfer everything to my research server.
So that was the idea.
John Markoff: And so this is eScope?
Young Mie Kim: Yes.
John Markoff: Did you do this as a browser extension? What was the final model?
Young Mie Kim: It's like a typical browser extension that collects click streams. Whenever people click something, it captures the URLs. But this one works more like an ad blocker.
So it captures click streams, but on top of that, it captures all the ads that are shown to people. So we actually know who is exposed to what message, which ad, at what point in time.
That is sort of revolutionary, as there are still not many studies using this direct exposure tracking method.
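The capture idea Kim describes (repurpose ad-blocker matching to record ads rather than remove them, along with meta-information like the landing-page URL and a timestamp) can be sketched roughly as follows. This is a minimal illustration, not eScope's actual code; all names and field choices are hypothetical.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

# Hypothetical record of one ad exposure, mirroring the meta-information
# mentioned in the interview: the ad URL, its landing-page URL, and a
# timestamp, tied to an (anonymized) panelist ID.
@dataclass
class AdExposure:
    panelist_id: str
    ad_url: str
    landing_page: str
    timestamp: str

def record_exposure(panelist_id: str, ad_url: str, landing_page: str,
                    now: Optional[datetime] = None) -> dict:
    """Build the record an ad blocker would normally just discard."""
    ts = (now or datetime.now(timezone.utc)).isoformat()
    return asdict(AdExposure(panelist_id, ad_url, landing_page, ts))

# In the real tool each record would be transferred to a research server;
# here we just build one, with a fixed time so the output is deterministic.
rec = record_exposure("p001",
                      "https://ads.example.net/creative/123",
                      "https://sponsor.example.org/",
                      datetime(2016, 10, 1, tzinfo=timezone.utc))
print(rec["timestamp"])  # 2016-10-01T00:00:00+00:00
```

The key design point is exactly what Kim highlights: an ad blocker already knows how to find the ads, so the only change needed is what happens after the match.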
John Markoff: And does it require volunteer participation? You have to convince people to use this in their day-to-day surfing activities?
Young Mie Kim: Right, right. That was actually the most challenging part. After we figured out how we were going to design the app, the next challenge was how to recruit people to use it. Micro-targeting basically means that you don't use a broad-reach strategy; you send different messages to different people.
So to understand the whole scope of the data-driven information environment, what kinds of messages are out there, you need a really large sample. People receive different messages for different reasons: because you are a woman, because you are a man, because you are African American, things like that.
So you need a sample that represents the voting-age population. Then you can generalize whatever findings you get from the data. So I worked with a firm that specializes in survey recruitment and sampling.
We ended up recruiting 17,500 people who represented the US voting-age population.
John Markoff: And did eScope lead you to these, what you call dark posts?
Young Mie Kim: So digital ads are essentially targeted ads. For example, I saw a lot of anti-Muslim ads from a certain nonprofit organization in the United States. When I visited their Facebook page, I never saw those kinds of anti-Muslim messages.
So publicly, this organization is not an anti-Muslim organization. But in the data we collected, this particular organization sent out all these anti-Muslim messages. This is why digital ads got the nickname "dark posts."
John Markoff: And to what extent did the changes in the Federal Election Commission rules about digital advertising change the environment of Dark Posts? Are there still Dark Posts?
Young Mie Kim: Of course, yes.
John Markoff: I thought there was a disclaimer law that didn't.
Young Mie Kim: Oh, yes, a disclaimer on so-called express advocacy ads; that passed relatively recently. I consider it a sort of victory for this kind of discovery, for my research. I actually testified at the FEC in 2018 about online disclaimers.
John Markoff: And it took them four years to…
Young Mie Kim: Right, finally. So that's a kind of victory. Some political ads, those defined as political ads, now need to include "paid for by" information.
John Markoff: Where are there still loopholes?
Young Mie Kim: The loophole is that it's hard to monitor. So now we have some disclaimer rules, but somebody has to find out who didn't put up disclaimers or disclose themselves.
So who's going to monitor it? That's still a big issue. And another thing is AI disclaimers, AI disclosure.
At the federal level, we haven't had anything yet.
John Markoff: There is nothing yet.
Young Mie Kim: No, nothing at the FEC either. And at the state level, 14, sorry, 40 states are considering AI disclaimer rules, and I think so far 11 states have enacted them ahead of the election. Going back to general online political ads rules: I worked with think tanks that specialize in election law, and in particular I helped them write a bill, the Honest Ads Act.
That is a bipartisan act that still has not been enacted. So there is no federal-level legislation on online political ads. Some FEC-level rules are implemented, but no law adequately addresses this.
John Markoff: So the information environment, this is still going on. The kind of stuff you saw in 2016, the world hasn't changed completely.
Young Mie Kim: 2016, 2020, yeah.
John Markoff: In reading your 2016 research, it seemed that the effectiveness of this stuff, in terms of changing election outcomes, lies perhaps more in voter suppression than in encouraging people. So that's the real dark threat to democracy. Did I take away the…
Young Mie Kim: Yeah, so that is something I want to emphasize a lot. When people think about social media propaganda or disinformation, the first thing they think is that it's all false information.
In fact, outright lies are only a few; most of it is believable or misleading. So I would define disinformation not as false information but as intentionally misleading information, more of a function of misleading.
Information that has the function of misleading. For example, "neither candidate supports African Americans." We don't know whether that's outright false or not, but it could lead people to believe that the election doesn't matter.
In fact, a lot of voter suppression ads emphasize that the election doesn't matter. Neither candidate cares about us. Your vote doesn't count.
That kind of voter suppression could be really effective when targeted at people who already have some barriers or constraints to turning out to vote. For example, we have gerrymandering, we have voter ID laws, and some people cannot take the day off on election day. When these people receive this kind of ad, it could be very effective.
So that's what I studied.
John Markoff: And you found some evidence that it did have an impact.
Young Mie Kim: Yes. This is one of the coolest studies, one that I'm very proud of. With eScope, I could directly measure exposure to all the ads.
That data was combined with weekly surveys of the people who were using eScope. By combining the survey responses with the ads people were exposed to (I know who was exposed to which ads, and who gave which survey responses), I was able to, so to speak, reverse-engineer the targeting profile of a given ad. So let's say I identified voter suppression ads based on frequent terms like "boycott the election," "don't go to the polls," "the election doesn't matter," "neither candidate cares about us."
Then I identified who was exposed to these ads. And by looking at the survey responses: aha, they are non-white voters living in minority-majority counties in battleground states. We found that non-white voters living in racially minority-majority counties in battleground states received 10 times more voter suppression ads than their counterparts.
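The joining step described here can be illustrated with a toy aggregation: link each captured exposure to the panelist's survey profile, flag suppression ads by frequent terms, and compare per-person exposure rates across subgroups. The data and field names below are invented for illustration; this is not Project DATA's actual pipeline.

```python
from collections import Counter

# Invented panel: survey profile per panelist.
profiles = {
    "p1": {"nonwhite": True,  "battleground": True},
    "p2": {"nonwhite": True,  "battleground": True},
    "p3": {"nonwhite": False, "battleground": False},
    "p4": {"nonwhite": False, "battleground": False},
}

# Invented exposure log captured by the tracking tool: (panelist_id, ad text).
exposures = [
    ("p1", "Boycott the election!"),
    ("p1", "Your vote? The election doesn't matter."),
    ("p2", "Don't go to the polls."),
    ("p3", "The election doesn't matter."),
]

SUPPRESSION_TERMS = ("boycott the election", "don't go to the polls",
                     "election doesn't matter")

def is_suppression(ad_text: str) -> bool:
    """Flag an ad as voter suppression by simple term matching."""
    text = ad_text.lower()
    return any(term in text for term in SUPPRESSION_TERMS)

def subgroup(profile: dict) -> str:
    """The targeting profile of interest: non-white voters in battleground states."""
    return "targeted" if profile["nonwhite"] and profile["battleground"] else "other"

# Join exposures to profiles, then compute per-person exposure rates.
group_sizes = Counter(subgroup(p) for p in profiles.values())
hits = Counter(subgroup(profiles[pid]) for pid, ad in exposures if is_suppression(ad))
rates = {g: hits[g] / group_sizes[g] for g in group_sizes}
print(rates)  # {'targeted': 1.5, 'other': 0.5}
```

With real panel data the same aggregation, run over thousands of panelists, is what reveals a disparity like the tenfold difference Kim reports.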
John Markoff: Let me just insert a question here because I found that very compelling. And at the same time, it reminded me of this BBC, I think it was Channel 4 documentary. Are you familiar with this?
Young Mie Kim: Yeah, I talked to these people.
John Markoff: They went to this precinct and walked around. As I remember it, they basically tried to find somebody who had seen a particular Facebook ad, and they couldn't find anybody in the district who had seen it. They were looking for evidence of voter suppression and weren't able to find it.
Young Mie Kim: It was hard to find them because, like I said, nobody is monitoring. Actually, those reporters reached out to me. I know the characteristics of who was exposed.
So my research provided some evidence that their effort was not worthless. In their case, it was Florida and Georgia. We found targeting patterns.
The next step I took was to track eScope users' actual turnout records. I identified all the people who were exposed to voter suppression ads, and all the people who were not. And we found that voter suppression exposure did decrease turnout by about 2%.
John Markoff: Which in some cases was larger than the difference in the vote.
Young Mie Kim: Right. That's the average population effect, aggregated across all people. But if we focus on the subgroups who were especially targeted, non-white voters living in minority-majority counties in battleground states, versus the people not exposed to voter suppression ads, white voters living in non-minority counties in non-battleground states, the difference between the two groups is huge.
John Markoff: What's fascinating to me about this discovery is the narrative: sometime later you discovered, because of the House Intelligence Committee, that the IRA, the Internet Research Agency, was playing a big role here. What I was super intrigued by is that you were coming face to face with the algorithm they were running, basically. Did you have a sense that on the other side of the wall there were adversarial social scientists, that this was being run by sophisticated social scientists? That's not something I had associated with the IRA.
I don't know who was there, but do you have a sense of who the enemy was in this case?
Young Mie Kim: Well, no. When I first opened the data, I was actually shocked. Remember, my original research question was about the broader implications of micro-targeting, especially in terms of inequality in political involvement. I presumed that I would observe a lot of campaigns by candidates, obviously, since this was election campaign season, and probably a lot of campaigns sponsored by big advocacy groups like the NRA, NARAL, things like that.
Yes, they are there. But what I was really shocked by was that the majority of the sponsors were unknown.
I've been studying this for quite a long time, but even I didn't recognize the names of these advocacy groups or grassroots organizations that were clearly running large-scale digital campaigns. Back then, there was some kind of rumor that Russians had interfered in the election through social media.
I was jokingly saying, well, maybe these are Russians, but I never actually believed those were Russians. But when the House Intelligence Committee, almost one year later, released part of the data with the meta-information, I matched the House Intelligence information with my data, and they were exactly the same.
The groups that I had set aside as suspicious, undisclosed groups: about 17% of them turned out to be Russian groups. Some people who study Russian propaganda say they are not so sophisticated; sometimes they make grammatical errors.
But I thought they had a pretty good understanding of American society, because they were clearly exploiting the deep social and political divide in our society. They are not creating division; they are taking advantage of the divide, the mistrust in government and mainstream media, or just general distrust of other people, especially people who have a different background.
John Markoff: In the seminar, you briefly brought up this concept of algorithmic opportunity. How does that fit into this?
Young Mie Kim: So I used "algorithmic opportunity" to emphasize the structural and systemic aspects.
After all, I can analyze the data and examine the information environment, and my conclusion is about how the data-driven information environment propels inequality in political involvement.
The Long Island overpasses just occurred to me. Robert Moses, the urban planner, designed the overpasses on Long Island in the early twentieth century. He designed them so low that public buses couldn't pass underneath.
If you pass under the overpasses, you reach Jones Beach. It is a public park, but only those who could afford cars could pass under the overpasses and enjoy Jones Beach. So even though it is a public park, it is defined in a completely different way.
The meaning of "public" changed to people who could afford cars. I think algorithmic opportunity is the same kind of thing. So I call it the algorithmic overpass.
So it inadvertently propels inequality, or maybe not inadvertently; some campaigns have a deliberate intention. If we think about political inequality, inequality means unequal distribution of political information and unequal representation in political participation.
Previous researchers identified the root causes of this unequal distribution and unequal representation as motivation and ability: some people do not have the ability to understand the complexity of politics, which is why inequality happens, or people do not have motivation, they're not interested in politics. But these are all individual-level factors, I would say.
I pay more attention to systemic, structural, and obviously technological factors. So instead of just looking at one technology, like digital or broadcast media, I conceptualize the whole of technology as an information ecology, an information system, an information environment, and view it as a more structural and systemic issue. I would say the information environment sets certain conditions for amplifying some activities or political behavior and limiting others: mobilizing and demobilizing.
So I see it as opportunity: some people have a lot of opportunity to consume mobilizing political information, get-out-the-vote messages, or to learn about all kinds of political issues. But because of this targeting ability, some people never receive, never get exposed to, certain types of information. Some of the studies I did before provided empirical evidence that, for example, Hispanic and Latino voters spend a lot of time online relative to other racial and ethnic groups.
However, they do not get a lot of political information. Younger voters, relatively speaking, receive less political information. First-time voters and immigrants receive less political information.
Why is that? The first stage where discrepancies happen is the data these campaigners use. The baseline data campaigners use, no matter what kind of campaign they run, is voter files, the history of turnout.
And that actually has the strongest predictive power: people who turned out to vote in the last election are more likely to turn out to vote in this election, right? So basically, younger voters have a relatively shorter history of voting, and therefore they generate fewer data points and less predictive power.
So they are filtered out. Same thing with immigrants. Then the second stage would be micro-targeting.
With micro-targeting, campaigners have a lot of information, and they identify whom to target and demobilize. They hold this information asymmetry, and they always have their own interests. As I explained earlier, why do they run voter suppression campaigns?
Because it's much easier than persuading people who are not so interested in politics, or converting people from one side to the other. Target people who are already resentful about the election and who already have barriers to turning out to vote, and just nudge them, just confirm their belief that the election doesn't matter. So we see this kind of unequal representation.
In other studies, like the one I did in 2020, we happened to capture the January 6 insurrection. These undisclosed campaigns targeted conspiracy groups, generalized conspiracy groups. They are not particularly political.
But the campaigns talked about the election and connected election outcomes to these groups' general conspiratorial beliefs about society. So these people were heavily mobilized for the quote-unquote "stop the steal" movement, and they were organized for the insurrection.
If we think about this pattern from a bird's-eye view, you see that, as a result of micro-targeting, the data-driven information environment might propel inequality in political involvement by amplifying extremist voices, so they are overrepresented, while limiting already marginalized voters, so they are underrepresented.
The gap between these two groups grows even wider.
John Markoff: It seems like you've identified the flip side of Citizens United in a strange way. I don't know. That's my...
Young Mie Kim: Exactly, yeah. With Citizens United, I think the Supreme Court justices didn't think about the implications of the future information environment. It inadvertently opened the door to undisclosed money in elections, and it facilitates these undisclosed campaigns.
And the information environment provides a venue where undisclosed campaigns can flourish.
John Markoff: Thank you. This was really illuminating for me, so I appreciate it.
Young Mie Kim: Thank you.
Narrator: That was Young Mie Kim in conversation with John Markoff. As always, you can follow us online or in your podcast app of choice. And if you're interested in learning more about the Center's people, projects, and rich history, you can visit our website at casbs.stanford.edu.
Until next time, from everyone at CASBS and the Human Centered team, thanks for listening.