Virginia Eubanks
One of the things I say over and over again in the book is that systems like these don't crack down on poor and working people because they are helpless and vulnerable. They crack down on us because we're powerful, and scary, and we have the numbers.
Monica Manney
Welcome back to UVA Data Points. I'm your host, Monica Manney. Today, we're bringing you a conversation between Lane Rasberry, the Wikimedian in residence here at the UVA School of Data Science, and Virginia Eubanks, an author, journalist, and associate professor of political science at the University at Albany. Recently, Virginia began the PTSD bookclub, an ongoing project that explores books about trauma and its aftermath. You can find this project, and Virginia Eubanks's other projects, at Virginia-Eubanks.com. This conversation was recorded back in 2019. And I know, I know, that was so long ago and the world is a completely different place. But before you switch to another show, let me just say, this conversation is extremely relevant to our current moment. In fact, time has only made the conversation more immediate. In many ways, it is a discussion that looks toward the future, warning of the unintended, or at times intended, consequences of emerging technology. And it's a bit scary how much they got right. And while the conversation lobs some heavy critiques at the tech world, it also provides many reasons for optimism. It's a fascinating and multifaceted discussion. And so with that, here's Lane Rasberry and Virginia Eubanks.
Lane Raspberry
Hi there, my name is Lane Rasberry. I'm the Wikimedian in residence here at the School of Data Science at the University of Virginia. I'm here with our guest today, Virginia Eubanks, who's an associate professor of political science at the University at Albany, State University of New York; author of the 2011 book Digital Dead End: Fighting for Social Justice in the Information Age; co-editor of the 2014 book Ain't Gonna Let Nobody Turn Me Around: 40 Years of Movement Building with Barbara Smith; and author of the 2018 book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Virginia...
Virginia Eubanks
They're all a mouthful.
Lane Raspberry
Wonderful titles! And they all have social justice and civic issues in common. Can you tell me about your field of research and why all this is important to you?
Virginia Eubanks
Yeah, so that's a great and difficult place to start. So one of the things I often say about my career as a researcher and writer, sometimes as an academic, sometimes as a journalist, is that my colleagues and my teachers have always said that I'm very difficult to discipline, meaning that I've always been very interdisciplinary in the way that I look at work. Where all of the interest in this comes from is that I was a welfare rights organizer for 15 years. And much of the work that I have ended up writing about over the last 20 or so years of my life comes from doing, originally, a long project of participatory action research with a group of 90 poor and working-class women who lived in a residential YWCA in my hometown of Troy, New York. And so many of the things that I've ended up following are questions that we developed together, in our time doing activism and participatory research and organizing. So largely, I just sort of followed where my neighbors and friends and colleagues led. And I feel just incredibly lucky to keep hitting on some really interesting questions about the relationship between technology and social justice and economic inequality in the United States.
Lane Raspberry
Certainly not everybody in social justice is participating in new technology in the way that you do. I find that rather uncommon. To what extent would you call yourself a data scientist or technologist?
Virginia Eubanks
Yeah, that's also a great question. So I sort of came of age as an activist in the Bay Area, in San Francisco and Santa Cruz, in the mid-90s. And I'm going to date myself very terribly right now; everyone will know exactly how old I am. I started out in this field as a contracted HTML programmer for CNET magazine in, like, 1995, maybe. So I had an interview with them that literally involved the question, do you use ordered lists or unordered lists? And this was like a very difficult technical question at the time. And I could also make a living doing this as a contractor. So I worked for CNET and a number of other places, but I was also an economic justice organizer, and it was a really complicated, interesting, troubling time to live in the Bay Area. So on one hand, I would hang out with the friends that I met doing tech work as a contractor, and folks would, only half joking, ask whether you'd made your first million dollars yet; there was this very hard push. But on the other hand, at the same time, I was watching things like public housing disappear in San Francisco, and the city was visibly whitening over the time that I lived there. And so it was really troubling to me that there was this dialogue about how everyone was going to make a million dollars, and also we were going to solve all of the world's social justice problems, and yet at the same time, the reality that was happening outside everybody's windows was just being completely ignored. And so it really forced me to a kind of crisis of conscience. And I actually sort of fled the Bay Area and moved to this small city in upstate New York, where I've lived for the last 22 years, thinking, whew, now I've escaped that. And the year I moved, they rolled out a regional economic development plan, not surprisingly for the times, this is now 1997, called Tech... Tech... oh, what was it? It's called Tech Valley. And all the same narratives and all the same talk was happening. And so that's really where I started to think, okay, let's figure out what's happening here with this story. And that's when I started doing the work that I talked about earlier with the women at the YWCA, because I really wanted to get a sense from them: how are you experiencing this supposedly new world of technology? How is it really affecting your lives? How is it affecting the services you get from the government? How is it affecting your neighborhoods? And how can we think about it better together? So yeah, I mean, that's a long answer. But I started out as a technologist, I still keep my hand in to a certain degree, but I fundamentally think about myself as someone who really cares a lot about the relationship between technology and social justice and the reality of people's lived lives.
Lane Raspberry
That's very fortunate, that you were in these hot places in their boom time and got to see them from different perspectives. Can you say something about the usefulness of being interested in technology, journalism, and social issues? And how knowledge of any one of these can make someone better at practicing a career in any of the others?
Virginia Eubanks
Yeah, so I think we tend to ask that question when we're assuming that the priority skills there are the tech skills, right? Like, oh, it's so important that you know, for example, how the inside of machine learning works before you can have any kind of meaningful critique, say, of algorithmic decision making. I'll say in my own work that I don't find that to be true. One is that I find people in community who are seeing these tools be rolled out, who often see themselves as targets of these tools, actually know a lot about them. And they may not know what the difference between an outcome variable and a predictor variable is, or why it's important to know the weights, or the difference between a stepwise probit regression and, you know, a random...
Lane Raspberry
Forest, yes.
Virginia Eubanks
(laughs) I wrote a book about it! Um, but they know a lot about the history of the use of their data in the past, they have really good guesses about what's happening now, and often they have a lot more information about how the technology is likely to interact with their community in terms of really important things like power and privilege and money. And so, yeah, I do think it's important to have a set of lenses to look at this stuff from, but I also sometimes feel like, from the tech side, there's often a kind of math washing that happens, which is like, we're gonna throw this sparkle dust at this new policy, and don't look too closely at the cloud of sparkle dust. Because one of the things I saw over and over again in my research for Automating Inequality is that people would say the systems were algorithmic end to end, and over and over again I'd find that, like, one part of the algorithm was a guy sitting in a room poking buttons, right, that there was often a human in the loop even when these tools were advertised as...
Lane Raspberry
Fairly automated, but it was still a human making decisions in the training.
Virginia Eubanks
And there's been some great work actually around that by Mary Gray and Siddharth Suri in a fantastic book that came out this year called Ghost Work, which is about the sort of oxymoron of automation's last mile, meaning the invisible human work that often goes into making technology systems seem smarter than they actually are. It's a great book, everybody should read it.
Lane Raspberry
Tell me more about these people who are the subjects or the targets of this automation. So you've got this book, Automating Inequality, and you present three case studies in here: homeless services in Los Angeles, child protection in Pennsylvania, and health benefits in Indiana. So there are these three systems. One might expect that these are three different sets of expertise, and that the systems shouldn't have anything in common, but you made a comparison both about the systems and the people who are affected by these systems. What do these have in common? And who are these people? What's the common thread here?
Virginia Eubanks
Yeah, so I was really lucky in how incredibly courageous the people who talked with me on the record were, using their real names, their real locations, their real experiences, particularly the folks who were currently in the public assistance system in some way, either receiving Medicaid or trying to access homeless services, or a parent currently interacting with Child Protective Services. They made themselves really vulnerable going on record with their stories, they took a really big risk sharing their stories, and they did so really in the hope that their stories would help other people. And so I'm really honored and really privileged to have been able to talk to all the folks I did; I did over 100 interviews for this book. And I talked to people across a wide array of positions, right, I talked to people who are building the tools, I talked to administrators of programs, I talked to policymakers, I talked to legal aid lawyers. But in every place that I worked, I started the story from the point of view of people who see themselves as targets of these systems, because I feel like, particularly when we talk about things like algorithmic bias, or about some of the human challenges of artificial intelligence and machine learning and public policy, we rarely hear from the people who are really in the trenches right now, particularly around programs like public services, which I talk about in the book as being kind of low-rights environments. There, people are often facing decisions about whether or not to share their information with the state under conditions that could be said to be officially voluntary, but not really consensual. Because if you refuse to give the state your Social Security number or other information, you can't access public services; and if you don't access food stamps for your family and you don't have enough money to feed your kids, then you open yourself up to a child protective investigation, potentially losing your kid to foster care. So while you can say, well, you know, people signed an informed consent about giving their data to the state, it's hard to say it's truly consensual or truly voluntary. And I'm really interested in the way these tools work as mechanisms of social control, and that's why I was really interested in looking at the public service system. I think there's a lot of great work out there about these kinds of tools in criminal justice, and I actually think there are really important parallels, but I've seen less work around these tools being used specifically in environments where they are aimed at shaping the lives of poor and working people in this country. And so that's really why I wanted to do those three different areas: they tend to be the programs that are targeted at our poorest, most vulnerable, and also working-class families. So different programs, right, public assistance in one place, homeless services in another, child protective services in another, in three different places, Indiana, Pittsburgh, and Los Angeles, but a lot of very similar experiences, a lot of very similar challenges, and I think some similar solutions as well.
Lane Raspberry
How similar were all the systems before the advent of automation?
Virginia Eubanks
It's a good question. Um, so one of the things that's really difficult about studying these systems in the United States is that most public service systems are federally funded, or federally and state funded, but locally run. So it's not just that homeless services is different than public assistance, which is different than Child Protective Services. It's that child protective services, actually it's called Children, Youth and Family Services in Allegheny County, is different than Child Protective Services, or ACS, in New York City, which is different than upstate New York, which is different than...
Lane Raspberry
All being federally funded?
Virginia Eubanks
Yeah, it's a little more complicated than that. But yeah, despite the fact that most of the funding comes from the federal government, or federal and state, there is a huge degree of local control over the programs. And there's good reason for that, which is that we should be responsive to local conditions. But it makes it hard to study.
Lane Raspberry
Okay, so that certainly must have been the way it was before the digital age.
Virginia Eubanks
Yeah.
Lane Raspberry
And to what extent does this advent of new technology keep the differences of local culture, and to what extent is it making everything more similar?
Virginia Eubanks
Yeah. So one of the places that came up really specifically was in Indiana. So the system that I looked at in Indiana was this attempt in 2006 to automate all of the eligibility processes for the state's Medicaid program, which is the health insurance program for poor and working-class people, cash assistance, and what was at the time called food stamps and is now called SNAP, or the Supplemental Nutrition Assistance Program. So this was a program that basically looked to consolidate the jobs of about 1,500 local, county-based welfare caseworkers who in the past accepted applications, often developed relationships with families over time, and helped them navigate through a really complicated and difficult system. So this plan automated the eligibility process, took the application online, and also moved these 1,500 caseworkers into privately run and regionalized call centers. So one of the things that happened, one of the things I heard from caseworkers there, was that moving from what was a family-based system or a case-based system in the past to the technologically facilitated system, which is known as the task-based system, really changed the nature of their work. Where before they were responsible for and responsive to a caseload of families, now they responded only to the tasks that were assigned to them in a computerized queue.
Lane Raspberry
So, excuse me, before, would these caseworkers actually speak with the families?
Virginia Eubanks
Yeah, often
Lane Raspberry
And they would also pull the same family's file repeatedly over time?
Virginia Eubanks
Often
Lane Raspberry
And be familiar with that.
Virginia Eubanks
Yep.
Lane Raspberry
And what's the situation after?
Virginia Eubanks
So the situation after is that basically all the caseworkers were moved to a centralized, regional call center that was often 50 or 100 miles away from the place that they lived. And rather than developing a relationship with a set of families over time, they just responded to whatever task dropped into their queue on this workflow management system. And so one of the real concerns that caseworkers had, to this question about changing the local nature of the job, is that, say somebody was applying for food stamps and it looked like they weren't going to be eligible. In the past, the caseworker could say, oh, hey, looks like you're not going to be eligible, but there's a food pantry in your town, right? It's open Tuesday nights, you should go down, and you should try it early in the month, because late in the month they run out of food. So they had that kind of local knowledge about the place that they live. And after the change, this was one of the major changes for caseworkers, they felt like their source of knowledge, their expertise, was no longer useful in this system, and it had a really profound impact on the families who were trying to access the system. So in the first three years of this experiment of automating the eligibility system, the state denied over a million applications for public assistance. It was a 54% increase from the three years before the automation plan, and it had really profound effects on families. One of the stories I tell in the book is the story of Omega Young, who was an African American woman from Evansville, who missed a phone appointment to recertify for Medicaid and tried to call her local caseworker to let her know that she couldn't be at this telephone appointment because she was currently in the hospital suffering from terminal cancer. Despite her attempts to reach out and to tell people why she couldn't make the appointment, the new automated system, because she didn't make that appointment, said that she had failed to cooperate in establishing eligibility for the program and kicked her off Medicaid as she lay in the hospital dying from ovarian cancer. So she actually died March 1, 2009. And the very next day, she won an appeal of her denial, and all of her benefits were restored the day after she died.
Lane Raspberry
Because she lost her human advocate, she was having her case negotiated by a machine, such as it is.
Virginia Eubanks
And it meant she lost things like free transportation to medical appointments, it meant she was in danger of being evicted from her house, all this while she was trying to stay as healthy as possible through the last years of her life. Her family is really clear that they don't blame the state for her death. But it is really clear that she suffered needlessly in the last several years of her life fighting the system.
Lane Raspberry
You were generous or optimistic saying that when these caseworkers moved to the call center, they were taking their local knowledge and moving 50 to 100 miles away to a call center. Is there any effort, or any reason to believe, that these task centers pull people from local communities, perhaps to advocate for the local communities? Or?
Virginia Eubanks
Yeah, that's a good question. So I didn't ask anyone that specifically, but I did talk to some folks who had been caseworkers for a really long time, 25 or 30 years by the time I talked to them, and had weathered the change, had gone through this process.
Lane Raspberry
So being retrained to use this automated system.
Virginia Eubanks
Yeah, exactly.
Lane Raspberry
Leaving the old skill set behind.
Virginia Eubanks
Yeah, yeah, I mean, what they would tell me is that it felt like... One of them, Jane Porter Gresham, said, you know, I'm a social worker because I care about people, basically. And she said, if I wanted to work in a factory, I would have worked in a factory, right. And what she felt like she was doing after the automated system was trying to fit widgets into this box. And everyone who comes to you when you are working in public services, working in public assistance, comes to you because they've suffered some kind of trauma. You know, there's been a fire, they've lost a job, someone's gotten sick, right, something fairly awful has happened in their life.
Lane Raspberry
These are hard jobs that take talented people and don't pay a lot. So you have to have a reason to want to be there.
Virginia Eubanks
Yeah. And she said that really what she felt, before the automation, is that her job was about looking people in the eye and letting them know that it can get better. And that was the thing that drove her out of the profession. She basically retired after the automation. She told me she tried to hold on as long as possible, but in the end, it was really affecting her health, and she ended up retiring.
Lane Raspberry
You started this book with quite a long historical narrative, the history of social welfare, activism, labor movements. And I took that to be an unusual place to start.
Virginia Eubanks
I thought so too.
Lane Raspberry
All right, I was gonna ask you. How original was that idea? Do you have any reason to believe that anyone who was designing these automated systems was looking to the precedent of other interventions in their design of these new systems?
Virginia Eubanks
Well, this is the moment that I always thank my wonderful editor at St. Martin's, Elisabeth Dyssegaard, because originally the book started with a 93-page policy history. And she was like, oh, Virginia, please God, don't do that, please. And so she encouraged me to get it down to a svelte little 25 pages of history. But I actually think that history is really important, because we have this way of thinking about new technical systems kind of like the monolith in 2001: A Space Odyssey, like it just comes from space, it has no context, it just lands on a blank slate and changes everything. And that's not the reality of how technology is integrated into society. There is a kind of deep social programming around these systems. And that deep social programming I think of as the legacy programming of these new systems, which runs underneath the actual programming of these systems, often in ways that are unacknowledged. And that's the thing: I don't think there's anything necessarily dangerous about technology per se. What I do think is dangerous is that this technology, in the cases that I look at, particularly in public assistance systems or public service systems, is politics pretending it's not politics. And that's what's dangerous about it. The danger is this idea that these tools are fundamentally neutral, that that neutrality is actually what we want as a value, and that efficiency and cost savings and speed and integration are our most important values, rather than values like self-determination, dignity, fairness, equity. And so the reason I started the book with a history of the poorhouse in the United States is because the poorhouse informs all the decisions we make in public policy today, whether we acknowledge it or not. And so in the book, I use this metaphor of the digital poorhouse to talk about these algorithmic tools and predictive models and machine learning and artificial intelligence as being part of that history. So the way I talk about the tools in public services is that often they're much more evolution than revolution. And in order to make better decisions about how we build these technological systems, we have to understand that history. And so I'm sitting in Charlottesville, Virginia, right now, and you have in Charlottesville a street called Poorhouse Road. And the reason you have a street called Poorhouse Road is that's where the Albemarle County poorhouse was. And this is the weird way I get to know all the new towns I go to, which is that the first thing I do is look up where your poorhouse was, and I often try to go visit, and I often leave flowers in tribute, because many poorhouses had unmarked graveyards, so there are still graveyards of generations of poor and working people. I believe that where your poorhouse was is now a country club, but I have not confirmed that. So give me till tomorrow and I'll let you know for sure. Wait, I wrote it down...
Lane Raspberry
Goodness.
Virginia Eubanks
The Glenmore Country Club. So the original poorhouse was called the Albemarle Inn, and it was two miles east of the Keswick depot. I don't know where the Keswick depot is, but I'm going to find out. As of 1899, it had 43 inmates. Most of these were built in the 1840s and lasted; the one in my hometown was standing until 1956.
Lane Raspberry
Inmates is the appropriate term for the residents?
Virginia Eubanks
That's what they call them. Yeah, that's what they call them.
Lane Raspberry
I see. You've talked about the human connection in these systems, the importance of keeping the people from the local communities, with the local knowledge, involved in all these new interventions and new technologies, and you said how the history of previous human efforts evolves into whatever the new technology is. You seem to have struck a chord with many media outlets. So for your book, Automating Inequality, Slate magazine, The Atlantic, and Jacobin have interviewed you, you've given interviews to them. Commercial companies, unusually, have asked for your interview, at Surge and Sumo Logic; Open Mind on City University of New York television. You've presented at different universities, Berkman Klein at Harvard, Philanthropy and Civil Society at Stanford, and given interviews for them as well. And on different radio shows: NPR's All Things Considered, The Majority Report with Sam Seder, and This Is Hell on WNUR in Illinois. You've had quite a tour with this book. And I find that unusual and exceptional. Why do you think so many media outlets are interested in this story that you have to tell?
Virginia Eubanks
Yeah, so I think we're in the middle of, like, people are now talking about this moment that we're in as sort of the tech backlash. And I'm not sure that that's a fair way to describe the moment we're in. But what I can say about the chord that the book struck is, I feel like the book came out at exactly the moment where people who did not have a lot of direct experience with these systems, particularly in public services, started to get really anxious about their own interactions with tech, with data-based decision-making systems, in their daily lives. So...
Lane Raspberry
Are you talking about the actual workers who are using these systems?
Virginia Eubanks
No, I'm actually talking more about the wider professional middle class, the sort of reading and worrying public, who all of a sudden are like, wait, what is Facebook doing with my data? And one of the things that I was really interested in doing was sharing these stories from people who have been dealing with these kinds of data extraction and invasion of their privacy and exploitation of their labor through technological means for really 20, 30, 40 years. At this point, there's a lot of wisdom there. And there's also a lot of suffering that needs to be addressed. So one of the stories I tell at the very beginning of the book is about a woman I worked with, this is back in 2000 now, who was part of a team of folks who co-designed a tech lab with me at this YWCA in my hometown. And one day, we were sitting around, just sort of shooting the breeze about technology. And this was right after electronic benefits transfer cards, EBT cards, came out, and they're these ATM-like cards that states increasingly have been loading public benefits onto. So if you get cash assistance or SNAP, those benefits will go on this card rather than being in paper stamps. And so we were talking about EBT cards, and I asked her, I said, Dorothy, what do you think about yours? It's pretty new. And she said, you know, there are some things it's better about. It's a little bit more convenient, maybe, right? There's maybe a little less stigma when you use it in the grocery store, she said, but, you know, one of the things that really bothers me is that my caseworker, because it's a digital card, can call up a list of every place that I've been, every place that I've spent money. And so I'll go in and talk to my caseworker, and she'll say to me, for example, why are you spending all your grocery money at the corner store, don't you know that going to the grocery store is less expensive? And I must have had this really shocked look on my face that this was something she was having to deal with, because she kind of grabbed my knee and laughed at me for like a solid two minutes for being naive. And then she got a little quieter, and she looked at me and she said, oh, Virginia, you know, you all, meaning sort of professional, middle-class people, you all should pay attention to what's happening to us, because they're coming for you next. And I really feel like that moment, in a lot of ways, was the moment that I really started to do this work in earnest. Not because I was concerned with warning middle-class people about what was coming, but because I realized how different people's experiences with these systems are, depending on their relationship to larger patterns of power and privilege. And it's not just poor and working-class families who face this kind of interaction with the tools of the digital revolution: migrants, communities of color, people who live in public housing, right, any place where there's a low expectation that people's rights will be respected is a place where the government and private entities experiment on vulnerable populations with new tools. And this, again, goes very, very far back in history, all the way back to the anti-grave-robbing laws that said that medical schools couldn't hire people to steal middle-class corpses from graveyards.
But what the state would do instead is give them, for free, bodies from poorhouse graveyards. Right. So we see this moment over and over again, where poor and working-class people are seen as experimental populations, in ways that might even push science forward, but do so at an enormous cost to people who aren't always seen as an important part of the story.
Lane Raspberry
There are quite a few worrying stories in this book. You've been around and talked with a lot of people at this point. I want to ask you about two responses. One, what do you think of the narrative in your book? Is it hopeful, scary, neutral information?
Virginia Eubanks
It's not neutral information. I mean, I'm not a neutral reporter. As a reporter, it was actually one of the things that was really important to me, to not go after the worst-case-scenario stories at all. And while, I mean, Indiana's not a great story, that might be a black hat story, in both Los Angeles and in Allegheny County, the tools that I talk about there, coordinated entry in LA and the Allegheny Family Screening Tool in Allegheny County, are some of the best tools we have, not some of the scariest. So by best I mean that the designers have often done everything that progressive critics of algorithmic decision making have asked them to do. They've been largely transparent about what's inside their models or their tools. They've put the tools in public agencies, so there's some kind of accountability, democratic accountability, around the tools. And in some cases, they've even used some degree of participatory design with people using the tools, to develop them in a way that's fair and thoughtful and takes lots of different people's knowledge into account. And I did that on purpose, because I think it's quite easy to pick three case studies that are just shooting fish in a barrel, and it'd be really easy to write simply a story, or a set of stories, that just makes people paranoid. That was not my intention at all. And I don't think that's what the book does. I think what the book does is raise really important questions about the social justice issues at the heart of the kinds of technological decision making we're making around economic inequality today, and bring the voices and the critiques of those people who are really facing the pointy end of the automation stick into public dialogue. So I don't think that I'm at all selling fear in the book so much as acknowledging that there are some really tough experiences and some really smart critiques from people who see themselves as targets of these systems. I think overall the story that the book tells, while not necessarily easily optimistic, is a rousing one, right? One of the things I say over and over again in the book is that systems like these don't crack down on poor and working people because they are helpless and vulnerable. They crack down on us because we're powerful, and scary, and we have the numbers. And it's an attempt to control. And you can see, if you look at the history, that the tools of social control of poor and working people change at the moments when we are feeling our power. So I actually think the story is quite, if not optimistic, hopeful, rousing. And it connects these issues to other kinds of social issues that movements are organizing around, in ways that I think are really meaningful for how we move forward.
Lane Raspberry
I'm glad that you felt a rousing public response.
Virginia Eubanks
I feel aroused by it, but then I'm biased.
Lane Raspberry
Let's talk about how to make people more roused. So we're at a university right now, there's so many faculty and staff here talking to students about predictive analytics and all these issues that you raise. What role do you see for universities to play in this discourse about automation? How do we rouse people?
Virginia Eubanks
Yeah, that's a great question. So I mean, I'm currently a half-time investigative journalist and half-time in the political science department at SUNY Albany. I feel like it's an enormous privilege to have conversations with young people in classrooms and other places on campus. I feel like universities have extraordinary resources for having conversations, but also, just in terms of actual resources, they hold on to a lot of the tools, a lot of the finances, sometimes the networks that are meaningful for this kind of work. I think what I've learned after being in universities, are we going to include grad school? If we're including grad school, I've been in universities on and off for 22 years. What I've learned is that the fatal flaw of universities getting involved in this kind of work is that they come in thinking they hold all the expertise, and that what they're doing is charity, is giving kindly of themselves and their resources to, for example, the communities in which they're situated. And I think that is enormously dangerous, and a way to really reproduce these kinds of problems rather than really interrupt them. I think universities need to come at this with a kind of modesty that acknowledges that we have very specialized and important expertise, but it is only one form of expertise, and often we make really big mistakes when we move into doing work that has really profound, direct, concrete impacts on the ways people keep their families safe and healthy, or fail to. And so we really need to learn from the people who are most affected, we really need to learn to produce structures that force us to share power, even when it's uncomfortable, including things like power over research questions. That was one of the biggest challenges for me as a graduate student, when I first started doing participatory action research.
Lane Raspberry
You've mentioned that several times, participatory action research, the importance of it in everything.
Virginia Eubanks
Yeah, so it was important to me. And it's not a specific method at all; you can do any kind of research method with a participatory action research orientation. The thing that is different about it is you're doing research with communities rather than on them. And there are actually some really challenging things in that for academics, including things like giving up a certain degree of control of your research agenda. So instead of, which is what I did when I first started doing it, going into a community and saying, I'm here to help, right, like, the problem is the digital divide, so let's build technology resources. That's basically what I did at the Y. And two years into this project, bless them, the ladies of the Y sat me down, and they were like, Virginia, we love you, we think you're really sweet, but all of your questions are stupid and have nothing to do with our lives. And it was just such an incredibly generous thing for them to do. And it totally shifted the way I do this work. I realized I was coming into the work with preconceived notions about what people's problems were, that weren't their interpretation of what their problems were, and that I needed to enter the space with more humility, more modesty, and really be willing, able, and excited to learn from folks who had experiences I didn't have. And so yeah, in the long run, I think that's one of the luckiest things that ever happened to me, the intervention from the women at the Y.
Lane Raspberry
I'd like to describe a situation for you, and I'd like you to respond in some way: is this an accurate model? And then I've got a question about it. So there are governments and corporations who are designing these automated systems, and there are the people who are targeted by them. I sort of imagine that universities are kind of a third party in this, and somehow universities could do something to have the corporations or governments shift some of the power or control to the people who are targeted by this. Is that a role that universities could play? And if so, what's an obvious way that universities could help transfer power?
Virginia Eubanks
Well, so I'd love to see that. I mean, what I mostly see right now is the opposite, which is that academics are producing these tools for the state, and then often spinning them off into entrepreneurial endeavors that become private companies. So if you look at PredPol, for example, or other kinds of predictive policing technologies, that was produced at UCLA, or USC, I don't remember which, sorry, and then spun off into...
Lane Raspberry
A company, a very big one.
Virginia Eubanks
Yeah, a private entity. So I would love to say that academics, you know, are on the side of the targets of these tools, but often we're not. And I think that there are really smart, really well-intentioned people working in the space of producing technologies for the state, and doing great work around that. But there are equal numbers of folks who maybe don't know as much about the policy area, and the history of the policy area and power, as they should. And then they're walking into situations, and then they're surprised when their work is used against what they think the original intention was. And so one of the things I talk about a lot when I talk about the book is, how do we build these tools in ways that harden their ethics, so that they can't be used against their original intention? I'm not sure that that's possible. And I've found many of the folks, many of the academics working in this space, haven't thought a lot about that. I actually asked the academic team that produced the Allegheny Family Screening Tool, I asked them directly: right now this tool produces a score that basically rates every family with a number from zero to 20 that says how high a risk they are for abusing or neglecting their children. And everybody right now, the administration and the designers of the tool, say, we never use this to take kids out of their families. We're only using it to make sure that they get extra resources if they need them.
Lane Raspberry
What happens in practice?
Virginia Eubanks
And I asked, you know, so right now, largely, people trust this administration, and the community has sort of given the administration the flexibility to experiment with this. But what happens if you get a new administration in who is not oriented towards the community in that way? I asked the academics, in that case, what would you do? And I remember they told me, oh, well, in that case, we try to put something in our contract that says, if they use our tool against the intention that we really wanted it to be used for, then we'd have the right to say something about that. And I was like, but that doesn't seem like actually building in a lever where we can say, look, this is not acceptable, using this now to risk-rate every family in the county and to pull kids out and put them in foster care is unacceptable. Is there some way we can install kill switches in these things, so they can't be used in ways that we find personally morally reprehensible? And that's not something we're thinking about as much as we should be.
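To make the kill-switch idea concrete, here is a purely hypothetical sketch, not anything Eubanks or the Allegheny team describes actually building: a scoring tool that refuses to return a score unless the caller declares a use matching the purposes its designers intended. Every name in it (screen_referral, ALLOWED_PURPOSES, the purpose strings) is invented for illustration.

```python
# Hypothetical sketch of a "kill switch" on an algorithmic risk score.
# Not the Allegheny Family Screening Tool's actual code; all names are invented.

ALLOWED_PURPOSES = {"prioritize_supportive_services"}  # uses the designers intended


class PurposeError(Exception):
    """Raised when a caller declares a use the tool was never meant for."""


def screen_referral(risk_score: int, declared_purpose: str) -> int:
    """Return the precomputed 0-20 risk score only for an allowed purpose."""
    if declared_purpose not in ALLOWED_PURPOSES:
        # The "kill switch": refuse outright, rather than relying on a contract
        # clause that lets the designers object after the fact.
        raise PurposeError(f"Refusing to score: '{declared_purpose}' is not an allowed use.")
    return risk_score


# Example: a removal decision is refused, a service-prioritization request is not.
# screen_referral(14, "removal_decision")                 -> raises PurposeError
# screen_referral(14, "prioritize_supportive_services")   -> 14
```

Whether a gate like this could really be enforced once a tool is handed to an agency is exactly the open question raised above.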
Lane Raspberry
Alright, so that's a good description of the university relationship, perhaps long term, the way universities relate to communities over the long term. A shorter-term intervention that universities have is that we continually get students in and send them out into the workplace. We're a school of data science; what kind of skills or conversations should we be having with the students who are not going to be part of academia and who are gonna go out into the world?
Virginia Eubanks
Yeah, so it's so interesting. I speak a lot to groups of students who are in data science. And when I do lectures, pretty much the first or second question I always get asked is basically some way of phrasing, give us a five-point plan for not doing the things you're talking about in the book. And I always say, you know, I wish I could give you a five-point plan that's like, oh, here's how you protect against this happening. But the reality is that the systems I'm describing go deeper than technical tweaks; the systems have existed for a really long time, and they're really deeply embedded in our culture and our politics, and they're going to be really hard to shift. If we are trying to push the world towards social and economic justice, that is long-term, long-haul work. So data scientists who care about social justice are like a special kind of unicorn; I'm always looking to find ways to wrap them in bubble wrap and protect them, because it's such an incredibly important set of skills. But an equally important set of skills and knowledge is really knowing about the people your tools are most likely to impact. So if you're working on a tool that is going to be used in homeless services, you best go out and meet some unhoused people. And that's on us as people who work in the field, to really know enough to be able to predict what might happen with our tools before we engage in that work. And I know that's actually a really high ethical bar to set for folks. But I also think, given the world we live in, that it's a really necessary one.
Lane Raspberry
All right, I want to ask you a critical question. So there's this ongoing conversation about whether we should be more bold in automating more systems, or whether we should be more hesitant. You answer the question in different ways. But I wonder, in all your research, can you tell me what evidence gaps you see, where people make arguments on shaky evidence in one direction or shaky evidence in another? What kind of information do you wish existed, or do you find to be absent?
Virginia Eubanks
That's great. I mean, there are some examples of folks who are actually doing the data work to fill some of those really important gaps. I don't know that I'm going to come up with the name of this project off the top of my head, but some folks who are related to Data for Black Lives, for example, have been compiling officer-involved shooting data nationwide. And that's just a hugely important data need. So those kinds of projects are happening, and I think they're really important. The other thing that I often say, and I think is really crucial, is that if we have all the data scientists and technologists we need to build systems like the ones I talked about in the book that divert people from public assistance, then we have all the data scientists and information we need to actually reach out to people who are eligible for public assistance and are not getting it, in order to facilitate them getting public assistance. So in some states, the receipt rate for something like cash assistance is like 7%. So like 93% of people who are eligible for cash assistance in Indiana don't get it. But again, that's a political change. Like, we currently can't do that; in some states, you can't even advertise that food stamps exist, because the states are so focused on diverting people from programs rather than bringing them in that they think advertising a program would be a failure of the program, because people would then use it.
Lane Raspberry
So wait, let me ask you, there are all these systems which help screen people who are accessing public benefits, and you're saying that the tendency of these systems is to divert people away, or to deny access. Can you tell me about the systems that are being designed, and where the investment is going, to make systems which recruit people into public benefits?
Virginia Eubanks
Yeah, so I'm saying that those don't really exist. I'm saying that basically we have the expertise and the information to do that right now, and we're choosing not to. Although that's not true across the board; there is, for example, a really great organization in Chicago called mRelief, lowercase m, like mobile relief. And one of the things that mRelief does that's really amazing is, so one of the scariest things for my family about applying for food stamps is that, particularly when you apply online in the state of New York, there are like 20 different screens that ask really pretty intense questions about your family and your life. And you have to fill out every bit before you hit the arrow to go to the next page. And then after you've answered questions that, you know, seem pretty crazy, how many refrigerators do you have in your house? Tell us how you prepare food. Do you prepare food together? All of this really pretty intense information about your family, at the end there's basically just a button that says go, and that button releases all that information to the state, before you even know whether you're going to be eligible for the program or not. So you're just taking a dice roll, like you're just like, okay, I hope you use this the way you say you're going to use it, and you're not doing anything else with it. And I hope I'm going to be eligible for this program. But at the point you hit the big button, you don't know either, there's...
Lane Raspberry
So much vulnerability and so much commitment for people.
Virginia Eubanks
It's a lot of trust, like a lot of trust, and most people wouldn't do it if they had any other option. So what mRelief does, which is so interesting, they do lots of different things, but one of the things they do is help people get on SNAP, on food stamps, CalFresh specifically, California's version of food stamps. They actually have you apply through their app, they hold the data and keep it anonymous, and they ping the system to see if you're going to be eligible. If you're going to be eligible, they then turn around and say, okay, it looks like you're going to be eligible, do you want us to release your information? They do a lot of other things, but that's one very simple change that is actually really profound. Because what it does is give self-determination back to people who often feel like they're in a situation where they don't have a choice about what they do with their family's information. And I think that's super crucial, particularly because these are political systems. And as political systems, they are teaching people what government is. And if what government is is an enormous, terrifying data vacuum cleaner, that's not how we want people to think about government. And that's not how we want government to operate, as this frightening, life-changing, privacy-invading system. So I mean, there are all sorts of reasons to think about doing this work better. I think the primary one is the basic inalienable right to dignity of all human beings, and organizations like mRelief, I think, do a great job at bringing basic values like dignity and justice back into the work.
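As a loose illustration of the pattern described here, a screener that checks likely eligibility locally and only releases anything to the state after the applicant says yes, here is a minimal sketch. It is not mRelief's actual code or API; the eligibility rule, function names, and fields are all invented assumptions.

```python
# Hypothetical sketch of a consent-gated benefits screener.
# Not mRelief's real implementation or API; all names and rules are invented.
from dataclasses import dataclass


@dataclass
class Application:
    household_size: int
    monthly_income: float  # gross monthly income in dollars


def likely_eligible(app: Application, limit_per_person: float = 1000.0) -> bool:
    """Rough local pre-check; real eligibility rules are far more complex."""
    return app.monthly_income <= app.household_size * limit_per_person


def submit_to_state(app: Application) -> None:
    """Placeholder for the call that would actually release data to the state."""
    print("Application released to the state benefits system.")


def screen(app: Application, ask_consent) -> None:
    # Step 1: check eligibility locally, without sending anything to the state.
    if not likely_eligible(app):
        print("You likely aren't eligible; nothing was shared with the state.")
        return
    # Step 2: only after a likely-eligible result, ask whether to release the data.
    if ask_consent("It looks like you're eligible. Release your information to apply?"):
        submit_to_state(app)
    else:
        print("Okay. Your information stays with you.")


if __name__ == "__main__":
    screen(
        Application(household_size=3, monthly_income=1800.0),
        ask_consent=lambda prompt: input(prompt + " [y/n] ").lower().startswith("y"),
    )
```

The point of the sketch is only the ordering: the eligibility check happens before, and separately from, the release of any data to the state.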
Lane Raspberry
All right, so about dignity and justice. We've talked so much about Automating Inequality, I wanted to ask you about another one of your books, Ain't Gonna Let Nobody Turn Me Around: 40 Years of Movement Building with Barbara Smith. So this was an activist; could you tell me about her, and why you're interested in this person?
Virginia Eubanks
Well, so Barbara Smith is this extraordinary individual. She was one of the original members of the Combahee River Collective, which was this amazing collective of black feminists in the late 60s and early 70s that started out in Boston. And they're famous for producing a number of things, but among them the Combahee River Collective Statement, which was this really clear, beautiful articulation of black feminism in the early 1970s, where we get such terms as identity politics and, we believe, intersectionality, or at least what they called interlocking oppressions. So she's this incredibly profound, brilliant organizer and writer who later, because they decided as women of color writing for their own survival that they needed to control the means of production, created their own press called Kitchen Table: Women of Color Press, which published some of the most important books in women of color feminism in the United States, including This Bridge Called My Back and All the Women Are White, All the Blacks Are Men, But Some of Us Are Brave. She later became an elected politician in Albany, which is near where I live, and she's just an absolute hero in local organizing and in my life as a welfare rights organizer. I came into contact with her quite a lot in Albany. And basically the story is, when Barbara Smith asks you if you want to do something for her, you say yes. So my colleague Alethia Jones and I produced this book, which we think of as a sort of mixed-up reader, that raises some really important questions about how to do social justice organizing through her life, through the stories of her life. So it really starts with her civil rights organizing in the 50s and 60s, all the way through her work as an elected official in the 2000s. And it creates this kind of conversation between early writings of hers, writing about her, movement ephemera like buttons and flyers, and also historical organizing material, in ways that help us think through how to do multiracial, feminist organizing, attuned to class, now. And it's been really great to watch how this really incredible moment in racial justice organizing has been able to reconnect with Barbara's work, not just because of our book, but with that book as one of the resources. So yeah, she's just this extraordinary woman, and it was such an incredible opportunity to learn from her about organizing and about living life with integrity. She still lives in Albany, and she's still organizing, and I believe she's 76 now, and she's just still a hero.
Lane Raspberry
That's great. Can you tell me something, especially since you put so much emphasis on what she did in the 60s and 70s: how can that inform anyone who's working in new technology today? Or to what extent is that possible?
Virginia Eubanks
Yeah. So I mean, I think one of the most important things, there are so many important things, but one of the most important things that come out of that tradition is the idea of intersectionality, the idea of identity politics. Though identity politics has been given a bad name by folks who have, I think, a really flat interpretation of what identity is, one of the things that was amazing about Combahee was the way they thought about identity politics, which was just saying: as black women, many of whom are lesbians, and many of whom were working class, we have a right to organize based on our own experiences, and not to try to fit ourselves into anything else that doesn't fit us. And actually, not only does that make space for us, but it produces a better politics, because we live at so many intersections of oppression that freeing, say, working-class black lesbians means freeing pretty much everyone else, because you have to address patriarchy, racism, classism, capitalism, and a number of other things in order to free us. And so sometimes people interpret that as a sort of oppression Olympics, like you win the more oppressed you are, but that was never the intention. The intention was to say that we all live in a web of social relationships and power relationships, where there are parts of us that carry privilege, and there are both benefits and problems that come with that, and there are parts of us that carry oppression, and there are both benefits and problems that come with that. And understanding ourselves better, and understanding those intersections that we inhabit better, means that we're going to be better at politics, and we're going to actually be better at building coalition with other people. It's not about that, because, as Barbara would say, you know how limited my politics would be if I only cared about those who look just like me. That was never the intention of intersectionality, or identity politics. It was always about multi-issue politics, it was always about building coalition. And I think that matters, not just for organizing, but for doing this kind of work in technology as well. Understanding things like how power works is really important to that work.
Lane Raspberry
That's very inspiring. Thanks for sharing.
Virginia Eubanks
Oh, I love Barbara Smith. She's amazing. Everyone should immediately look her up.
Lane Raspberry
You've got another book that I'd like to mention, Digital Dead End: Fighting for Social Justice in the Information Age, which you wrote in 2011, way back in the day. I'd like to ask you what changes you've observed, because there's some similarity between the theme of that book and the theme of your 2018 book. First, could you summarize what this book was about? And then could you tell me, what have you observed as a significant change in that time?
Virginia Eubanks
Yeah, so Digital Dead End was the book about the work I did with the community of women at the YWCA, this five-year participatory action research project that I did there. And it has sort of three halves, if I can say that. So the first half is what I think about as the real world of information technology. That was really about the conversations that I had with women who lived in the community about how these ideas about high-tech development were affecting their experiences and their day-to-day lives. So how does it affect regional employment prospects? How does it affect your interaction with technology in the systems you come into contact with in your day-to-day life? And one of the key insights there was the conversation I had where they sort of corrected me, not about the EBT specifically, but about the way I was framing questions, the survey questions. So one of the lessons they offered in that work was that the problem was not that they lacked access to technology in any kind of simple way; in fact, technology was really ubiquitous in their lives. But the kinds of interactions that they were having with it tended to be exploitative and demeaning, or disciplinary. They were coming into contact with these technologies in the high-tech, low-wage workforce, in the criminal justice system, and in the public assistance system. And so the first half of the book talks about that. The second half of the book then says, okay, if the given is that folks are having these often very difficult interactions with technology in their day-to-day lives, how do we think about critical technology education in a way that takes that as a starting point, rather than ignoring that reality? And so the second half of the book talks about what we came up with, what we call popular technology, which is a way for people to come together and to talk about their interactions with technology in a critical, thoughtful, and productive way, a way that is oriented towards justice. And then the third half, the reason I call it a third half is because there's a huge appendix at the end of that book that basically just gives a ton of documentation of the kind of work we did: meeting minutes, and popular education exercises, and all sorts of stuff that gives everybody the resources they need to do the work for themselves in their own communities.
Lane Raspberry
I wish more people would publish those kinds of supplementary materials.
Virginia Eubanks
I loved that. It's like a 60-page appendix, and I shouldn't say this, but I think it's the best part of the book, because it really gives you this incredible flavor of what that work felt like. And I think what's so important to take from that is that even though we were dealing with often very difficult situations, and some folks who were really struggling to meet their basic needs, we took so much joy in the work that we did, and we all learned so much from each other. It was so inspiring to do the work. And so part of the point of sharing that material is to inspire people not to just see this as, "Oh, we have to pay attention to justice, it's so hard, it'd be much easier if we didn't have to think about this." The reality is that the work itself is hard. It's challenging, it's crucial. But it's also amazing, and the company is really good. I have become a much less cynical person and a much more optimistic person since doing this work, just because I see people who really are facing incredibly difficult situations in their lives, and nevertheless do collective work oriented towards liberation with humor and generosity and courage. And I just feel so privileged to be able to do it.
Lane Raspberry
And incorporating it into their day-to-day lives. It's not something that's added on to their work duties, it's completely a part of their routine. Yeah, that's great. Can you say something about the mood around technology and these kinds of interventions in 2011 versus 2018? If I should even be asking, you know...
Virginia Eubanks
That's a really interesting, that's a great question. Yeah, so I feel like I've been doing this work for a long time. In some ways, this is the work that I left the Bay Area to try to figure out, and I certainly haven't figured it all out. But I think one of the things I see that is really inspiring is that increasingly, social justice movements, whatever their other areas of focus and interest are, whether it's racial justice, whether it's LGBTQI work, whether it's feminism, whether it's labor, so many movements are also thinking about the way these tools impact their work. And they're thinking about it in such interesting, such meaningful, such important ways. Edward Snowden did not discover digital surveillance; that has a long racial history in the United States. It has a long class history in the United States. And this work has been going on for a really long time. One of the things that I think is so interesting about these technology tools is that they make those relationships so concrete that they can offer really powerful points for organizing, because they make the social and power relationships so visible. And so I see a lot of that work happening now, in a way that it wasn't back in 2011. Back then, it felt like we were really interested in figuring out this work, and other people were interested in it as well, but we didn't quite have the language to talk to each other yet, and we weren't really sharing experiences in that way yet. And I think that's really different today. Just last week, I was talking to the president of the Australian Unemployed Workers' Union. They've been fighting against a system there called Robodebt that has basically been trying to collect money from folks who were supposedly overpaid public benefits in the past. And he's talking about creating basically worldwide unions of the unemployed, but organizing around these kinds of tools. The UN Special Rapporteur on extreme poverty and human rights, just in October, produced an amazing report about the relationship between artificial intelligence, machine learning, and the digital welfare state that I think is going to be an incredible resource for organizing. I just feel like the conversation is so much more sophisticated than it was a couple of years ago. The way people are thinking about resistance and self-defense is really different than it was a couple of years ago. I feel really excited about the way these conversations are going.
Lane Raspberry
I appreciate the positive attitude you've had about everything we've discussed this evening. Is there anything that I've failed to ask you that I should, anything more that you'd like to say about any of this?
Virginia Eubanks
No, I just want to reaffirm how important it was that the folks I spoke to, the families I spoke to who are targets of these systems in Automating Inequality, took what was an incredible risk, and how grateful I am that they trusted me with their stories. It's an incredible mitzvah, an incredible gift of generosity and kindness, that they agreed to make themselves more vulnerable so that other folks would feel less alone. And so I always try to reaffirm their incredible generosity when I'm talking about the book.
Lane Raspberry
It's like you said, you have some rousing ideas that inspire people and make them dream of a better world. I have to thank you so much. Thanks for speaking with me.
Virginia Eubanks
Yeah, that's great. Appreciate it.
Monica Manney
Thanks for checking out this month's episode. We'll be back in March to bring you another conversation. In the meantime, if you haven't already, check out Season One of UVA Data Points. We're currently working on a second season, so keep an eye out for an announcement about this later in the year. Also, if you liked the podcast, let us know. Give us a rating and review. If you need to contact us, send us an email at uva data [email protected]. We'll see you next time.