Fighting Data-Driven Inequality (Guest: Virginia Eubanks)

Podcast Name: Masters of Data
Episode Name: Fighting Data-Driven Inequality (Guest: Virginia Eubanks)
Ben: Welcome to the Masters of Data Podcast, the podcast that brings the human to data. I'm your host, Ben Newton. At this point in the digital age, computers and algorithms have worked their way into every aspect of our lives, but this technology is created by humans, so it has a tendency to amplify both the good and the bad about our society. Our guest today has been on a long crusade to raise awareness about how our digital tools are continuing and even exacerbating the problems we already have around poverty and inequality.
Ben: She is the author of the book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. So without any further ado, let's dig in. Welcome everybody to the Masters of Data Podcast and I am very excited to have Virginia Eubanks with me here today. Welcome, Virginia.
Virginia: Hi Ben. Thanks for having me.
Ben: Virginia, you're an Associate Professor of Political Science at the University at Albany, SUNY and you are the author of a book that I really enjoyed called Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. I'm really excited to talk about what you're working on in the book today, so it's great to have you here.
Virginia: Yeah, this is going to be fun.
Ben: Like we were talking about before, I always love to start out and understand the person. I mean, it's not just about what you've been working on, but also how you came to be where you are and the experiences that shaped you, so talk to me a little bit about where you came from. How did you become a professor? Why did you get interested in kind of the political activist side of things?
Virginia: Yeah, so I have a super weird series of origin stories, but I'll keep it to the most relevant to the book. So my activist background has been for 20 years or more now in both community media and community technology centers, so I come out of this tradition really in the mid-'90s when I was living in the Bay Area of California of thinking through what it means that access to high-tech tools is unevenly distributed.
Virginia: So one of the things I was involved in as a younger person was in building community tools in collaboration with the communities that would be using them. I did that work for a really long time. I feel like that work is incredibly important, so it was really sort of oriented around this idea of the digital divide and closing the digital divide. Like I said, I think that work is super, super crucial and there's still great organizations doing that work, but also one of the things that happened to me early in my life, in my sort of thinking and writing life, was that I worked with a community of women who live at a residential YWCA here in my hometown of Troy, New York who really challenged this idea that they lacked access to technology.
Virginia: It became just really, really important to the way that I think about the relationship between technology and social change. Primarily what they said is like this idea that we lack technology is not right. In fact, we have a ton of technology in our lives every day. We interact with it in the low wage workplace and in the public assistance system and in the criminal justice system and in our homes if we live in public housing and in our neighborhoods.
Virginia: So it's not really true to say we don't have interaction with these tools, but the reality is most of our interaction with them is actually pretty negative. That specifically was the case for folks when they talked about the welfare system. So I usually tell this little story as one of the origin points for Automating Inequality. This story is that I was sitting with a woman who I had worked with for many years to develop these technology resources at the YWCA. She goes by a pseudonym in the book, Dorothy Allan.
Virginia: So Dorothy and I are sitting in this tech lab that we have built together one morning. We're just sort of shooting the breeze about technology and I had asked her about her EBT card, her electronic benefits transfer card, which is like the debit-like card that you get public assistance on if you get cash benefits or food stamps. So I had asked her about her EBT card and I said, "You know, a lot of people, really they find them more convenient. There's like maybe a little less stigma when you're in the grocery store. You're not pulling out actual paper food stamps. Like how do you feel about your EBT card?"
Virginia: She said, "Oh you know, that's mostly true. I guess it's a little more convenient. I guess, you know, it still feels like people can tell that you're using an EBT card, but yeah, I guess it's a little bit less than actual paper food stamps." Then she said, "But my case worker uses them as a way to track all of my spending, and so she's actually also tracking all of my movements." I must have looked super, super shocked because she kind of like laughed at me for a couple of minutes for being naïve.
Virginia: Then she got a little more quiet. She was like, "Oh Virginia, you all," meaning professional middle class people, "You all should pay attention to what's happening to us because you're next." That was 2000 so that was 18 years ago, and I feel like Dorothy's insights are really, really important to anyone who wants to do a sort of technology analysis or thinking well about technology in relationship to inequality in this country because she both was incredibly generous, right? I think that was such a good move, such a good hearted move to say like oh, this is already happening to us, but I'm also concerned about other folks because we actually share these experiences.
Virginia: I think that was really generous, but also I think the most important thing, the reason that Dorothy's voice is always in my head when I start new projects, is because she made it really clear that the folks who are the real experts are the folks who are facing sort of the least controlled technologies in the most direct ways. Like in ways that it really is impacting the quality of their everyday lives, that it's really impacting their ability to meet their basic human needs for things like shelter and safety and food and family integrity.
Virginia: So she really forced me to rethink like who are the real experts that we need to be talking to? So we need to be talking to policy makers and we need to be talking to data analysts and we need to be talking to the economists who are building these models, but we also have to be talking to people who see themselves as the targets of these systems. Not only because it's the right thing to do, the morally right thing to do, but because they have better information.
Ben: It's really interesting when you say it because I do remember reading that story in the book and I had to take a pause because I was like wow, that's pretty profound just the way she phrased it. But one thing that you were saying there about understanding the people on the ground, it's really interesting because I feel like there's a parallel right now going on in how people are trying to design technology.
Ben: They talk about understanding the experience of the people using the product with design thinking and all these other things that are going on, but we use it in that aspect but then we're not using it in ways like with in this case with Dorothy and these people that are using the benefits and EBT cards, we're not actually applying it in that case. It is really interesting. We acknowledge it in one area but we don't acknowledge it in another.
Virginia: Well, we acknowledge it when it pays because it means that users will be better customers. Right?
Ben: Yeah. Fair enough. Fair enough.
Virginia: Yeah. So I actually think that this is a major area where class really matters to thinking well about technology because we have this tendency, I mean we at large as like folks in the nattering classes who like talk and write about issues like this, we have a tendency to rely on our own experience as a guide for thinking through issues around technology and that makes perfect sense, but it's also a little bit lazy.
Virginia: What that means for sort of technology advocacy and analysis is that we have a tendency to assume middle class experiences with technology and the primary middle class experience with technology is as a consumer, right? You're a user of apps or software. You're a consumer of hardware or the internet of things, and you get at least some degree of power as that kind of consumer. I don't think that is actually the best way to describe the interaction that folks have with technologies in the public assistance system because they're in positions where often they can't make a meaningful choice whether to engage with the systems or not.
Virginia: So it's easy to say, for example, okay, if you don't want to give 50 pages of your most incredibly intimate information to the state with no idea how they're going to use it, then just don't ask for food stamps, right? Just don't get public assistance. But that basically ignores, as Khiara Bridges has done great work pointing out, the fact that the Child Protective System exists and that if you don't have something like food in the fridge or heat in your house, that actually is textbook child neglect and you face losing your children and losing your family integrity.
Virginia: So that's not a meaningful choice. It's not the same as choosing whether to use Google or DuckDuckGo, right? The choice is really different in its intensity and its nature, and unfortunately we have a tendency to assume that most people interact with technologies in this more value-neutral way, as consumers. It's not that consumer privacy issues, for example, aren't really important.
Virginia: They're hugely important. We absolutely have to be paying attention to it, but that framework, that way of thinking about these technologies doesn't work for the folks I talk to. We need to think better about the way these tools work across the different kinds of experiences we all have.
Ben: That makes a lot of sense, and I remember we were talking earlier about the episode I did a little while ago with Cathy O'Neil, and I remember reading her work and being taken aback as she describes what she calls the scary algorithms, the ones that are actually affecting people's life choices. I think there's a tendency for those of us in kind of the middle class, upper middle class, whatever, to think like well, technology's kind of benign, it's just a choice I have. But then to think how technology is now directly impacting people's lives in very profound ways, in ways that they can't control, I thought was really interesting.
Ben: One thing too with going through your book, Virginia, I think what I really enjoyed is I thought you brought a very human side to it. You know, the story you just told and you opened up the book with your own personal experience with when you explained about what happened with your partner and the whole insurance fiasco, I think that really hit home to me too because it's like wow, that could happen to me. I could imagine that very easily.
Ben: I think that was a really interesting way to start the book because you're humanizing it. Was that part of the reason why you wrote the book that way as well, or was it really just kind of to drive it home and bring some more emotion to it?
Virginia: Well yes. The reason I framed the book the way I did is because I just think the kinds of voices, the folks who see themselves as targets of these systems have largely been ignored both in the ways we define the problems and in the ways we imagine solutions. I think that that's a mistake. Like not only again, like it's not just immoral. It also is just empirically wrong.
Virginia: So I have a real commitment to telling the truth about these systems, and that means telling all of our truths about the system, not just those of the folks whose experiences are like our own. I think we have a tendency to have kind of a future orientation around the problems that might be created by technology, right? So we have a tendency to think ahead, like what might happen, and we stay out ahead on the leading edge of the sort of most interesting, difficult science fictiony problems.
Virginia: An automated car, given the choice between hitting a box of puppies and a bicyclist, you know, which one will it hit? Again, not that that's not an important problem, it actually really is, but it is not necessarily something that's happening to people right now. In the cases that I talk about in Automating Inequality in the social service system, these are all real impacts that are happening to real people right now.
Virginia: Because actually this stuff, these kinds of tools have been integrated into the social service system since the late '60s and early '70s and that's profound. This has been happening for 50 years already. So projecting into the future for me feels pretty dismissive of the real concerns people have about how these tools are affecting their lives right now. So the goal was really to highlight the voices of folks who are being most impacted and make sure that they are able to sort of set the terms of the conversation.
Virginia: Of course I like then went on to talk to all sorts of folks, policy makers and lawyers and folks who design systems and economists and data scientists, but I really wanted to make sure that the voices of folks who are most impacted by these systems were centered in the conversation.
Ben: That's great work you're doing there because at the end of the day, and I think you even mention it in some ways, there's a tendency to kind of put up these emotional barriers. Like okay, it's those people, there's a system going on over there, I don't actually have to connect with them on a human level. But then when you hear these stories, and you go through a few in the book, just those very human interactions with the technology, I think that brings it home because then it actually touches you in a different place. You can't really ignore it then. It actually penetrates.
Virginia: So before this book I was primarily an academic writer, and I've transitioned in the last five years or so to being what I think of, at least primarily, as an investigative journalist. Part of the reason I did that is because I've been doing economic justice activism and research around economic justice for 20 years, and no matter what the outcomes of the research were or how good our organizing strategy was, we always got stuck in these very frustrating and inaccurate stories about poverty in the United States.
Virginia: So it really, this book was very much a very explicit attempt to tell the story in a different way that might reach people in terms of their emotions and their sense of connection to other people rather than sort of telling them what they should think. That required a lot of retraining for me because academics are not good at telling stories without telling you what to think, so I really had to work on that. But you know, the upside, and this has been part of my work for a long time, but the upside is that I just feel so lucky and so grateful to have been able to talk to all these really incredible people.
Virginia: So just smart, scrappy, funny, brave, deeply courageous folks who shared their stories with me. I want to make sure that people understand what an incredible risk they took going on record using their real names and telling their stories, because the great majority of the affected families I talked to in the book are currently on public assistance, are currently unhoused, or are currently undergoing an investigation for child neglect or abuse, so they were actually taking an enormous risk sharing their stories with me.
Virginia: That takes just so much courage and I just want to make sure that people really understand what a gift that is to give to the folks who are reading the book.
Ben: Yeah, and I would say, being completely honest, when I first sat down to read the book, that is not what I expected. In a good way it hit me, it penetrated more deeply that way, because I didn't expect to see the personal stories. It definitely does bring it home. You know, one thing to ask about: in the book you talk about the digital poorhouse, how we've basically taken old conceptions and tendencies and kind of digitized them and perhaps amplified them.
Ben: So now you've been out telling this story, people have read the book, you've been able to get this out there. What's the experience been since you got this out? What have you been seeing? How have people been responding to it? Have you seen any changes? Does it change your thinking in any way? What have you seen?
Virginia: So there's two things there. I want to say a little bit about both of them. The first one's about this sort of metaphor of the digital poorhouse, and I find it a really useful metaphor and so I just want to say a little bit about it. So the idea there is we not only have a tendency to talk about technology as if the problems are all going to happen in the future, we also have a tendency to talk about it as if it came from nowhere.
Virginia: So like it just like arrived like the monolith in 2001: A Space Odyssey from space and it landed on like blank earth and changes everything. We have this story we tell ourselves about technology. Of course that's not how it works at all. Technology's built by humans. It carries with it human assumptions and preoccupations. It is a deeply social product and it then sort of loops back to affect the culture that it emerges from.
Virginia: So I use this phrase the digital poorhouse to really contextualize the systems that we're seeing in public assistance, and the reference I'm making is to the actual brick and mortar poorhouses, which were these institutions for incarcerating poor and working-class people who asked for help from public aid. So in order to receive any kind of public aid you had to agree to basically be imprisoned, to give up your rights if you had them, and we're talking 1820s here.
Virginia: So if you had the right to vote or hold office you had to give that up. You weren't allowed to marry and often you had to give up your children, because the theory at the time was that poor children could be kind of remediated by having access to middle class families. When they said having access, they generally meant working for free as domestic or agricultural laborers.
Virginia: So these were these brick and mortar institutions that held anywhere from a handful to thousands of folks for any amount of time from a couple of days to 30 years to the rest of their lives. They were pretty horrifying institutions, and the reason I use them as a sort of grounding point for the book and for these new technologies is they really represent this moment in our history where we decided that the primary goal of social service systems should be to decide whether or not people are deserving enough to get help so that they act as moral thermometers rather than universal floors that support us all.
Virginia: So that I think of as some of like the deep social programming, like the legacy code that is integrated into all of these systems is this political decision we made 200 years ago now that says the first thing to do is decide whether or not you deserve support rather than understanding some basic things as like human rights that everybody deserves no matter what they do. So that is baked in to all of these systems in some ways that I think are really, really important.
Ben: I appreciate you going through that. One thing I would say too is I think that emphasizes the point we were making earlier: there is a tendency to think that technology is this universally positive force that basically improves things almost inexorably, whereas what you're getting at with that analogy, and I appreciate you going through it, is that a lot of the time what we're doing is repackaging and re-implementing the same old ideas. Unless you actually take the time to re-examine that, you're repeating the same mistakes, just putting a different label on them.
Virginia: Yeah. So the technologies that we're integrating into these systems actually are making political decisions for us, but often because we think of them as administrative tools we don't think of them as having important political consequences, but they embody all of these really important decisions. So just to give you like a concrete example, to make that concrete, one of the cases I talk about in the book is this attempt to automate and privatize all of the eligibility processes for welfare in the state of Indiana.
Virginia: That happened in the mid-aughts, so 2006 to 2009 roughly. If you look at the contract between the state of Indiana and IBM, their primary contractor on this project, the metrics in that contract are so fascinating and troubling because they really only have to do with the efficiency of the system: how quickly calls are answered, how quickly cases are closed. There are no metrics about whether or not the decisions made by the system are correct, right?
Virginia: Whether or not you're actually getting the benefits you're eligible for, or getting benefits that you aren't eligible for, although they did have some language about error rates that flagged a problem if you were getting things you weren't eligible for. And there are certainly no metrics about outcomes, about, like, how are you doing, citizens of Indiana, after this system has been rolled out? Is it improving your lives? So one of the decisions we're making by signing on to that kind of technical system with those metrics is that the most important thing we can do is reduce the number of ineligible people and identify fraud.
Virginia: So that's really about efficiency and cost savings, but it doesn't do anything for our other political values, so it doesn't do anything for self-determination or dignity or due process or justice and equity. That's a political decision, saying that the only values that matter when we're making these political decisions that have huge impacts on people's lives is how efficient we are, is whether or not we've optimized the process. That's a political decision.
Virginia: So we tend to sort of bake these political decisions into these technological systems and then we pretend they're not political decisions. Then we're surprised when ... You know, I actually maybe believe that the Daniels Administration was a little bit shocked when they denied a million applications in the first three years of that process, right?
Virginia: I don't know what was in Governor Daniels' heart so I can't say what his intentions were, if his intentions were to actively divert people from that system, but certainly we could have guessed if we had done any work like with the policy history of welfare that diversion would be the result, and diversion that was faster and more efficient, more efficient at making sure that people didn't get the benefits they were entitled to and needed to keep their families healthy and safe. Because that's the way the policy around welfare has worked for 60 years, so it shouldn't surprise us when it's just, it's faster and with more math.
Ben: So Virginia, I want to ask you one other question, and we'll get back to my second question from before, because you bring up an interesting point in talking about what's in people's minds, what's in their hearts when they're doing these things. That was one thing I remember asking Cathy about in her specific instance: why do people actually create these systems? One of the things I've seen, and you're getting at this, is that sometimes they're working from good intentions.
Ben: You see this a lot: people actually come up with systems because they're genuinely trying to change things. I think what was interesting in your case is you seem to be focusing much more on the social welfare aspect of it. In most of the cases where you've seen these systems being implemented, is it really more about cost savings and efficiency overall, and that's what's driving a lot of it, or do you feel like people are trying to correct injustice, trying to correct inequality, but then end up exacerbating it just because of the systems? What have you seen both in the book and after?
Virginia: Yeah. So I'm definitely more interested in impact than I am in intent. I think sometimes we get stuck on intent because we have this very limited way of understanding bias, where we only see bias as like irrational thinking by an individual about a group of folks, a group of marginalized people. It doesn't allow us to get to these sort of structural and systemic issues around bias.
Virginia: So one, I think it's really important to start with impact and not intent. I think we can get stuck down some dead ends if we are just trying to figure out what Governor Daniels meant, you know? Not what actually happened to folks. But what I've seen in my reporting is that there are certainly some cases where it seems like these systems were amplifying sort of hidden political motives.
Virginia: I think some folks in Indiana would say that about the automation experiment there. I had one source say if we had tried to build a system to divert people from public assistance on purpose, it wouldn't have worked any better than the one that we got. So he thought you could make some kind of guesses about intent. But what I did really consciously in the book was to start people with the Indiana case, which in some ways is the easy case. It's a narrative we're really used to, right?
Virginia: It's sort of like a political motive, a potentially greedy corporation stepping into the gap, a contract that was poorly thought out, a bad vendor process, and then horrible effects for poor and working people in the state of Indiana. That feels like a story we're pretty comfortable with. We don't necessarily agree that that's what happens in every case, but we know that story.
Virginia: The next two cases I tell in the book, the case about the coordinated entry system in Los Angeles County and the Allegheny Family Screening Tool in Allegheny County, Pennsylvania, which is where Pittsburgh is, those are different kinds of stories. In both of those cases the administrators and the designers I spoke to are all incredibly smart, incredibly committed folks who care deeply about the people their agencies serve, who have fantastic intentions, and who have actually done just about everything that critics of these kinds of decision-making systems have ever asked them to do.
Virginia: So they are transparent, not 100%. I'd say like 75% transparent. They don't release everything but they release lots of things about how the inside of their systems work. They're accountable because they're holding this tool, either it's built by a public agency and it's held in a public agency or it was built and held in a public/private partnership, so there's an accountability there. They've been really thoughtful about that.
Virginia: They've even engaged in some kind of human-centered or participatory design, so that's literally everything that most critics of [inaudible 00:27:16] decision making have asked them to do. So the reason that I included them is because those cases lead us to ask much harder questions about if we're doing everything right, why are we still producing systems that from the point of view of their targets still police, profile and punish poor and working class communities?
Virginia: So that was really important to me: if I wanted to write a really scary book about the worst algorithms I could find, it would have been a different book. There are much worse cases. These are actually some of our best systems, not our worst.
Ben: Interesting.
Virginia: So your original question was about how folks talk about why they do what they're doing, and one of the things that's really important in both Los Angeles and in Allegheny County was this idea that having more data and being able to analyze it more quickly and being able to predict what might be happening in the future allows folks to identify biased decision making and address it.
Virginia: That's something we really want to do because the social service system for sure has been beset by racial bias for decades and it has had profound impacts on individuals and on whole communities of folks. So that's real. The concern though is that these systems tend to understand bias in that way I mentioned earlier, that it is only like the irrational thinking of individuals, and they're not terribly good at identifying systemic and structural bias.
Virginia: So I'll just give you a quick example. In the Allegheny County case, part of the reason that they've built this tool is to be able to identify discriminatory, biased decision making on the part of frontline case workers. These are workers called intake screeners who receive reports of abuse or neglect from the community and decide which ones to screen in for a full investigation. So they have an issue with disproportionality in their Child Welfare System in Allegheny County, like basically every county in the United States.
Virginia: Something like 38% of the kids in foster care in Allegheny County are black and biracial, and only 18% of the youth in that county are black and biracial, so kids of color are in the system at about twice the rate you would expect. So that's real. They have a real issue with that. They've actually made some real progress over the last couple of years on making that disproportionality better.
Virginia: So they see this tool as a way to identify where some discriminatory decision making might be happening within their own agency. The problem, though, is their own research shows that the great majority of that disproportion actually enters the process at the point at which the community calls on families, which is called referral, rather than at the point where those cases are screened in, which is called screening.
Virginia: So the community calls on black and biracial families three and a half times more often than they call on white families. Once the case is in the system, there is a little bit of disproportionality that enters inside the agency so screeners screen in 69% of cases about black and biracial children and only 65% about white children, but that's a much smaller problem than this 350% problem that comes in at call referral.
Virginia: But what they're doing is addressing the data amenable problem and not the actual source of the problem. I think that spending our time and our resources and our smarts addressing that 4% problem rather than that 350% problem is an issue. That's about having a solution and then going looking for a problem. It's not necessarily what the real problem is.
Virginia: The real problem with call referral is a cultural problem: we have this very mistaken and dangerous idea in the United States about what a safe and healthy family looks like, and that family is middle class and white. That's why folks get called on more often. That's not necessarily a data amenable problem. That's a bigger, different kind of conversation.
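For readers who want to check the arithmetic behind those figures, here is a minimal sketch, assuming only the numbers quoted in this conversation (38% versus 18% of county youth, a 3.5x referral rate, and 69% versus 65% screen-in rates); the variable names are illustrative and are not taken from Allegheny County's actual tooling.

```python
# Minimal sketch of the disparity arithmetic discussed above, using only the
# figures quoted in the conversation. Variable names are illustrative.

share_of_foster_care = 0.38   # black and biracial kids among Allegheny County foster care
share_of_population = 0.18    # black and biracial kids among all county youth
overrepresentation = share_of_foster_care / share_of_population
print(f"Overrepresentation in foster care: {overrepresentation:.1f}x")  # ~2.1x

# Where the disproportionality enters the pipeline:
referral_rate_ratio = 3.5     # black/biracial families are called on 3.5x as often
screen_in_black = 0.69        # share of reports about black/biracial children screened in
screen_in_white = 0.65        # share of reports about white children screened in

referral_problem = referral_rate_ratio * 100                    # the "350% problem" at referral
screening_problem = (screen_in_black - screen_in_white) * 100   # the roughly 4-point gap at screening

print(f"Disparity entering at referral:  {referral_problem:.0f}%")
print(f"Disparity entering at screening: {screening_problem:.0f} percentage points")
```

Run as-is, the sketch reproduces the roughly 2x overrepresentation in foster care and the contrast Virginia draws between the "350% problem" at referral and the much smaller gap at screening.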
Ben: A lot more difficult problem.
Virginia: Well, I mean yes and no. I would love to see us spend the kind of time, energy and resources addressing that problem as we do with these like wicked and interesting problems that are data amenable problems.
Ben: Because in some sense by having the tools you can actually introduce bias just by what problems you focus on. That's interesting.
Virginia: Absolutely, and one of the things that's happening is if you remove discretion from call screening you actually remove a block to the incredible amounts of disproportion that are coming in from referral, right? So you're actually removing what might be a solution to that problem by replacing those people who are not perfect but are at least representative of the community they're serving, right?
Virginia: This is the most diverse, racially diverse, the most female and the most working class part of the Child Welfare Services workforce, the call screeners. You're replacing their discretion with the discretion of an international team of economists and social scientists who built the model. I think that's actually really important for people to understand because when you talk about bias in these systems we often say like well, removing discretion means removing discrimination.
Virginia: I have a very smart political scientist friend named Joe Soss who says, "Discretion is like energy. It can't be created or destroyed. It can only be moved." So what I challenge people to do is to not think about this as removing discretion, but think about it as moving it. If we're moving it away from these frontline workers, who are we giving it to? Do we trust them to be less biased than the frontline case workers? I think it's a really good question to ask anytime you're looking at one of these systems.
Ben: Yeah. That's pretty profound. I appreciate you going through all that because I think you help put a lot of color on it. So going back to the earlier question I asked about, so the book's been out for a little while now, you've been talking to people, having conversations. What's changed in what you're seeing and what's changed for you in terms of like how you think about the material now that you've been out there interacting with people about it?
Virginia: Yeah, it's been such an incredibly interesting process. Like I often make this joke now when I'm doing talks that I feel like everyone should be forced to tour a book for a year before they're allowed to print it because definitely my thinking is a lot clearer than it was even when I was finishing the book. So I'm learning to think of books as moments in time rather than the final word on anything.
Virginia: There's been a lot of things that have been really interesting. I mean one of them is that I've been talking a lot to foreign press and one of the things that's so interesting about talking to foreign press is I have to do so much explaining about how our social service system works for it to make any sense to them at all.
Ben: I read a British article about your stuff and I saw exactly that. I'm like yeah, they were explaining all the concepts. I'm like oh yeah, I guess that doesn't make sense, does it?
Virginia: They do not get it. So like even the simplest things that like I don't think need explaining, things like you have to prove your eligibility for medical care, like that outside of the United States makes no sense to anyone. Like just that sentence. They're like, "That? I don't know how to parse that sentence. What are you talking about? Why would you have to prove you're eligible for healthcare? Everyone gets healthcare."
Virginia: I have to be like, "Oh, right. No, that's actually not how we do it here. Let me spend 30 minutes describing why and how we got there." So that's been really, really fascinating and the reason I think it's been so important for me is that it continually reminds me that the way we do things here, there's nothing natural or inevitable about that. We can choose to do it differently, so in tons of places around the world the kinds of things I describe in the book, like not having enough food to feed your family, like living in a tent on the street for a decade, or like losing your child to foster care because you can't afford a medical prescription, those things are seen elsewhere as human rights violations.
Virginia: The fact that we see them increasingly in the United States as systems engineering problems really should make us very concerned about the state of our national soul, about the state of our commitment to caring for each other and to working as a political community to provide a just basic minimum, a line below which nobody is allowed to fall. So that's been really, really important, and it actually has made me really optimistic and hopeful and has allowed me to see lights at the ends of these tunnels: we don't have to do it this way. We can make different choices.
Virginia: The other thing that's been really interesting for me is like around audience. So I thought very much that I was sort of speaking to two specific audiences and one was folks who have experience as targets of these systems, right? Folks who have firsthand experience of this stuff because often we think we're the only person ever to experience this. We don't talk about it because there's often stigma attached to it.
Virginia: I actually think it's really important for people to have their experience confirmed, to say like, "Oh, it's not just me. These other 30 people she talked to had very similar experiences to me." I think that can be incredibly liberating and it can be a really important place to start building a political identity that you can organize around. So that was very much part of my intention. Then the other group of people I thought I was talking to were data scientists, were folks who build these systems and think about these systems.
Virginia: I definitely have engaged with those folks, but the people I didn't think of and actually the people who have sort of forced themselves into my consciousness the most, which I'm very grateful for, is actually organizations who are on the ground helping people meet their basic needs or keep their families safe who are seeing these tools coming and are even actually often asked to consult about the tools, but don't necessarily know even what questions to ask.
Virginia: So, you know, the Administration for Children's Services in New York City will approach an organization like The Bronx Defenders, which works with parents who are child welfare involved, and they'll say, "Okay, we're moving to some systems of predictive analytics. What do you think?" The organizations will be like, "I don't know. Can you tell us some more?" They'll say, "Well, you know, we're using a random forest model." They'll be like, "Okay."
Ben: What does that mean?
Virginia: Yeah, so there's often a miscommunication. I don't think it's necessarily intentional on the part of agencies like ACS, but it does create real, important friction and misunderstandings. So one of the things that's been happening since the book has come out is that organizations are starting to reach out to me to say, "Can you just help us frame how we ask these questions?" That, I think, is really, really important work, and it's work that I've been really excited to engage in.
Virginia: So the kinds of questions we're sort of starting to develop are things that get beyond transparency, accountability and participatory design, so they're things like is this system being proposed in the context of austerity, right? Of fewer resources rather than more resources? We can make a good guess that if they're being proposed in the context of fewer resources that they're going to act to create barriers for folks.
Virginia: That's a really good question to ask, not something I had thought about before. Another one that was recently pointed out to me by Shankar Narayan of the ACLU of Washington is: does the community have a right to say no? Right? If we just decide, you know what, we don't care if it's accurate, we don't care if you think it's fair, we don't want it. Is there a mechanism for us to say no? If there's not, then that's not terribly democratic decision making.
Virginia: Then after that, is there a remedy? If this thing goes through and it harms people, is there access to remedy? In many of the cases that I talk about in the book it's really unclear what that remedy is, and the best people can hope for sometimes is to get expunged, to get out of the system, erase themselves from the system, but then there's no next step of saying this has harmed me and I deserve some kind of redress. That's something I don't think we're talking about nearly enough.
Virginia: So those are the kinds of questions that have started to come up and I'm really excited about them and that conversation because that conversation feels like really politically important for us to be having right now.
Ben: Yeah. No, that makes a lot of sense. Well Virginia, this has been a fascinating conversation. I'm really excited to see what you do going forward and I appreciate you taking the time to be on here with us.
Virginia: Yeah. Thanks so much for the conversation. I really appreciate it.
Ben: Absolutely. Thanks everybody for listening to Masters of Data. Check us out on iTunes or your favorite podcast app and look for the next episode in your feed.
Voiceover: Masters of Data is brought to you by Sumo Logic. Sumo Logic is a cloud-native machine data analytics platform delivering real-time continuous intelligence as a service to build, run, and secure modern applications. Sumo Logic empowers the people who power modern business. For more information go to SumoLogic.com. For more on Masters of Data go to MastersOfData.com and subscribe and spread the word by rating us on iTunes or your favorite podcast app.

DESCRIPTION

Our guest today has been on a long crusade to raise awareness about how our digital tools are continuing and exacerbating the problems we already have around poverty and inequality. Virginia Eubanks is an Associate Professor of Political Science at the University at Albany, SUNY. She is the author of the book “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor”.