Are jobs requiring high levels of human interaction worth preserving in the age of automation? Can we design machines to achieve something profound – the mutual recognition that occurs when human beings truly "see" each other? CASBS faculty fellow Mitchell Stevens explores these questions with Allison Pugh, author of the 2024 book "The Last Human Job: The Work of Connecting in a Disconnected World." Pugh launched work on the book as a 2016-17 CASBS fellow.
ALLISON PUGH
website | Google Scholar page | Interview with Allison Pugh on building a society of connection (CASBS in partnership with Public Books) |
Princeton University Press page for The Last Human Job
MITCHELL STEVENS
Stanford GSE faculty page | Stanford profile | CASBS page | Google Scholar page |
Narrator: From the Center for Advanced Study in the Behavioral Sciences at Stanford University, this is Human Centered.
We intuitively grasp the value of being seen and deeply understood by others, in our interpersonal relationships, and sometimes with professional therapists and religious figures in our communities and lives. Being seen and understood confers dignity, inspires meaning-making, and promotes reciprocity. It's such human connections that underlie a lot of the work that powers our economy.
The importance of so-called connective labor, often invisible or taken for granted, has been undervalued and underdeveloped. But given the emergence of artificial intelligence and other labor-saving or labor-replacing technologies, are these often precarious jobs requiring high levels of human interaction worth preserving in the age of automation? Today on Human Centered, a conversation with Allison Pugh, Professor of Sociology at Johns Hopkins University, 2024-25 Vice President of the American Sociological Association, and the author of the 2024 book, The Last Human Job: The Work of Connecting in a Disconnected World.
It's a project Allison launched as a 2016-17 CASBS fellow, and as you're about to hear, her CASBS year was critical in the formation of a key part of the book's argument. In it, Pugh elucidates the concept of connective labor, work that relies on empathy, the spontaneity of human contact, and mutual recognition of each other's full humanity. Helping her tease out the book's central themes for us is Mitchell Stevens, an organizational sociologist, professor at Stanford's Graduate School of Education, and a 2024-25 CASBS faculty fellow.
This is yet another inspired curation of conversation partners on our show. Stevens superbly engages both the book and Pugh in exposing what Stevens calls the quietly radical nature of connective labor, why relational work is its own utility, not just a means to some other end, and how Pugh provides a lens for viewing the macro-level implications of citizens who do not feel seen in society, which leads to polarization and social fragmentation as seen in the US and elsewhere. But much of the conversation relates to whether AI can or should replace the human gaze.
If connective labor enriches the lives of individuals and, at its best, helps bind communities together, can we design AI technologies that enable machines to do the work of connecting with us on a reciprocal socio-emotional level? And is it possible to mechanize something profound, namely, what happens when a human being truly sees another human being? What you're about to hear will likely advance your thinking on the matter, so let's listen.
Mitchell Stevens: Allison, so nice to have you back at Stanford.
Allison Pugh: It’s great to be here, Mitchell.
Mitchell Stevens: I’ll ask a question that I know you’ll have the answer to. What is The Last Human Job?
Allison Pugh: So, The Last Human Job is actually a lot of different jobs. But what it is, is it's a kind of work that underlies many different jobs. So, I was interested in this kind of deep seeing of the other, that takes place in therapy, it takes place in teaching, it takes place in primary care.
But not just those jobs. It's actually important across a huge span of the economy. Lawyers do it, managers do it.
What I'm really talking about is this knowing and seeing the other, reflecting the other person back to them, and having that person feel witnessed in some way, feel known in some way. I came to think that that powers the US economy and the economy of other major industrialized nations. I considered it important enough to give it a name.
So, I called it Connective Labor. I tend to think you're not going to really focus on something unless it has its own name. So, that kind of Connective Labor, that underlies so much of what we consider worth paying for, so much value from how does a kid learn algebra, to how does someone learn to control their diabetes, to how do they become a better soccer player, across a wide range of economic activity.
Also, it felt invisible, taken for granted, often considered soft skills at best and, at worst, women's natural, essential nature. Some people in the business world have called something like that EQ or emotional intelligence. But for me, the give and take, the reciprocal nature, the way this is co-constructed by both parties was really important and not really captured by those existing terms.
Mitchell Stevens: And one of the things that's lovely about the book, I think, is how you depict many different kinds of connective labor. Chaplains and support groups, grocery store checkout clerks, people who have experienced substantial medical treatments. And so the book is alive with lots of different examples of this kind of work and how varied it can be.
How did you come into this project? What's its origin story?
Allison Pugh: Yeah, it started in 2012, when I published an article about in-depth interviewing. And in-depth interviewing is how I do my job. It's how many of my books have been written.
And in those moments, I'm sitting down with essentially a stranger and I'm doing my best to really elicit their truth and reflect it back to them so that they can correct me if I'm getting it wrong. And it's an hour of being really closely observed, reflected back to, witnessed. And they often end that hour, or two, or four, or whatever it is, saying, wow, that was just like therapy.
And yet in 2012, I was writing this article kind of defending it against what I consider to be a kind of misguided analysis of what interviewing is good for. There was this kind of small movement within sociology by some folks that were saying, well, no, what in-depth interviewing gets you is just ex post facto rationalization. Instead, we need survey research because that's going to get people's fast thinking.
And fast thinking shows what people really think. With in-depth interviewing, by contrast, you're getting people's slow thinking, where they're massaging things so that it looks right or rationalizes how they feel. I was so appalled that I wrote a whole article, which has since become my second most cited work; people come up to me at conferences and say thank you for writing it.
It made this huge splash. And in it, I basically say that in-depth interviewing is not stenography. We are not just writing down what someone says.
Instead, it's what I came to call connective labor. It is a kind of in-depth moment of seeing the other and being seen, in which you are co-constructing, co-constituting meaning that ultimately produces research, or is part of the research process. And so that's how I started thinking like, huh, what is that work?
But at the same time, there was the question that the survey researchers were getting at: how do we scale that work up? How do we take this personal, idiosyncratic, emotional, particular work and systematize it so that we can make it available to many people?
Mitchell Stevens: At scale, as we say.
Allison Pugh: Exactly. That's a question that actually is a good question. So it motivated me to start this research project that ended up with The Last Human Job.
Mitchell Stevens: So connective labor is both a meaning-making and a relationship-making process.
Allison Pugh: Beautifully said. That’s exactly right, Mitchell.
Mitchell Stevens: Jointly. So then a couple of years later, you come to CASBS. How did this place inform or shape the work that you ended up producing?
Allison Pugh: It had a huge impact. The book would have been totally different if I had written it somewhere else. I think if I had written it at Virginia where I was then employed, or if I had written it at any of the other residential fellowships, I would have gotten the first part, which is what is connective labor?
Why does it matter? What does it do? How do people do it?
Which is the bulk of the first part of that book. But I really would have missed the AI portion, where are people going with this? How do we scale it up using technology?
Because I was here in the belly of the beast, as they say in Silicon Valley, where so many engineers were doing socio-emotional AI. Socio-emotional AI was the fastest growing field when I was out here in 2016.
Mitchell Stevens: And make sure we know what socio-emotional AI is. What are you referring to when you use that phrase?
Allison Pugh: Yeah. AI that isn't just about pattern recognition – producing new antibiotics, or seeing what a dermatologist might see and identifying cancers – but actually intervening in relationships between humans. Seeing the other by machine.
Mitchell Stevens: Trying to create machines that will see others.
Allison Pugh: Correct. So I spoke to engineers, and talking to engineers ended up being a big portion of this work. I talked to a whole bunch of them, many brokered by Stanford faculty I was meeting at CASBS.
And I really got a sense of what engineers think about this work, why it's important, why they do it, what they think it's contributing, how they sell it to the public, to their funders, etc. So I got a handle on that and, yeah.
Mitchell Stevens: And your book offers a compelling and lucid argument, but it's also a fairly long and nuanced argument. So if you had two minutes to tell these engineers what they're getting right and what they're missing, what might you say?
Allison Pugh: That's interesting. The book is an argument, and the book isn't really aiming at engineers. The book is actually aiming at the wider reading public.
It's trying to highlight this work, this connective labor, to talk about why it matters and how it's done, and to talk about how we're organizing it now, in ways that currently degrade those jobs – and listeners will completely recognize this.
Mitchell Stevens: Yeah, give us an example, maybe.
Allison Pugh: I talked to many doctors as part of this work. I wasn't even going to include doctors in my sample, but they pulled me over and said, you need to come and include us, because they were experiencing this degradation of their connective labor – that's how I would say it now – with the advent and spread of the electronic health record. These were primary care physicians who were evangelists about connecting to the other: that's how they motivate people to make better health decisions, that's how they diagnose people. It's so important.
And it was being ruined, shrunk, kind of pushed aside by the demand for their data entry into the electronic health record. So that is like a good example of-
Mitchell Stevens: The recording and quantifying of the work rather than the connective work itself.
Allison Pugh: Exactly. Because of the fad of data analytics, employers were imposing regimes of counting and scripting onto this connective labor, and essentially degrading the work so much that the next step of automating it in some way came to feel like either a relief or not that big a deal, because the jobs were already kind of ruined at the extreme end. So that's the bulk of it.
Now, what engineers get out of that, I'm not sure. I have opinions about where they should be spending their time, but the book wasn't really aiming at that.
Mitchell Stevens: I understand. But you understand, I'm swimming in the water out here. And part of what I found myself thinking as I was reading the book is: is the bottom line that socio-emotional AI is a fool's errand?
Should we stop this bus and get on another? Or is there a way in which this activity can humanely and positively continue, in your view?
Allison Pugh: That's the crux of it. That's a really good question, Mitchell. I'm going to answer that a little bit later.
Mitchell Stevens: All yours.
Allison Pugh: The first thing I'm going to say is, with the way AI is sold to us, it's like engineers are throwing everything at the wall and seeing what sticks. And the three things-
Mitchell Stevens: That's what we do out here.
Allison Pugh: And the three things you can recognize in how they're selling it are: AI is better than nothing. AI is better than humans.
Mitchell Stevens: Better than no interaction at all is the first. Better than nothing meaning better than no connection.
Allison Pugh: Many engineers are actually motivated by this, I found. They're not nefarious actors. They're thinking they're contributing.
For many of them, that's the motivation. I talked to some guy at Northeastern, for example, who was making an AI palliative care consultant, an AI virtual nurse, an AI exercise coach. He had a lot of AI in what I would call connective labor jobs. And he was doing it because, as he put it, it's better than nothing.
Low-income people don't get good access to a palliative care consultant, so you might as well have one there for them with AI. That's a big motivator and a big way they hype it. And my answer is that this is the classic American move of using technology to solve a problem that we have created politically, and that we could address by simply improving staffing ratios and fully funding public clinics, public classrooms, etc.
Mitchell Stevens: A technical fix rather than a political fix.
Allison Pugh: Correct. That's what we're reaching for with this angle. Then the second one was AI is better than... well, let me do the last one first, perhaps.
And that's AI is better together: AI and humans are better together. And that's behind the "AI will free us up" idea.
Which is a very common angle.
Mitchell Stevens: Hear that a lot.
Allison Pugh: You hear that all the time. And I think this is super interesting. On the one hand, it's saying AI will take care of drudgery and free us up for more meaningful work. But I think that's a fallacy, because I don't know how long you have been living in capitalism, but employers are not sitting around wondering how to make your job more meaningful.
Instead, if your job can be done by machine, they let you go. So I think the "free us up" idea has a lot of naivete in it. But in any case, getting rid of drudgery is one way in which humans and AI would be working together.
Another way that's been posited is that AI will take care of the thinking, and humans will add the motivation, and they'll do more of the feeling or emotional work, because AI is a computer and humans are humans. And so it's a kind of specialization.
Mitchell Stevens: Humans will do what humans can do.
Allison Pugh: Exactly. Now, the problem with that is that most work is actually both thinking and feeling. It's a little bit of a fallacy to think that you could separate those out beautifully.
But I actually think that's a really interesting future because of its gender implications. Because, of course, women have been traditionally slated to do the kind of feeling work.
Mitchell Stevens: The listeners, the feelers.
Allison Pugh: Well, men have been more occupied in technical jobs. And the disparity in women's income has been linked, studies have shown, to their occupations – to how much time they spend in person-to-person work. And so, to switch that and make that be what humans do is really interesting.
And what's that going to do to like gender pay gaps? I don't know. And so, I think that's interesting.
But I also think it's wrong, because it's based on a false separation between thinking and feeling. And then the third way that people talk about AI, that people sell AI,
is that AI is better than humans. Now, obviously, I'm a big proponent of what humans can do. But I actually find this argument the most powerful.
Because one of the strongest critiques people offer me – people who say connective labor is great, but – comes from marginalized, disadvantaged populations that find connective labor to be a site of significant judgment, shame, vulnerability, misrecognition.
Mitchell Stevens: How so, Allison?
Allison Pugh: So I'm thinking I have a lot of examples because I started to ask people about it.
For example, there was a doctor I remember speaking to, a primary care physician, who had worked for years to get a low-income kid and his mom to trust her. And then the mom was like, can you help me with his obesity? And she was just like, you need to stop having so many sodas.
And he started crying in the office, and she never saw him again. I'm sure if we interviewed that kid, he has not forgotten that moment where she individually blamed him – maybe he felt like he was to blame, maybe not, but there was all this shame in there.
Mitchell Stevens: This is what you call, often in the book, not being seen.
Allison Pugh: Exactly. And yeah, the loss of trust there. She's not alone.
Many people could recount these for me, and I would say this to the people listening out there: the practitioners are as haunted by these moments of misrecognition as the people who felt them. So this is a problem of human-to-human connective labor. And this is where the way AI is sold – that AI is better than humans – actually has a strong argument, because there's a lot of robust research showing that people don't experience computers as a site of judgment.
So like, you know, children will tell computers that they've been bullied. Adults will tell computers, I mean, more than humans.
Mitchell Stevens: Are more likely to disclose.
Allison Pugh: More likely to disclose, exactly. Adults will talk about their financial irregularities with a computer, not as much with a human being. And there's research on people giving blood – people who give blood have to disclose, or used to, I actually don't know the current practice, their sexual practices.
And people will be more honest about those with a computer than with a human being. So people are feeling like computers are not judging them. And so they're more honest.
And honesty is necessary for good teaching. You have to know whether someone understands how much algebra they know already. Or a doctor has to know, do you ever eat a vegetable?
You know, like, shame is a real impediment to good outcomes in connective labor jobs. So this is, I think, the strongest argument that engineers have. And to any engineers that are listening, this is the thing that you have.
Mitchell Stevens: The candor that people have with machines.
Allison Pugh: Exactly. Now, at the same time, when you talk to practitioners, they're like, does this mean we should mechanize at all? Like, no.
The answer is, first of all, it's based on another fallacy – because yes, computers are judging. They may not seem like they're judging, but they're judging. Every time your insurance goes up or down-
Mitchell Stevens: Counting and scripting.
Allison Pugh: Like, they're listening to what you are doing, and then they're making consequential decisions based on what you tell a computer. So computers are judging. They just don't feel like they're judging because it's not a human gaze.
And part of the power of connective labor comes from the human gaze and the risk of judgment. That's the secret paradox: the engine of the profound meaning that people get from this is the risk of judgment.
Mitchell Stevens: So a kind of double-edged sword, right? If I fail to be seen in the interaction, that's a negative. But if I do feel seen in the interaction-
Allison Pugh: It's enormously powerful. And I had all sorts of stories in the book, from my research, about what people got out of this – both the worker and the person being seen.
You know, I had examples where it felt profoundly powerful. For example, I opened the book with the chaplain who is in a hospital, and there's a man there and he's been intubated and he's furious and he doesn't want to be intubated and he can't speak and he can't express himself, but she can see the fury and she hands him a Kleenex box and says, why don't you throw this against the wall? And in answer, I think he felt so seen in that moment.
He was like, thank God. He grabs her arm and pulls her in and holds her for 15 minutes. Later on, she sees him after he's been extubated – he didn't die in the procedure – and he says, there is nothing like being seen by someone you don't even know in a moment that feels like the worst moment of your entire life. It really affected him, and it affected her.
Like the capacity to provide that and to be that person for him in his dire straits was so meaningful for her. So it's like, that's just a little moment, but I actually have a ton of them in the book as I kept hearing these moments throughout and that's so powerful.
Mitchell Stevens: The engineer in me would say, can that be manufactured? Can we build an intelligence that would enable that kind of connection?
Allison Pugh: Well, that is the million-dollar question, and many engineers are wondering that. I would say, engineers told me, for example, that many apps that are in this space have problems of retention. That's a very common problem that engineers face.
Mitchell Stevens: By which is meant?
Allison Pugh: As I'm sure you have, and I certainly have: you download an app, you think, I'm going to do this every day, you do it for a week, and then you never touch it again. That's the retention problem, and retaining users is a big issue. So how engineers have solved this problem is to salt the experience of using the app with a human being throughout, because they know this truth – derived, I think, from the discipline of psychology – that humans motivate each other.
Now, it's true that humans motivate each other. It's a kind of core finding in what I've been talking about so far. But that's also quite reductive for what humans do, because humans do actually much more than motivate.
If we look at what the chaplain did, basically I found in all of my research that human beings help each other feel motivated. They give each other purpose. That's one area.
But they also give each other dignity. You could say that's what the chaplain did. Here was this man stuck in a situation where no one was paying attention to what he wanted, where he was trapped by the demands of his own physical body and well-being.
He felt seen, and in that moment, he was given the capacity to be fully human. Being seen by another human being, that offers dignity. And then I actually spoke to many teachers that spoke most articulately about how being seen by another human promotes understanding and that children need that reflection to be able to figure out who they are and how they learn and all these important pedagogical lessons.
But that's also true across the life course, not just in the teaching realm. So when we're talking about what can be mechanized: people can feel seen by a machine. Actually, there's an episode of The Daily where Michael Barbaro is talking to ChatGPT just when it was released.
And he types in something like, what do you do when you're a perfectionist and how do you get out of it? And the bot returns some kind of therapeutic answer, analyzing why someone would be a perfectionist, what the impact is, and how to get out of it. And he was like, whoa, I feel seen.
And he actually said those words. So it is certainly possible to feel seen by a machine. The question is, does that have all these further effects?
Does it give you dignity? Does it give you purpose? Does it give you understanding?
And actually – we can talk about this later – I think being seen by another human being has important community-building effects and an important impact on our democracy. Not being seen, I think, is behind so much of the trouble we're seeing in the populace.
Mitchell Stevens: Well, let's talk about that now. We'll go at that in a somewhat different way, though. As a sociologist like you, one of the things I learned over the last 10 years, interacting with engineers and computer scientists, is that they talk a lot about these creatures called users. Users are the human parties on the other side of the app, whose needs, desires, and indeed psychology need to be understood in order for the app to serve the user, as they say.
And one of the things that your book made me see is that if you think about human beings as users, then it's easy to forget the reciprocal relationship between the user and the interface, as it were. It seems to me that engineers often presume a relationship between a machine and a user, not a relationship between two human beings, or, as often in your book, some community or network of human beings. And I wonder if that might be part of the reason why the inattention to connective labor, which you make so obvious, is so rarely part of conversations about design at present.
I just wonder if there's perhaps a presumption that a human-machine interaction is a fundamentally different thing than a person-person interaction, which is really the part that I hear you emphasizing. I know that's not a question, so we can maybe just throw all of that away.
Allison Pugh: No, no, no. I can go with it if you want.
Mitchell Stevens: But go if you want.
Allison Pugh: I mean, I just think like human-machine interaction, it's a whole field.
Mitchell Stevens: Oh, yeah. Right. Exactly.
Allison Pugh: And it's not really what I am focused on, because I see it as inherently, deeply dangerous. And I actually see the primary problem that we face, as humanity, as a society, is that we're not really recognizing the importance of what I am calling connective labor. Like, we don't care enough to even look closely at what human beings do for and to each other.
This question of what it is to be human has been answered a thousand times, a million times. In the book, I talk about something called an automation frontier, where people are negotiating what is to be automated, what is too far, and what's reserved for humans. That frontier has been moving, as frontiers do, over say the last 100 years – or even 400 years, really – around what is too much automation for us to stomach.
And most recently – this is still a very active conversation – I found people saying creativity is what humans do. Leadership is what humans do.
Mitchell Stevens: We hear that a lot out here. Yes.
Allison Pugh: But I actually think: what is creativity? That's the question, really. And creativity, I think, can definitely be done by machines.
If creativity is the merging of knowledge and practices from here with something from there to produce something totally new – well, you're describing what ChatGPT does. It synthesizes what's out there, and it will probably get a lot better. It may not be perfect now, but still.
Mitchell Stevens: Sometimes the new thing is a hallucination.
Allison Pugh: Yeah, exactly.
Mitchell Stevens: Sometimes it's an antibiotic.
Allison Pugh: And so I don't think creativity is the answer. I think what's the answer is what happens when a human being sees another human being. And by definition, that cannot be mechanized.
And when it is mechanized, something else is happening. And I think our individualistic society is so focused on the recipient of these things – on the student, on the patient, on the client or whatever.
And so the question is, well, maybe the kid can learn math just via a webinar. Doesn't need to be seen. Instead, it's just a download of information.
If it works, it works. Who needs a teacher? And I would say two things to that.
First, these things are deeply meaningful for the workers involved. So as soon as you consider the workers themselves, you're like, oh, actually, it does matter. But even keeping with the client-focused or student-focused emphasis of our society, there's something that happens in the exchange that's deeply profound and matters for the individuals involved and our wider community.
Mitchell Stevens: What I find sort of quietly radical about that is that connective labor is the work that all of us do on an everyday basis with our fellow human beings. It is the simplest, humblest, and most accessible of tasks. And you are suggesting, if I'm hearing you correctly, that that in fact is the last human job.
Allison Pugh: Exactly. That's what makes us human.
Mitchell Stevens: It's not the fanciest. It's not the sexiest. It's not the most glamorous.
Allison Pugh: Exactly. I mean, one issue, I think, is that people are afraid that they don't know how to do it, or that they do it poorly. And one of the things I found is that people are actually quite forgiving: despite the haunting, despite the problem of misrecognition, if you're trying your best and you're willing to listen to the other person say, no, no, no, that's not quite right, it's actually this – then that's good enough. And good-enough seeing is what we're doing. Going off of Winnicott from the mid-20th century and his idea of good-enough mothering, this is good-enough seeing. And that works. It doesn't have to be perfect.
Mitchell Stevens: You have these lovely parts of the book on mistakes and the positive value of mistakes in interaction. Can you talk a little bit more about that?
Allison Pugh: Yeah, I actually loved that also because it was a discovery along the way; I hadn't expected it. And then after I started hearing about it, I started asking everybody about the mistakes they had made. But I was talking to a therapist in the VA, and she described to me this experience where she had been talking to somebody, and I guess had made a kind of misrecognition mistake, said something kind of wrong.
And the woman kind of gets up and is like, at the end of the session, gets up and says, I'm going to be kind of busy, I think, over the next couple of weeks, so I might not come back for a while. And then the therapist calls her in the middle of the week and says, I think I might have said something wrong, and I think I might have done that wrong. Can you tell me, you know, can you correct me?
Can you tell me where I got that wrong? Tell me what I can say, what I didn't get right about you. The woman comes back in and then the therapist was like, and then she made a huge amount of progress.
And by the end, she asked her, what was the thing that actually worked for you, that helped you make so much progress? And she was like, it was when you said that you hadn't gotten me right, and you were trying to hear, trying to be corrected. That's where I felt like, oh my God, she actually wants to see me.
And this is someone who was a veteran, used to like kind of mass institutions everywhere, being a cog in the wheel. And here was someone refusing to treat her like a cog. And that was just a very powerful moment for both people involved.
Mitchell Stevens: Mistakes create conditions for repair.
Allison Pugh: Exactly.
Mitchell Stevens: And when repair is executed thoughtfully.
Allison Pugh: Yes. And therapists actually know this up and down. They have a term for it: therapeutic rupture, a rupture to the therapeutic alliance. They talk about it; they have a whole theory of it. But teachers don't really know it. Doctors certainly don't know it because they're so afraid of mistakes.
But when I talked to them, I'd say, I'm not asking about medical mistakes. Just tell me about the kind of relationship mistakes you made. And they also are haunted by them.
But they also, some of them were able to tell kind of redemptive stories when they came back and said, you know, can you correct me? So that's actually a very powerful example of the kind of ways in which this process is really reciprocal and co-constituted.
Mitchell Stevens: As we sit here at the Center for Advanced Study in the Behavioral Sciences, whose constitutive purpose is to bring people together into scholarly conversation, I'm wondering if there is an affinity between the work of connective labor that you're talking about and the work of scholarship, which is fundamentally collective and arguably better when done in ongoing dialogue. So, the extent to which scholarship itself, when done well, is a kind of connective labor: is that a reasonable affinity?
Allison Pugh: So great, Mitchell, what a great question. Yes, I do think that's right. It actually really nicely describes what CASBS does because I don't know about other people, but I, you know, before I come someplace for a sabbatical, I'm off in my little, you know, corner, just trying to eke out a few hours of writing time and that can be a very lonely experience.
But then you come to CASBS and you're talking to people over lunch or in your studio or whatever, about, you know, kind of what you're doing and they're really trying to hear what you're doing. And then they're like, have you read this? Have you thought about this?
And they're extending it, but they're not extending it without hearing what your goals are to begin with. So, yeah, so that you're, it's a beautiful application of what I'm talking about, yes.
Mitchell Stevens: And part of what I'm hearing you say also, at least in the book, is out here, again, in Silicon Valley, with a lot of focus on measurements and bottom lines and utility functions. What I hear you saying is, in fact, the relational work, the connective labor, is its own utility. It is not just a means to an end. It is, in fact, an end in itself. And perhaps we undermine that end by always tying it to some kind of means.
Allison Pugh: Yes. I mean, that is the crux of what I end up arguing. Because people, I talk to practitioners, doctors, etc., teachers who would say to me, yeah, connecting is really important. That's how I get people to tell me things, or that's how I know whether someone is ready to learn this lesson or whatever. So they were always tying it to, or frequently tying it to instrumental kind of objectives. And I understand that. I mean, that's why we pay a doctor or a teacher to have these objectives in mind.
But speaking as a sociologist, I was seeing all these kind of further deep profound impacts and thinking, that's just so much broader and deeper than the instrumental kind of small scale, still important, but objectives that they were pointing at.
Mitchell Stevens: And in fact, let's pan out a bit, because I'd hate to have listeners of this conversation come away thinking that The Last Human Job is all about dyadic conversations and interactions. It has a much larger ambition, suggesting that there are substantial, if you will, macro-level implications for a society that fails to attend to the mundane but ubiquitous importance of connective labor. So, if you could remind us what you see as some of the largest stakes in this conversation.
Allison Pugh: Yeah. Well, first, let me add parenthetically that I actually don't think the thirst, the yearning for being seen, is biologically universal, or universal across time and culture. I actually think it's very particular to our time and our place. For example, we didn't even have the word empathy until the early 20th century.
Mitchell Stevens: Oh, I didn't know that one.
Allison Pugh: Yeah, it was invented in the early 20th century, I think by German psychologists, and it meant something entirely different. And then it came to its current meaning around the middle of the last century. So if you don't even have a word for it, it suggests…
Mitchell Stevens: This phenomenon of needing to be seen is not necessarily a human universal.
Allison Pugh: Absolutely. There's this great anthropological work showing that there are cultures, I want to say in Asia, where you actually don't want to be known. They actually even have a word, a phrase, for someone who is too easily read, as if that's a problem. It's something like: the papaya ripens. You can see how ripe a papaya is from the outside, and that's a problem.
And to me, that's so interesting because we have a culture in the United States, and I would say in the West writ large, which is about being seen, the yearning to be seen, the deep need to be seen. And if you are not seen, it causes all sorts of problems. And these are the higher stakes that you were mentioning, where I think that a lot of the fragmentation of our polity, the problems that we've been having in our democracy, the polarization, is about how some people are seen, culturally and I would say on individual levels, and some people are not. And those who are not are grievously angry about that. And here I'm speaking about, I would say, white working class men.
But also, I think that it powered some of the Black Lives Matter movement about, you know, kind of black people feeling misrecognized by the institutions that they were interacting with, police and others. And so it's like, these are not small concerns. And they are being kind of generated by the kind of uneven topography of connective labor.
Mitchell Stevens: Well, when recognition and misrecognition or non-recognition can be mobilized for political gain.
Allison Pugh: To be sure. Exactly. You know, this is like kind of a rolling sense of...
Mitchell Stevens: Disenfranchisement, invisibility.
Allison Pugh: Yeah. And being unrecognized in a culture that valorizes recognition is painful.
Mitchell Stevens: Yes. So here we are sitting on a hill adjacent to a pretty good university at ground zero of... We say things like human-centered artificial intelligence all the time around here.
If you had advice for this university as an educator or a locus of critical discussion about these technologies and their forward development, what might you encourage this university to do and be?
Allison Pugh: Well, the first thing I would say is, I actually think even though this is the belly of the beast, this is Silicon Valley, this is a coding-centric universe, they are also deeply concerned about these questions, and so they might be open to my supposition. There is a critical conversation about AI out there.
And it's about bias, it's about surveillance, it's about job disruption, and those are all important. But what we're not talking about is kind of relationship, and the impact of this technology and kind of these advances on relationship. But the question about like, what is human? What's left for us? That's a conversation that's really centered at Stanford. Many of the kind of primary actors in that field are here, asking those questions, I think, not answering them with enough consideration of these factors. But they're going to care about this question.
So that's the first thing. They're going to see its relevance. In my view, there's too much emphasis on, wow, we need to get kids into STEM. We need to get kids coding. So, I would have a statement for undergrad curriculum designers. And Stanford is the place to come for that.
So, I don't know whether they'll listen to this. But I do think that what kids really need, undergrads really need, is like four straight years of learning how to interact, learning how to connect, learning how to listen, to be attuned to the other. That's what they really need.
I actually think ChatGPT and its inheritors are good at coding, or they're going to be better. That's what they do already. So we actually don't need more coders.
Maybe I would say, learn how to code so you have some literacy, as some artificial intelligence does it for you. But what it's not going to do for you is see the other human being in front of you, and create this magic that produces not only deep meaning for you and the other person, but also our democracy. That's what needs to be kind of, those are the kind of skills that we need to emphasize for kids.
Mitchell Stevens: Which is remarkable also, Allison, because as a political economist of education, I often say that physical co-presence is by far the largest cost center of higher education.
Allison Pugh: But it's also the largest value center.
Mitchell Stevens: And my sense is that there's a deep connection between what you're talking about and that, which is the reason we oblige and expect our students to reorganize their lives to be in each other's physical co-presence: so that they can do and practice that kind of connective labor.
Allison Pugh: I mean, exactly. If education is more than information download, then you have to be physically co-present. To the extent that it's just about downloading information, sit at home, you don't need anyone else.
But if it's about seeding something inside you that is going to respond to the other person and produce something together, that is a reflection of the two of you interacting or a community of people interacting, that's co-presence. There's nothing you can do about that.
Mitchell Stevens: Sounds like progressive pedagogy to me.
Allison Pugh: Exactly.
Mitchell Stevens: Allison, thank you. I hope you come see us again very soon and very often.
Allison Pugh: Thank you, Mitchell. I so appreciate, first of all, your close read, you know. There's something a lot like connective labor in being seen so well by someone who, you know, kind of is interested in some of the same questions. So I really appreciate your time.
Mitchell Stevens: You make it a pleasure.
Narrator: That was Allison Pugh in conversation with Mitchell Stevens. As always, you can follow us online or in your podcast app of choice. And if you're interested in learning more about the Center's people, projects, and rich history, you can visit our website at casbs.stanford.edu.
Until next time, from everyone at CASBS and the Human Centered team, thanks for listening.