
In the inaugural episode of the Future Perfect Book Club, Ron Bronson and Joel Goodman delve into Craig Gent’s “Cyberboss: The Rise of Algorithmic Management and the New Struggle for Control at Work.” They explore how algorithms are shaping the modern workplace, often in ways that diminish worker autonomy and dignity.
Ron and Joel discuss the increasing disintermediation of human interaction in favor of algorithmic control, citing examples from ride-sharing apps to Amazon distribution centers. They examine the book’s anecdotes of workers experiencing isolation and a lack of communication, as well as the elevation of algorithms to a “godlike” status in organizational cultures.
The conversation touches on the allure of convenience that drives the adoption of these platforms, even in the face of known risks and ethical concerns. They dissect the implications of “just-in-time” logistics, the role of “subvisors” in the management structure, and the dehumanizing effects of algorithmic scheduling.
The hosts also analyze how these systems perpetuate capitalist structures and echo the principles of Taylorism, all while removing accountability and consolidating power at the top. While acknowledging the book’s left-leaning perspective, Ron and Joel emphasize the importance of understanding these trends and finding ways to disrupt them. They also share some laughs along the way, despite the episode’s heavy topic.
Chapters
00:00 Welcome to Future Perfect Book Club
01:16 Introducing “Cyberboss”
02:32 Synopsis: Algorithmic Management and Disintermediation
03:11 Anecdotes from the Book: Worker Experiences
06:15 Algorithms as Gods
07:17 The Allure of Convenience Despite the Downsides
11:00 Trusting the System
14:24 Implications of Control Structures: Subvisors
16:50 Dehumanizing Worker Dignity
18:47 The App is the Boss
20:09 Compliance Is Built into the System
23:35 Capitalist Structures
25:37 A little bit about Taylorism
33:39 Wrapping up this book
38:44 Announcing the next book
Ron Bronson: I’m Ron Bronson.
Joel Goodman: and I’m Joel Goodman. This is episode one and we just read Cyberboss: The Rise of Algorithmic Management and the New Struggle for Control at Work by Craig Gent.
Ron, I’m so excited to start this podcast.
Ron Bronson: For those at home, Joel and I have known each other a very long time. And it’s a shock to anyone who’s known us a long time to think that we never had a podcast together.
Joel Goodman: Yeah.
Ron Bronson: The reason is you want to stay friends, so you don’t make a podcast. But then we finally got a good idea for one.
Joel Goodman: Well, our conversations are always great when we’re together, and we’ve always talked about how we should just be recording them anyway. And so I’m particularly excited to be reading the same book with you and then talking about it, because your background in design and digital transformation and change management and civic work and education work and all kinds of other stuff, I think, is super, super interesting.
And you know, I also want to talk about your Consequence Design framework and how you think about how design affects everything that we do in our daily lives. Like, I’m excited to be able to dig in and take these books that seem consequential, or maybe they’ve proven to be consequential, ’cause we’re definitely gonna hit some classics at some point.
Ron Bronson: We definitely are.
You know, so the one good thing is we know each other really well, so we know each other’s strengths and what we’re into. And so Joel, of course, beyond all of his work in, like, design systems and working in front-end development for a long time, is obviously a business owner, an agency owner.
Joel’s real passion is actually semiotics and media studies. So Joel got me into a lot of that stuff. Like, I wasn’t really engaging with McLuhan back in, like, 2015, when we were kind of really becoming really good friends, and I’ve gone on my own journey now to deeply getting into more critical theory. But really it started with a lot of our earlier conversations, around you being like, “Oh, you should read this.”
Hey, here’s this quote. And so I’m really pumped that we get to do a thing that allows you to delve into stuff that you’ve been talking about for a really long time. Like you didn’t just start on this train. You’ve been deeply embedded in this space for a long time. And I think that a lot of that thinking is super relevant where we are in the world right now.
And this sort of crossroads of all technology and obviously media and critical media converging in this really, frankly, sort of destructive way.
Joel Goodman: Honestly. We see it all the time. And there’s just so much going on, and I think we’re going to hit tons of that in this series. I’m really, really excited, so let’s kick it off, Ron. Book number one, you made the recommendation for it: we are reading Cyberboss by Craig Gent. What’d you think?
Give us a synopsis.
Ron Bronson: So for those at home who have not quite read this book yet: Cyberboss is talking a lot about algorithmic management and how platforms are controlling work in this sort of modern economy. And as someone who spends a lot of his time thinking about, like, local places and cities, and about atomization, I think a lot about the ways that algorithms are ruling the way that we live our lives.
I’ve been talking a lot about disintermediation. So things that used to be person to person are now being, you know, interrupted by algorithms. Who knows who built it? We don’t know what they’re doing. And so this book gets into that and really gets in the weeds on it. But it’s also kind of breezy. Like I told Joel, it’s kind of short, we can get through it, and we did.
Um, what did I think?
I, you know, so for me, the parts that I thought were interesting, obviously, were some of the confirmed things that I had known about, related to things like, you know, a lot of the ride-sharing apps and the ways folks are leveraging these things. But also getting in the weeds of how distant these platforms are from their workers.
So there’s a guy in one of the chapters talking about how, after he got the job, he didn’t talk to anyone for three weeks. And it wasn’t until he missed a couple days of work that he got to talk to somebody. And it wasn’t even like it was a big deal. They were like, all right, you go back on the road now. Um, and so that tripped me out a little bit.
Um, so lots of stories like that. Hearing those anecdotes was, I think, what I found the most valuable in it.
Joel Goodman: I found those stories really enlightening and I think they were definitely the part that keeps you a little bit more engaged. Because at some point, it shifts into this sort of dystopian outlook on life, right?
And you realize, uh, you know, Craig interviewed a ton of people for this study.
There’s a lot of research, a lot of interviewing, that went behind creating this work. And he goes into Amazon distribution centers. Well, he talks to people; he kind of went covertly into an Amazon distribution center in the UK a couple of times. But talking to these workers that are in the Amazon facilities, or at major distributors for grocery stores, or, like you said, the ride-share and food-delivery folks, it’s so interesting the ways that the bad, I guess, or what I considered the dystopian side of it, is kind of hidden.
I don’t know if it’s purposeful. I don’t know if it’s like just very enterprise thinking around it, but it was interesting to see the contrast of, this looks very dystopian from the outside, it looks very, computer controlled workers doing, you know, worker bee type of stuff without any kind of autonomy.
And, you know, at times talking about how the algorithms have gotten so specific, to the point where that worker only knows what their next step is. They don’t know anything else. They have no autonomy within the process that’s there. At least, the algorithms are trying to force them into not having any autonomy during their work days.
That was interesting and in contrast, especially to how the people in those jobs respond to it and how they think about it and how, how they react to it in different ways. Because there is a part of this book that takes a critical eye to how effective I guess trade unions are in supporting these workers and getting them more rights.
Ron Bronson: Yeah, I, I chose not to engage with any of those parts of it,
Joel Goodman: It’s probably
Ron Bronson: because they, well, because so many of these books tend to go in that direction. And this is just a personal critique that I have, y’all can at me if you want about this, but so many tech people right now, it seems like their answer to everything is, oh, you should organize your coworkers. And I’m like, most of the time I’ve been in unions, it’s been jobs I had part time. When I taught at university, you joined the union.
Joel Goodman: When I was an intern at the Boston Globe, when I was out of college, they made the interns join the union. So it’s small situations, but, like, it almost feels glib. Folks are just looking for some kind of systemic way to, like, push the issues of what’s really happening.
Joel Goodman: You know, my media studies heart and semiotic brain really latched onto how algorithms have been elevated to this kind of godlike status.
And there’s a whole section in here where, Craig talks about the relationship of how the internal cultures that these algorithms kind of create, correspond with religion in a lot of different ways. I hadn’t really thought about that.
And when you think about how you walk around anywhere, you know, like, I work with higher education institutions a lot, and as they’re weighing new technologies, they’re always thinking, you know, “Oh, the algorithm is greater.” I talk to my friends that are social media managers, right? “Oh, we have to control our algorithm.”
And the algorithm ends up being the end-all, be-all. We serve that algorithm in some way. And this is just a very specific, very insular, very consequential way of the algorithm messing with someone’s life, because it is their livelihood. It is how they’re going to progress in life, cover their bills, you know, that sort of thing.
And so, yeah, I want to kind of talk about that a little bit, and kind of what your thoughts are on that, Ron, because you and I, you know, and everyone we know, we’re very drawn to the convenience of these platforms. But we also know these stories.
We’ve heard the stories of how, back in the day with Uber and Lyft, or Uber especially, the dangers for drivers, or the dangers for riders with drivers, them being potentially attacked or assaulted in some way. Also the pay issues around all this, and there are all kinds of weird, sticky things all the way down that can be a little bit tough to talk about.
Well, why do you think we’re just drawn to the convenience despite knowing some of those things?
Ron Bronson: Being the resident Trekkie of this conversation, for the non-Trekkies, just roll with this, look it up on your phone: there’s something very alluring about replicators. This idea that it’s instant, just in time, magical; it shows up. And obviously there are other ways to sort of interrogate this, right?
But I think, you think about the future, it’s just, the robots are bringing you a thing. And how cool is it that I push a button and it’ll be here by 6 p.m.? Oh my gosh, that’s amazing. Um, and so, you know, it’s the future George Jetson wanted for us. And frankly, I guess, you know, taking a less cheeky stance about it, thinking about it from a pure convenience perspective.
You’re a parent; you could save that hour a day. How can you use that hour? And if I don’t have to go to the store, you know, I want the same things, but I’m not concerned about where it comes from. I just need to have it, so that it gives me the time to do other things, because we are stretched more than we used to be.
So I, like, appreciate, I don’t judge anybody for not accepting the world as it is. Yeah, we all grew up going to Kmart or whatever, if you live in the US, but, like, I don’t lament the loss of these experiences, because frankly, the in-store experience has been diminished so badly. There’s nothing really nostalgic about it to me in its current state, ’cause it doesn’t feel like it used to.
So I think, honestly, it’s just, people are stressed. They’re afraid, they got a lot going on. And I get it. Like, I understand why having this proliferation of platforms allows you to do the things you were going to do anyway. That’s the key. Maybe you’re buying more, but at the end of the day, you were doing this before, and now it’s just easier to do it, irrespective of the externalities, right?
Joel Goodman: Yeah, for sure. I feel the same way. To your point about, you know, replication and the instantaneous gratification of something showing up and it feeling magical, and whatever serotonin hit that brings us, we see that all over the place at kind of lighter, more shallow levels, right?
I think that’s the appeal of using generative AI. Despite the fact that we know it’s destroying the environment, and it sucks up so much energy and power in a lot of ways, and despite not knowing kind of what has gone into it, even though we do know what’s gone into training these things, we’re still using it, right?
There’s this magic quality to it. But once you see how this sort of efficiency-at-all-costs, behind-the-scenes work happens, that’s where I get a little mad, actually. And, you know, it goes back to this algorithm-as-God cultural argument. It doesn’t matter what the consequences are in the long run or even in the short term; we have to serve that algorithm, because that algorithm is the thing that’s telling us what song is going to come up next on Spotify or Apple Music or Tidal or whatever. It’s the thing that’s going to suggest the next meal from DoorDash, or the next thing to purchase on Amazon.com.
And we have to feel like we have control over it, but, the reverse actually is happening. We’re actually in service of that algorithm because it’s all made to kind of subjugate our wants and subjugate our time.
Ron Bronson: For people, I guess, that are reading at home and thinking about the book, a section of the book talks about this on page 167. It says, just trust the system. It’s quite quasi-religious, actually, and it talks about how these things are everywhere.
It really is something people don’t engage with in a critical way. They just don’t think of, like, the back end of it.
Joel Goodman: Yeah. Just trust the system. And that, that’s, that’s been a workplace mantra for a very, very long time. Right. And I mean, even as, a business owner, when I talk to, other business owners, or managers, or CEOs, people in those, positions, it’s just like, Well, just, just trust that you have a system in place, just trust the system and everything’s going to turn out fine.
But when that system becomes something that’s programmed and relatively autonomous, you can’t question that system at all because you don’t even know what the system is, right? At least as a business owner, I know what systems I put in place because I probably wrote them. Um, or I worked with someone to write them, you know, and collaborated with them.
Um, but when that system is a mathematical equation and a system of software that is making predictive choices, or very consequential choices, managing people’s time, you know, when they work, if they can work, that sort of stuff, it’s a different thing, because it’s detached from humanity, right?
It’s detached from an actual person.
Ron Bronson: You actually have me thinking about a quote from page 17, where he talks about just-in-time logistics and how “time is always money. It was in fact supermarkets’ enduring obsession with efficiency and minimizing waste, including the workforce, that sabotaged their ability to fill their own shelves.”
So we saw this during the pandemic, right, with all the toilet paper and all the different things that were not being provided. Obviously some of these were supply-chain issues, but a lot of this is that stores have gotten so good at hyper-efficiency, and at knowing what people are going to buy, that they under-buy what they need so that there’s not any waste, so that they save a few dollars on the back end. But this caused some severe problems in a, you know, once-in-a-lifetime situation, certainly relative to the world we live in.
And so I really appreciated him digging into that. I think this has been an interesting trend; we’re going to see more and more books contending with the pandemic era, not that we’re out of it, but that era when we were all locked in our houses, to really reflect on ways that the system failed. Because there were so many fissures in the system at that time that were not super clear to us before, and it was really interesting for me, I think for a lot of us, to see sort of the breakdowns of those things that always just seemed to work
Joel Goodman: Hmm.
Ron Bronson: understanding exactly why and where those things were. So him getting into that was really interesting for me.
Joel Goodman: Yeah, we definitely, I mean, you see that during natural disasters too, right? My wife and I lived through the giant freezes that happened in Texas several years ago, that kind of ruined the power grid. And it was the same sort of thing: no toilet paper, you know, everything’s gone. But you see that during any sort of pre-warned natural disaster sort of thing.
And that’s because these algorithms are built, this just-in-time fulfillment and distribution logistics type of stuff is built, to work in precedented times, in normal times, normal operation, right? There aren’t backups. I mean, it’s kind of like the Great Depression.
There’s no backup for when there’s a run on the banks. I mean, there kind of is now, but, like, it’s that same sort of thing. What do you do when a wrench is thrown into that system?
Uh, you know, one of the things that I’m really kind of stuck on is what the implications of these different control structures are. One of the big points that he makes in the book is that, at some point, when the algorithm is kind of controlling everything, right, the managers, or the supervisors actually, I think he says the supervisors, become subvisors, because they don’t really have any control either.
There’s this layer between upper management and then everyone else, and you have kind of a two-class system below the algorithm, where it’s supervisors, which really don’t do anything, you know, they’re there as, like, backup help maybe, but they can’t really do anything anyway.
It’s just like, oh, the algorithm said something and someone has a question; they can answer the question. Um, but mostly all they’re doing is serving the algorithm again: making sure that, yep, that person was clocked in for the right amount of time, they worked the right amount of hours, they’re hitting their efficiency levels. But even then, they can’t do anything about those efficiency levels except chastise, and maybe do some sort of workplace punishment for people not hitting those specific efficiency goals, right? That’s where it kind of scares me a little bit.
Like you mentioned earlier, one of the people that he interviewed talked about how this was the job where he had spoken the least to anyone.
Like, there’s just no communication. And I think that’s a direct consequence of the algorithm being the thing that is running everything. It detaches the actual person from anything else, right? Like, it just sucks the humanity out, and they become workers.
I think it was in this book, it might be something else, but I think it was in this book, he talks about how it’s just become more affordable to hire people, because they’re disposable, than to put robots in place doing this stuff. ’Cause some people will ask that, right? Like, well, why doesn’t Amazon have robots that are picking these things and packing them up for distribution?
It’s because you then have to hire someone that is skilled to fix that robot, and they’re going to cost more, on top of the robot than just hiring a bunch of people that don’t need any special skills and replacing them when they burn out or quit, or aren’t hitting the efficiency levels that you need them to.
It has nothing to do with humans whatsoever. It’s only about how useful they are to the corporate structure.
Ron Bronson: Yeah, I think you’re bringing up that dehumanizing-worker-dignity piece of this, which he talks about early in the book, really throughout the book, right? Talking about workers, or working conditions. Do you think that people, like, everyday people, have really contended with the downstream effects of these platforms on the people who are doing the work?
Like, based on what I see online, that is not the case. Like, people on Reddit do not. Like, “why didn’t my DoorDash slave bring me my food faster? I’m not gonna tip him anything. In fact, he should be paying me for the food.”
Joel Goodman: No, I mean, that happens all the time. And I think, even from that perspective, our reliance on these sorts of technology-driven and algorithmically driven delivery-type services, you know, there are other things with that, but, like, delivery services or digital experiences, it affects how we interact with other industries as well.
Like, you know, I mean, when you feel so entitled to your cheeseburger showing up on your porch, even though you know it’s going to be cold and it’s not going to taste as good as picking it up from the drive-thru or whatever, that filters over to when you go out to an actual restaurant, and how you treat the hospitality staff that are working in person and are working hard.
And they’re in jobs that are supposedly more humanizing, because the majority of the hospitality industry is not run on algorithmic scheduling and things like that yet. Um, and I think there was actually a comment in the book, one of the delivery drivers said, you know, “if I was working at a bar or restaurant, sure, they can cut me early, but at least I know why I was cut early. I know that there wasn’t the traffic.”
Ron Bronson: Yeah, you know, you saying that reminds me of the times I’ve been in rideshares. Which, frankly, I think rideshares are terrible, they shouldn’t exist, but, I mean, you know, I also use them. And one of the things I like to do when I use them is to ask the drivers, you know, “how do you like this?”
Like, “talk to me about the app a little bit.” Like, I just love knowing more about this, because, again, your boss is the app, right? And at this point it’s so sophisticated. In the old days, it was much less sophisticated. It’s so sophisticated now that they really do work for this app.
And it’s interesting, because across the board, they all like it, because it gives them a chance to work when they want to, and they get paid, and bonuses, and blah blah blah. And I’ve heard all kinds of stories at this point about, you know, bonus rides. But my favorite one, and I’ve had this happen more than once, in different cities all over America, is where the driver is done for the day and doesn’t want any more rides. Because the app will give you rides, and I guess you get docked for turning them down.
And they’ll say, “hey, we’re almost at your spot, do you mind? I’m gonna say you’re there already. I’m going to take you there, but I’m turning off the app; I just want to stop getting offers. ’Cause what’s going to happen is, I’m on my way home ’cause of where I’m taking you, but if I’m not careful, it’s gonna take me further away from home, and then it’s going to cost me money, basically.”
And I’m like, no, I know. ’Cause it’s happened enough times that I know when I get asked this. This has probably happened five or six times, where it’s like, “hey, you mind?” And I’m like, “no, you’re good, man. I know what’s going on.”
And they’re like, Oh, thank you so much. Like.
Joel Goodman: yeah,
Ron Bronson: But it’s wild, because again, to your point, if my boss was you, “Hey Joel, uh, I’m done for the day, I’m gonna clock out and go home.”
“Alright man, see you tomorrow, good job today.”
Joel Goodman: Yeah, right.
Ron Bronson: You can’t talk to your robot boss about that, because your robot boss doesn’t care about you, doesn’t know you exist as a person.
Right? It’s a video game.
Joel Goodman: Yeah, it’s built to give you that next step, and to dock you if you don’t comply, right? It’s all about that compliance. Well, so this is interesting. I wrote down a note when I was reading, and I think that’s a good example, specifically the ride-sharing drivers, but there’s kind of two sides of this.
There’s algorithms enhancing control, but the other side of the same coin, I think, is that it also gives you an illusion of control, right? So, like, with those drivers, yeah, it kind of gives them more control over what they’re doing.
But at some point that tips into it being really only a facade. You know, you think you’re in control and that you’re running your own business, because you’re an independent contractor, you know, whatever else. But at some point, where does that become detrimental, and how do you, as a worker, cope with figuring that out? You’ve got to figure out kind of what that line is.
And the problem is that the algorithms, and the people in charge of the algorithms as well, you know, they’ll notice that, oh, efficiency is going down, someone’s figured this out. We can change the goalposts; we can move that somewhere else. So you don’t know where that line is anymore.
And that’s one of those tactics that happens in these distribution fulfillment centers, right? Where you have a quota, where you’re supposed to get a certain amount of work done, and you may or may not know what that quota is. And I would argue that the bad companies are the ones that don’t tell you what that quota is. But the point is that if you aren’t hitting that quota, you’re going to be told that you’re not hitting your quota, and that you need to do better, and that there’s going to be disciplinary action. You might get fired, you may not be called back for another shift, you know, whatever else.
That’s the part that creates a ton of stress for people, right? When you don’t actually know what your goal is and you’re just working blindly, um, and the fear of retribution is still hanging over your head.
Like that’s got to be incredibly stressful, but I don’t even think most people know that that happens. Most of us that don’t work in those positions don’t realize the incredible amount of stress that those workers are under.
Ron Bronson: Boy, you got me thinking three different things at once here. I’m gonna talk about a quote from the book that touches on this, and then I’m gonna talk a little bit about Taylorism, and then I’ve got an anecdote about Finland, because there wouldn’t be a podcast with me on it if Finland didn’t come up.
Ron Bronson: But this is in the beginning of the book. He talks about how, in many ways, “the jobs that are now being created are less secure, more stringently managed and paid worse relative to the cost of living than ever before and left unchecked. This may well lead to forms of work that are increasingly stressful, injurious and dehumanizing.”
Which is like, uh, yeah. Right. That we all know. What’s funny is, of course, these days, again, spending a lot of time online, what is the most derisive thing people can say to you on any kind of Twitter clone? “Learn to code,” you know?
Joel Goodman: Yeah. Right. Yeah.
Ron Bronson: So this is the response to anyone having a bad job, which is awful. But I think, to your point, this is just the beginning of this. In 10 years, certainly in 20 years, this type of work will be the only way people will have ever known to have worked, and it will get so efficient that you won’t know anything different. And that is horrifying to consider.
Joel Goodman: Yeah, it is. And everyone thinks that automation is going to make less work for everybody and more money for everybody. And we’re all going to have way more free time.
And that is not the case, because in capitalist structures, the way that capitalism is designed, the people that are in charge are going to make more money. Just because you’re more efficient doesn’t mean you’re going to make more money. That actually means, in a lot of ways, you’re costing people less, because you have way more output and they’re getting paid the exact same amount. And if they can cut corners and maximize their margin and still get people to work for very, very little money doing a good-enough job, they’ll continue to do that. It doesn’t expand.
This doesn’t help anyone in the bottom rungs of the socio-labor structure make more money and live better. None of it does. None of the automation does. And this isn’t about taking people’s jobs, right? The automation is not about taking people’s jobs; it’s about changing people’s jobs, and decreasing the amount of skill and knowledge that is needed for that work.
So that everyone is on the same level, the same playing field. No one has to be better than anyone else. And it makes that gap, to become a programmer, or to become a manager, to become someone else that might have a shot, way wider and way deeper, right? It just keeps expanding that chasm. And so, you know, to your point about those comments online, like, yeah, “just learn to code.”
Like, I mean, I know that argument. Douglas Rushkoff wrote a whole book called “Program or Be Programmed” years and years and years ago, when there actually probably was still a chance for some good people to learn how to program and make some change. But that doesn’t fix everything, unless you’re learning to program and code in very specific ways, and, I mean, even those are honestly still not good.
They’re still working to disenfranchise people from their humanity. I don’t know. It’s difficult.
Ron Bronson: We’re upbeat. It’s the most upbeat podcast you’ve ever heard in your life. No,
Joel Goodman: I said it was dystopian. I said, we
Ron Bronson: We warned you. We warned you from the start that we’re fun. But this book, this topic, the book’s great, the topic not so fun. Uh, you made a lot of good points.
And I do want to touch on Taylorism a little bit. So for the folks at home who are not as read into Taylorism: Frederick Winslow Taylor was an American industrial engineer. In the late 1800s, he was a chief engineer at this company called Midvale Steel Works. He basically made up something called scientific management, which we’ve named Taylorism after him.
The basics of Taylorism, you’re going to hear this and go, “oh, that sounds right”: take skilled workers and break their work down into simple, repeatable tasks that managers can control. So this is why your McDonald’s and your Arby’s work the way they do.
But more importantly, in management in business corporations: you study how work gets done, you break the complex jobs down into simple tasks, you create strict rules for each step, you use data to monitor the workers, and then you take the knowledge from the workers and give it to management. And so the idea, of course, for Taylor was to create managerial power without provoking workers into collective opposition.
And so you have this really deft thing that goes on here to create a labor force that’s like, they may not be happy, but they’re not mad at you. While also giving managers the ability to “more efficiently” (and I’m putting that in quotation marks) manage what the outputs are.
Um, obviously we do this at a high level, ’cause all the algorithm stuff right now is just Taylorism plus computers, right? That’s the most simplistic way to understand what’s happening.
Joel Goodman: One of the things that stood out, that I don’t think had been distilled this way for me before, is when you’re building that kind of data layer, which, you know, eventually became algorithms. I mean, it was algorithms back then too, but more sophisticated algorithms today.
There’s no one to take the blame, because there’s so much of a gap between the workers, the supervisors, and management. Management’s learning from all of this data and everything’s coming back up, but, oh, it was the algorithm’s fault. Like, if something happens on the floor or whatever, oh, the algorithm did it. You don’t have to take any sort of responsibility for it.
And that even further consolidates that power in that upper strata of management. I sit here and I want to give humans the benefit of the doubt that they’re not doing this to basically create another form of slavery, one that doesn’t look as much like slavery because someone’s getting some money, even though they can’t live off of it. But that subjugation
seems so purposeful, and when you see things like how Taylor wrote that framework, and how it’s been applied, and how far it’s been taken, like, man, it really doesn’t make me want to be a capitalist, you know?
Ron Bronson: What’s funny about this, ’cause you’re touching on this a lot, and I agree for the record, but I want to engage with somebody listening to this and thinking to themselves, oh man, but I love the market, and the market is right, and you can’t deny all the good things that have happened.
I’m gonna concede you, uh, naysayer, that. Right? Like, you’re not in this conversation, but I’m gonna concede you that. The bottom line of what Joel’s saying, and what this book’s talking about, and really what these arguments are, or what ultimately we can say is that, these things don’t have to operate this way.
You can still have all of this stuff without it being extractive, right? And through my writing and a lot of the work I’ve been doing over the last year, really beyond this Consequence Design idea, which we don’t even get into in this talk at all, it’s talking about something like disintermediation, which is really about using technology to remove the human relationships and expertise that we have as humans.
Right? And to depend on platforms instead, using technology to be the intermediary. Because if the machine learns what I know how to do once, it can do 80 percent of what I can do. That 20 percent is the most valuable part of what I can do. But you, the person designing the algorithm, do not care about that.
All those edge cases. Your example about not being able to have someone interfere or intervene when something’s bad is my biggest, like, the biggest argument that I make: people on the ground should be able to leverage their common sense to solve solvable problems.
Like, the example that’s most salient for me is being a kid at the department store, having a problem on the floor. I want to speak to a manager because, you know, as the customer, you know the manager is empowered to probably solve your problem. You and I both worked a lot of retail growing up, into our college years. And so we both know that even if you were just the key holder, forget being an assistant manager, you had enough authority to be told, hey, look, if somebody comes in and something’s broken or they need a refund, go ahead and deal with it, because it’s easier to solve the problem at this level.
And what I appreciate about that model isn’t even that the managers allowed it. It’s that the companies understood, at their level: I would rather have you refund someone 50 bucks, 100 bucks, one off. Like, someone’s going to grift you, and eventually you’ll be smart enough to stop that. But I would rather you do that than have to deal with unhappy customers or long lines or other problems that are going to cause the company a much bigger problem.
And so I love that that was the world we grew up in. That world really doesn’t exist anymore. Maybe at your local coffee shop, maybe. But increasingly, fewer people these days are adept at navigating that. And it’s really wild to me, because we solved this.
We already solved it. We did it. We saw it. And now we’re retreating to this other place. It’s really interesting to me.
Joel Goodman: Yeah. And it’s not about customer service or hospitality or any sort of public goodwill. It’s about more revenue for, you know, whatever the corporation is. And the knock-on effect of outsourcing that to the algorithm to be as efficient as possible is, well, sure, yeah, you’ve increased that revenue, and it’s going to keep getting as efficient as possible.
It’s going to keep tightening that screw right? On the other side of it, those people at the top that put the algorithm in place can still go home and sleep at night because they know they didn’t make the decision.
The algorithm is running everything. They can be the talking head on television, and they can do the interviews and seem like a person, you know, come across as a human being. But I look at all these structures. I look at the emphasis on efficiency, especially in tech, because those are the circles that we run in now.
Right? And it worries me when I see civic organizations being taken over by technology CEOs, or those people coming in to help make that sort of thing efficient. Because efficiency in, say, government: government’s designed to be bureaucratic because it deals with real people and affects real people.
It’s not just about making money. And so when you apply that sort of efficiency and revenue and savings at all costs, and take away that humanity, you’re no longer building organizations that are for the people they were built to serve.
Right. There’s a reason bureaucracy exists. It exists to make sure that there are checks and balances, to make sure that people actually still have a voice. That people are actually heard and are present in the process.
And I think the same thing applies to businesses. It’s the small hometown businesses that have real people in them. They’re the ones that we normal people like going to, and they’re the businesses we still frequent.
It’s the cold, technology-based ones that we don’t. And you see it with social media, like, you see it with the public circles that we’re in online, and how at some point they get to this point where the algorithms start to make it cold and no longer appealing to you. And then you complain about, oh, you know, my Twitter feed is whatever, or Facebook got too, you know, salesy, and, I mean, my Instagram has got all kinds of ads in it.
You know, it’s like all that kind of stuff just piles on, and it doesn’t just affect the people that are working in it, though it probably affects them the most. It starts to trickle into how we think as a society and as a culture, and it makes us more accepting of things that really aren’t good for us and really aren’t good for the rest of society.
Ron Bronson: Wow. To be fair, friends, this is how we talk to each other. So, like, we don’t need a book to do this. This is literally how we engage. It’d be a little bit less dense, but this is basically just how we would roll normally. So as it relates to sewing up the rest of this book, I really appreciated the relative pithiness of it. I think it’s relatively accessible as a book, for a topic like this.
I read a lot of books about this kind of stuff, you know, relative to logistics and things like that, and this is a pretty approachable book, I feel like.
Joel Goodman: The Marxist labor theory, Marxist labor theory is a little bit dense, so, like, be prepared for that if you’re jumping in.
Ron Bronson: I think most people can skip that and get to the thingies. This is a book you can skip around in, I feel like, and just engage with the parts that are interesting to you. Say you’re someone who’s, like, hella into Instacart, and you use all these platforms, and you love this stuff all the time, and it doesn’t occur to you why they’re so bad. I think something like this could be really interesting for you, to kind of engage with the higher levels of this. Cause this isn’t about today, this is about the world that your kids, your friends’ kids, will grow up in.
I think that’s the thing. And again, I am definitely on the record as a naysayer and as a critic of a lot of things, so I want to be very, very clear about my bona fides here: I think all this shit is toxic and shouldn’t exist. But it’s too late. It already exists, and it’s going to continue to proliferate. So we, as individuals living in this world, have to ask ourselves where the bright lines are. Okay, it’s gonna happen, it’s gonna exist. AI, a la large language models, this stuff’s gonna exist. But it doesn’t mean it has to exist in the framing that the people flooding our markets with it want it to.
We can say there are ways to push back. New York City’s basically banned Airbnbs, and people of course hate that if they’re visiting New York City, but the residents love it. And I think that’s great.
For people that live in these places to decide, you know what? We tried it, there are pros, but the cons don’t work for us. And so here, in this place, we’re not gonna allow that. And we see that with other things. And so I’m glad we chose this as our first book. I think that, to your point, this is definitely not a moderate-toned book.
This book is definitely written from a left-wing perspective, so do engage with it with that understanding. But I do think that no matter where you stand on that spectrum, or if you don’t engage with that spectrum at all, there are still really salient things for you to get from it, because this isn’t made up.
These things are actually happening in the real world. And we as users of these platforms and these tools need to contend with what is happening. Because to summarize a lot of my research and a lot of my work on this: it’s really about the externalities of these platforms. All the delivery drivers, all the, you know, the delivery food and all these things, your communities are having to contend with this.
And it’s costing people jobs, their livelihoods, their dignity. And over time, your community is going to continue to deal with this because your society wasn’t meant for that. I’m disengaging with any of Joel’s discussion about bureaucracy right now because we’re in a very weird time of bureaucracy and I don’t want to get on a separate rant.
But, you know, once we have a Patreon, you can get that in a separate edition of our show. Because then I’ll really go ham. But we’re not there yet. To that point, we really are seeing this post-pandemic. We’re seeing the side effects of what it looks like to be in the early stages of the gilded platform age.
And how the externalities of this are wrought onto our hometowns and into our states. And we really have to start contending with this because it’s not going to improve if we don’t figure out what the costs are of letting all of this stuff just dump all over our towns, when the laws, the policies, and the society haven’t caught up with it yet.
We haven’t caught up with this yet. So, we have a lot of work to do.
Joel Goodman: We do. And I also want to point out, along with that, Ron, that I think Craig Gent, in writing this, is actually fairly optimistic. He’s way more optimistic than I am about most things, which is why I kind of latched onto those darker, more insidious points that he made. He opens it up for, you know, us to have ideas about how to fix this stuff.
Right. And so, like, that is the point of this book. He’s not just all doom and gloom. It’s a factual account, like you said, a factual account of what is going on. It’s an exploration, a complete critical engagement with how different management philosophies are integrated into this and are perpetuating these sorts of things.
He goes on to talk about how these systems can be disrupted by the workers, and by those of us on the outside. Like, there are ways to contend with this. But it does call on us to do something about it.
I’m hoping that this podcast, and the future books we read, and the conversations that we have, inspire some sort of thought and conversation around these topics. And, for the record, I want to be involved in it. So, like, if you’re listening to this or a future show, come at us. Bring us some ideas and some questions. Let’s have additional conversations, let’s keep this going.
Ron Bronson: Seconded. Speaking of our next book, you can already get started reading, cause we’re gonna be reading it and then we’re gonna talk about it. It’s Liz Pelly’s new book, “Mood Machine: The Rise of Spotify and the Costs of the Perfect Playlist.” Joel and I really became true friends because of our shared love of music, our shared love of playlists and stuff.
And so we’re really excited to dig into this one. We know Spotify sucks. They’ve ruined the whole platform. Spoiler alert. But I am really excited to read this book. I bought it, but I haven’t really delved into all of it yet. And I’m really excited to dive in and talk about this topic.
Joel Goodman: And, you know, it lets us look at a different side of algorithms and what algorithms are doing for us. So I’m also stoked. So make sure you find that book. We’ll put a, I don’t know, we’ll find out how to get the author some money off of these sales beyond just the normal stuff.
We’ll put a link in the show notes. If you want to grab that book, do it, read along, and join us next time. And hopefully we’ll have a good conversation that you can feel you more actively participated in.
Ron Bronson: Absolutely. Thanks for listening to the Future Perfect Book Club.
Joel Goodman: You can find Ron Bronson @ronbronson.com, also ronbronson.com is a website and you can find me @joelgoodman.co Also joelgoodman.co as a website. That’s really actually pretty easy. I’m glad about that.