The Artisan and the Apologist
August A.-M. leans over a trestle table in a shop that smells perpetually of ozone and old dust, his hands steady as he scrapes 1946-era lead paint off a curved glass tube. It is a slow, dangerous, and decidedly non-digital process. If he slips, the glass breaks, and the neon gas inside, a substance that has glowed for decades, vanishes into the rafters. There is a physical consequence to every movement he makes. He told me once, while gesturing toward a buzzing transformer, that the hardest part of restoration isn’t the electricity or the glass; it’s the fact that modern materials don’t want to talk to the old ones. They are fundamentally incompatible, and he has to act as the interpreter, the one who makes the past and the present agree to occupy the same space without exploding.
Compare this to Clara, who sits three miles away in an ergonomic chair that costs $676. She is a ‘Technical Support Lead’ for a logistics platform, though she often jokes that her title should be ‘Professional Apologist for the Inevitable.’ Clara is currently staring at 16 browser tabs. One tells her that a shipping container is in the middle of the Atlantic. Another insists that same container was delivered to a warehouse in Nebraska six hours ago. A third tab is a chat window where a customer named Greg is typing a series of increasingly creative insults because his automated inventory hasn’t updated. Clara has to maintain a voice that is calm, professional, and slightly cheerful, even as the software she’s using begins to hang and the internal Slack channel reveals that 46 other agents are seeing the same discrepancy.
Physical restoration is hands-on and tangible, with direct consequences. Digital apology is abstract and emotional, with indirect ones.
This is the fastest-growing sector of the labor market, and we don’t even have a proper name for it yet. It isn’t just ‘service.’ It is emotional shock absorption. We have built digital systems that are incredibly fast but notoriously brittle. When these systems ‘twitch’, when an API fails, when an algorithm hallucinates, or when a database enters a race condition, the system itself doesn’t feel the stress. The stress is passed directly to the human interface. Clara is the human gasket. She is the soft tissue between the cold, hard logic of a failing server and the hot, frantic anger of a human being who has been promised 100% uptime.
The Digital Analogy of Neglect
I found myself thinking about this while cleaning out my refrigerator yesterday. I threw away a jar of cocktail onions that expired in 2016. I don’t even like cocktail onions. I kept them through two moves and a divorce, likely because I felt that throwing them away would be a waste, ignoring the fact that they had long ago ceased to be food and had become a tiny, green monument to my own inability to let go of useless things. We do the same with our digital architecture. We layer new, shiny interfaces over ancient, rotting code from the nineties, and then we wonder why the system glitches. Instead of fixing the rot, we hire people like Clara to stand in the gap. We expect her to stay calm while the world around her is digitally vibrating with errors.
There is a specific kind of exhaustion that comes from this work. It’s different from the physical fatigue August feels in his shoulders after a day of sign restoration. It’s a thinning of the soul. In any given hour, Clara might have to perform 26 different acts of emotional regulation. She has to ‘surface act’, putting on the smile and the polite phrasing, but she also has to ‘deep act’, convincing herself that Greg’s anger is valid so she can respond with genuine empathy. If she fails to do this, the customer feels the falseness, and the situation escalates. This is the ‘Yes, and’ of the digital age. The system breaks, and the human has to say, ‘Yes, the system is broken, and I am here to make you feel okay about it.’
The Product is Stability
We pretend that technology has made life easier, but in many ways, it has just shifted the burden of maintenance. In the past, if a machine broke, you called a mechanic. Now, when a digital process fails, we expect a customer service representative to perform a psychological miracle. They aren’t fixing the code; they are fixing the user’s blood pressure. The product being sold isn’t the software or the shipping or the inventory management. The product is the illusion of stability. When that illusion cracks, the human employees are the ones who have to hold the pieces together with their bare hands, metaphorically speaking.
I once miscalculated the weight of a vintage sign I was helping August move. I thought it was 56 pounds; it was closer to 126. Because I had the wrong expectation, I didn’t brace myself correctly, and I nearly threw out my back. Digital work is like that constantly. You expect a smooth workflow, but you are hit with the weight of a thousand system errors. This is particularly visible in high-velocity sectors like Push Store, where the speed of the transaction is everything. In environments where customers expect instant gratification, whether it’s a gaming credit, a retail delivery, or a software update, the pressure on the human buffer is magnified. If the delivery isn’t instant, the frustration is immediate, and the agent is the only one standing in the way of a complete brand meltdown.
The Thinning of the Soul
This labor is largely invisible because it is meant to be. If you see the effort the agent is making to stay calm, the illusion of the seamless digital experience is ruined. We want our apps to feel like magic, which means the humans behind them have to act like they aren’t working at all. They have to be ghosts in the machine, except ghosts who can take a verbal beating and still sign off with ‘Is there anything else I can help you with today?’ It’s a bizarre contradiction: we are automating everything to remove human error, but we are relying more than ever on human emotional labor to fix the errors that automation creates.
I suspect we are reaching a breaking point. You can only ask a human to absorb so many shocks before the gasket fails. We see it in the ‘Quiet Quitting’ trends, the high turnover rates in call centers, and the general sense of burnout that seems to be the default setting for anyone with a LinkedIn profile. We are treating human empathy like a renewable resource that can be mined indefinitely to cover up for bad engineering. But empathy isn’t a mineral; it’s a muscle, and like any muscle, it can be torn if it’s forced to lift too much weight without rest.
High volume (constant system demands) feeds emotional labor (surface and deep acting), which feeds burnout (the thinning of the soul).
The Honesty of Physical Labor
August doesn’t have this problem, or at least, his version of it is different. When a sign is broken, it stays broken until he fixes it. He doesn’t have to apologize to the sign for it being broken. He doesn’t have to pretend the sign isn’t flickering while he’s standing right in front of it. There is an honesty to physical labor that digital service labor lacks. In the digital world, we are often asked to lie-not big, malicious lies, but small, systemic ones. We have to say ‘the system is experiencing high volume’ when we know the system is actually just poorly designed. We have to say ‘we value your business’ when the company’s policies clearly prioritize profit over people.
These small lies add up. They create a dissonance between what we see on the screen and what we feel in our gut. It’s like keeping that jar of onions. You know it’s bad, you know it doesn’t belong there, but you keep it anyway because the alternative-confronting the mess and actually cleaning it out-seems too overwhelming. So you just keep moving it from the front of the shelf to the back, hoping you won’t have to deal with it today.
What happens when we can no longer buffer the shocks? What happens when the Claras of the world finally decide they don’t want to be gaskets anymore? We are already seeing the cracks. AI is being touted as the solution-bots that can handle the angry Gregs of the world without getting their feelings hurt. But a bot cannot provide genuine empathy. A bot cannot perform the ‘deep acting’ required to truly de-escalate a situation. It can only simulate it, and humans are surprisingly good at smelling simulated empathy. It tastes like those 2016 onions: vaguely metallic and entirely wrong.
Respecting Human Constraints
Perhaps the solution isn’t better bots, but better systems. Maybe we should stop prioritizing speed at the expense of stability. Maybe we should accept that 96% uptime with a human who isn’t burned out is better than 99.9% uptime that requires an army of miserable people to maintain the facade. But that would require a shift in how we value labor and how we perceive the role of technology in our lives. It would mean admitting that the machine isn’t the star of the show; the people keeping it from falling apart are.
August finished scraping the sign. He looked at the bare glass, clear and fragile. He’ll fill it with gas, seal it, and it will glow again, probably for another 46 years. It works because he respects the limits of the material. He doesn’t ask the glass to be steel. He doesn’t ask the neon to be a laser. He works within the constraints of the physical world. We would do well to start respecting the constraints of the human mind in the digital one. We are not designed to be gaskets. We are designed to be the light.