The power of LLMs and AI tools is precisely that they remove the need to solve. They let you go straight from request to output, no friction necessary. Whether that’s a good replacement for human work comes down to whether the teams who use these tools understand the value that comes with that friction.
The friction from the act of creation creates heat that radiates in every direction. Solving something challenging forces you to understand the problem in a deeper way. It reveals paths from what you thought you wanted to what you really need. It makes you constantly generate and weigh different options, and there's as much insight in the choices you discard as in the ones you pursue. At the end, you've not only worked on your creation, but the creation has worked on you. You understand its inner mechanisms in an intimate way — whether that's code, designs, or a written argument — that lets you see that work through a more complex lens.
This is what I think about whenever I see a blog post on whether or not LLMs will replace everyone’s job sometime in the next six months or six years. Not how human workers stack up against text boxes that return whatever you ask them for, but what is gained and lost in the world of solutions without friction.
Is there value in more code that's understood less? In more writing with fewer ideas? In more products less aligned to their users?
I’m not a Luddite or hater here. I’ve gotten a decent boost to my programming efficiency using LLMs as a companion in my process. They can usually create pretty good explanations and code samples when I’m trying to pull off some complex trick in a popular framework like React, and can sometimes give me reasonable debugging ideas when code is malfunctioning.
The assumption I challenge isn't that LLMs can be useful tools — especially when augmenting and accelerating work. I'm not even challenging that, today or in the near future, teams will be asking AI tools for some of the things they ask people for today.
I question whether the things they ask for will be what they really need. Whether teams will realize that what they didn’t ask for might be more important than they thought. When everything that comes with solving is lost, will they feel the gap it leaves?
Another angle that's rattled around my mind: it's easy to see the value in outputs, the nouns. It's much easier to overlook the value hiding in the verbs:
The value not in code, but in coding.
The value not in words, but in writing.
The value not in designs, but in designing.
It's the one place where, by definition, you can't automate away the effort: the act of putting in effort itself. AI tools may provide a new way to get solutions, but it's in the solving that humans can't be replaced.