The lump of cognition fallacy
One of the most common bad ways people think about economics is the lump of labor fallacy: the idea that there is a fixed, finite amount of work to do in an economy. The reason this is wrong is simple: doing things leads to more things to do. If you open a car factory, that's going to generate demand for more mechanics. Mechanics generate demand for more tools and supplies. As an economy grows, there are more things to do, not fewer. The American economy today has way, way more demand for very specific tasks than it used to. People concerned about immigrants taking jobs sometimes imply that they believe there is a fixed, unchanging number of jobs in America, and that immigrants doing jobs will not lead to additional demand for more work. They're falling prey to the fallacy.
Louis Anslow, who runs the Pessimists Archive, coined a useful extension of this idea, the lump of cognition fallacy: the idea that there is a fixed amount of thinking to do. I see this come up a lot in AI discourse. Like the lump of labor fallacy, the reason it's wrong is simple: thinking often leads to more things to think about.
I was recently sent a funny article in the Guardian about how the author isn't dating anyone who uses chatbots, in part because they're bad for the environment. Articles like this generate a few easy clicks for my posts. But I was struck by the author's broader complaint, shared by many of the people she interviewed, that AI somehow causes people to think less throughout the day, that it's literally always intellectually lazy to use. Take this quote:
Pereira thinks that using ChatGPT "shows such a laziness".
"It's like you can't think for yourself, and you have to rely on an app for that."
The article lists a few things that I agree are bad to use ChatGPT for, like writing messages on dating apps. But the author extends this to strongly imply that using ChatGPT at all is causing people to think less, because any cognition the chatbot performs leaves the user with fewer thoughts to think. I would brush this off as silly rage bait if I weren’t also so regularly bumping into the same idea elsewhere. I’ve written about this a bit before, but really want to get across just how ...