Is AI making us dumber (and is that a problem?)
The addition of AI to our everyday lives will likely require us to redefine what it is to be intelligent. In the meantime, how do we use it in a way that doesn't do more harm than good?
Most of you, looking at the title of this piece, are probably thinking: well yeah, obviously it's a problem (and AI probably is making us dumber).
While I don’t necessarily disagree, I think it’s important to have a deeper conversation about intelligence, responsible use of AI, and how these ideas will look different in the future.
You might have seen the recent studies (notably one from MIT) circulating about how consistent use of ChatGPT lowers our cognitive performance on basic “neural, linguistic, and behavioral” levels.
While we should note that this study had a small sample size and has yet to be peer reviewed, it's not a stretch to believe the results.
I have reflected many times on how grateful I am that I went through school before LLMs were available for public use - it would have taken away a lot of the struggle, and as a result, the joy of accomplishment through learning. That’s probably why I’m writing this newsletter straight from my brain instead of asking ChatGPT to generate it for me.
There’s a chance you (reader) wouldn’t even notice the difference in the result - but I would. I learn by writing, by creating, and by going through the mental struggle of parsing the information I have learned into a format that I think could benefit the reader.
But that is not to say that I don’t use ChatGPT (or other LLMs) for other tasks - after all, this whole newsletter is about the potential that AI has to improve our lives when implemented properly, something I firmly believe in.
So how do we know what tasks to outsource, what to hold on to, and where the benefits of more time and headspace (resulting from AI usage) outweigh the cost? If we are moving towards a world run by AI, does it even matter if we have strong cognitive and linguistic skills? Where would those skills even be needed?
For the record, no part of me actually thinks that we don’t need those skills, but it is still important to consider the questions that feel absurd. We are hurtling towards the greatest technological, societal, and fundamental belief upheaval in all of human history, and we cannot assume that the (very near) future will operate under any sort of parameters that feel familiar.
Let’s say I need a block of code written, and I don’t know how to write code. Is it reasonable to expect me to go through the entire process of learning how to code so I can generate that myself, or is it ok to just ask ChatGPT to create the code for me in a matter of seconds?
Not going through the process of learning to code frees me up to do many other things (like writing my own content, for example). But it also means that I don’t get the benefit of learning new information and going through a lot of problem-solving.
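To make the example concrete, here's the kind of small snippet a non-programmer might ask an LLM to generate in seconds - a hypothetical illustration (the function name and task are my own invention, not from any particular tool):

```python
from collections import Counter

def top_words(text: str, n: int = 3) -> list[tuple[str, int]]:
    """Return the n most common words in a piece of text."""
    words = text.lower().split()          # naive whitespace split, no punctuation handling
    return Counter(words).most_common(n)  # Counter does the tallying for us

# A non-coder gets a working answer instantly...
print(top_words("the cat sat on the mat the end"))
```

...but never learns what a `Counter` is, why lowercasing matters, or how they'd handle punctuation - exactly the trade-off described above.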
Like anything else, there is a trade-off - and nothing is black and white.
This isn’t a new phenomenon
This trend of outsourcing our brains to machines is happening with everything. EVERYTHING is getting smarter and taking more of the responsibility away from you. Your cars, apps, workout machines, even kitchen appliances - almost nothing is untouched. It has been a slow creep, and we have been aware of it (e.g. the whole brainrot culture of social media).
But it’s also brought an enormous amount of good: freed up time for more meaningful activities, and allowed us to improve our quality of life in a lot of areas.
Are we worse at things like sewing, building stuff, and navigating? Definitely. But should we stay stuck in the past and refuse to progress for fear of losing skills? I don’t think so.
The important skills in society at any given time are largely governed by the technology available.
For example - a lot of young people do not know how to write a check - but does that matter if they understand how to pay for something with cryptocurrency?
Cryptocurrency is much more likely to be in common use in their lifetimes. Are they dumb for not knowing how to write a check? Or smart for not spending time learning information that is largely useless to them?
Conversely, if you do know how to write a check, but do not know how to send cryptocurrency, does that make you more intelligent for knowing the one skill, or less intelligent for not knowing the other?
Should “intelligence” in a world governed by AI not include a measure of our ability to utilize said AI to make our lives better/easier/etc?
I’m aware that I’ve just asked about 15 questions in a row - and that’s precisely to show how much we have to figure out with this tech revolution.
Something I really like about the AI tools we have today is how they can drastically reduce the time between idea and action.
Starting a business, coming up with a name for a project, making a study plan - there are countless examples of AI helping us move through tasks that often cause analysis paralysis and get closer to the things that actually matter: action. How is this not a good thing?
Closing thoughts
I’m not pretending to know the answers, but I know that those who thrive in the coming years will be those able to adapt to the tools we have available (and the outsized results they will make possible).
As for protecting your innate intelligence and not becoming lazy, I believe that will rely on remaining conscious of our actions and using discipline to keep elements of problem solving, creation, and ideation in our lives (outside of an AI tool) - similar to how we go to the gym to stay in shape now that most of us no longer have to hunt, gather, and walk everywhere (but would still like our bodies to be in good working condition).
Hard work for the sake of hard work isn’t the goal - but losing our humanity definitely isn’t either.
Let me know what you think - how do you use AI in a way that lets you keep your own critical thinking skills?