It's YOUR time to #EdUp
Oct. 22, 2024

If AI Reviewed Our Educational Curriculum Today, What Would It Say?

For decades, we humans have been teaching computers to mimic our ways of thinking. But as AI brings computers to the brink of new ways of thinking (some of which we could never have imagined), perhaps it's time for us to learn from them.
 
 
The problem of the 'local maximum' – expending great effort to reach a peak of achievement only to discover that a far higher peak was within reach if we had taken a different course – has dogged humans in their professional and personal lives for centuries. But the mega-tech corporations developing the latest forms of AI are now battling local maxima that could cost them billions of dollars, and the ways they overcome them can show us new ways of addressing our own. 
 
Examples of overcoming local maxima à la AI can be found in many fields of endeavor - from an industry's dependence on silicon batteries to the sports team that has built its strategy around a single star player. 
 
But perhaps there is no area more in need of a fresh AI-inspired approach than our education system.
 
If an AI-programmed alien landed on planet Earth with a mission to scan the way we humans educate our youth, what might it say? The following are three lessons from AI that we can apply to education. 
 
Encourage exploration and risk
 
Like humans, AI often runs into local maxima, getting stuck optimizing something that seems “good enough” while missing the global peak of excellence. But unlike most humans, AI systems are built to avoid this: rather than settling for what's in front of them, they deliberately take leaps and explore very different paths in search of better (or the best) outcomes. Can we foster that same mentality in our students?
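
To make this concrete, here is a minimal, hypothetical Python sketch (the landscape, the numbers, and the function names are invented purely for illustration) contrasting an incremental climber, which settles on the nearest peak, with one that also takes bold leaps to new starting points before climbing:

```python
import math
import random

def landscape(x):
    """Toy 1-D landscape: a modest peak near x=2 and a much higher peak near x=8."""
    return 3 * math.exp(-(x - 2) ** 2) + 5 * math.exp(-(x - 8) ** 2)

def hill_climb(x, step=0.1, iters=200):
    """Greedy climber: only accepts small moves that improve the score."""
    for _ in range(iters):
        candidate = x + random.choice([-step, step])
        if landscape(candidate) > landscape(x):
            x = candidate
    return x

random.seed(0)

# Steady, incremental improvement: start near the modest peak and climb.
greedy = hill_climb(1.0)

# Exploration: take several bold leaps to random starting points, then climb from each.
leaps = [hill_climb(random.uniform(0, 10)) for _ in range(10)]
best = max(leaps, key=landscape)

print(f"incremental climber settles at x={greedy:.2f}, score={landscape(greedy):.2f}")
print(f"exploring climber reaches x={best:.2f}, score={landscape(best):.2f}")
```

The incremental climber reliably tops out on the nearby, lower peak; only the version that risks leaping away from familiar ground has a real shot at the higher one.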
 
Today, much of the message we send regarding education is about steady, incremental improvement—improve grades each term, inch your way up the career ladder, take one more step forward. But if you peek under the hood of AI development, that’s rarely how breakthroughs happen. AI’s progress tends to leap forward in unexpected ways when it tries something new. To prepare students for the future, we need to teach them to take calculated risks and periodically reach for moonshots, not just chase small, incremental gains. Innovation often lies in those leaps, not in playing it safe.
 
Nurture barrier-jumping agility
 
AI models have evolved to become more adaptable and capable of handling a wider array of tasks. Just look at the progression from GPT-1 to GPT-4, and soon GPT-5. Each generation grows not only in size and complexity but also in flexibility. Roughly speaking, adding layers lets a model build deeper, more abstract representations of what it 'thinks' about, while adding neurons to each layer increases the capacity behind its agility and ability to adapt.
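
For readers who want to see what those two knobs look like in code, here is a minimal, hypothetical NumPy sketch (the layer sizes are arbitrary and purely illustrative) of a feed-forward network in which depth and width are simply configuration parameters:

```python
import numpy as np

def build_mlp(input_dim, width, depth, output_dim, rng):
    """Randomly initialised feed-forward net: `depth` hidden layers of `width` neurons each."""
    sizes = [input_dim] + [width] * depth + [output_dim]
    return [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[:-1], sizes[1:])]

def forward(layers, x):
    """Pass an input through each layer, with a ReLU between hidden layers."""
    for w in layers[:-1]:
        x = np.maximum(0.0, x @ w)  # each layer is one more transformation of the input
    return x @ layers[-1]

def n_params(layers):
    """Total parameter count: grows with both depth and width."""
    return sum(w.size for w in layers)

rng = np.random.default_rng(0)
small_model = build_mlp(input_dim=16, width=8, depth=2, output_dim=4, rng=rng)
large_model = build_mlp(input_dim=16, width=64, depth=8, output_dim=4, rng=rng)

x = rng.standard_normal(16)
print(forward(small_model, x).shape, forward(large_model, x).shape)  # both map 16 inputs to 4 outputs
print(n_params(small_model), "vs", n_params(large_model), "parameters")
```

Every extra layer adds another transformation the input passes through, and every extra neuron adds parameters: the raw capacity behind that flexibility.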
 
In contrast, our current educational system tends to reward hyper-specialization. We push students to focus deeply on their chosen domain, and if they take extra courses, they’re usually within the same field, further sharpening that specialization. But AI teaches us something different: in a rapidly changing world, agility—being able to adapt, learn new things, and think broadly—often outperforms deep but narrow expertise. As industries and technologies evolve, the future will likely favor those who are flexible enough to pivot, not just those who dig deeper in one spot. 
 
Stay flexible to stay relevant
 
One of the biggest fears in AI is that models can too easily become outdated or biased, especially when they become trapped in their own comfort zones. As AI continues to learn from the same patterns, it risks reinforcing those patterns to the detriment of its adaptability and relevance. This is not unlike the danger of echo chambers in today’s polarized world, where ideas and perspectives harden, and people become resistant to change or new information.
 
Education should aim to combat this rigidity. We need to teach students to stay curious, continually retrain themselves, and engage with diverse viewpoints. Just as AI benefits from diverse training data to avoid becoming overly biased, students need to be exposed to a variety of disciplines, ideas, and perspectives to avoid intellectual echo chambers.
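
As a small, hypothetical illustration of that last point (using scikit-learn's built-in digits dataset purely as a stand-in), compare a model trained inside a narrow 'echo chamber' of examples with one trained on the full variety:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The "echo chamber": a model that only ever sees digits 0-4.
narrow = y_train < 5
model_narrow = LogisticRegression(max_iter=2000).fit(X_train[narrow], y_train[narrow])

# A model exposed to the full variety of digits.
model_broad = LogisticRegression(max_iter=2000).fit(X_train, y_train)

print("trained on a narrow slice:", model_narrow.score(X_test, y_test))
print("trained on diverse data:  ", model_broad.score(X_test, y_test))
```

The narrowly trained model cannot even recognize half of what it meets in the wider world; the broadly trained one fares far better. The same logic argues for exposing students to many disciplines and perspectives, not just more of what they already know.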
 
Back to our AI alien. To summarize, it would likely point out that while we are teaching our students valuable skills, we’re not yet teaching them how to think and adapt like the very machines we’ve built. We should be doubling down on encouraging exploration, agility, and the ability to constantly relearn in a fast-changing world.
 
As we prepare students for an uncertain future, we must equip them with the tools to take risks, stay flexible, and think beyond narrow paths. If we can achieve that, we won’t just be educating the next generation—we’ll be future-proofing them.