Assistants, Copilots… and Now What?
The leap from copilots to independent AI is happening fast. Business schools still think it’s about cheating.
First, students chatted with ChatGPT. Then they asked it to outline their essays. Now, they’re testing the first wave of AI agents capable of planning, researching, and writing entire group projects—solo. In real time. While the group sleeps in. Tools like Kore, Manus, Genspark, and, just recently, ChatGPT Agent aren’t assisting. They’re replacing. Not hypothetically. Practically.
And this doesn’t just affect group work. It changes everything—research papers, strategy reports, marketing plans, even personal reflections. Any task with a prompt can now be automated, enhanced, and submitted—polished and punctual—without the student ever lifting a cognitive finger.
And what are most schools focused on? Whether it’s “cheating.” They’re missing the plot.
We’re entering a phase where tools don’t just assist—they replace. Not the student (yet), but the need for students to think, plan, or organize on their own. Prompt the agent. Walk away. Get your homework done while you watch Netflix.
This isn’t a future scenario. It’s happening now. And most schools have no plan for when learning becomes optional and effort becomes outsourceable. Their answer? Pretend it’s not happening.
This moment isn’t about banning ChatGPT or others. It’s about building a system that demands judgment—not just outputs. If we don’t push students to challenge their AI partners—probe, refine, dispute, reject—they won’t become sharper thinkers. Just lazier operators. Because “partnering with AI” without friction isn’t collaboration. It’s dependency.
I don’t have all the answers. But I’m not pretending. I’ve integrated AI into teamwork by having students name their LLM a “teammate,” track its role and outputs, and write peer feedback for it—just like they would for a human. Then we debate the results. Not perfect. But it forces thinking.
Why? I’m trying to help them take responsibility for their choices—because it’s only going to get more complicated with AI. And at the end of the day, someone still has to own the outcome. That’s my new role.
Yes, some will mourn the death of craft—students no longer researching, discussing, and writing the hard way. Like musicians not tuning by ear, or illustrators no longer perfecting each stroke. But that’s not the future we’re walking into. Creativity isn’t dying. It’s being reallocated. AI isn’t here to make us do less. It’s pushing us to focus on what actually matters: sharper intent, better judgment, smarter questions. The challenge isn’t learning AI. It’s figuring out what the hell you’re trying to do with it.
There are real concerns about where AI is taking us. And rightly so. But if we’re intentional—if we ask better things of our students, and of ourselves—the future might be brighter than we think. For them, and for us.
AI isn’t the end of education. But it will expose how much of it was performative all along.
We’re the ones designing what comes next. But we’d better move. Because we’re not adapting fast enough—and we’re running out of excuses.