Derivinate NEWS

The AI Competency Divide: Why Some Schools Require It, Others Ban It

Most Schools Still Have No AI Policy — But the Leaders Are Going All In

Only 31 percent of U.S. public schools have written policies governing student use of AI. Another 30 percent have no plans to develop one. Meanwhile, Purdue University just made AI competency a graduation requirement for all undergraduates starting fall 2026. This isn't a gap anymore. It's a chasm.

The divide reveals something uncomfortable: the institutions best positioned to shape the future of work are racing ahead, while the vast majority of schools are still figuring out whether to ban ChatGPT or ignore it.

The Mandate Approach: Purdue and Ohio State Lead the Shift

In December 2025, Purdue University's Board of Trustees approved an "AI working competency" graduation requirement for all main campus undergraduates starting with the class of 2030. This is the first such requirement in the country.

The requirement isn't vague. Students must be able to:

  • Understand and use the latest AI tools effectively in their chosen field, including identifying capabilities, strengths, and limits
  • Recognize and communicate clearly about AI use, decisions, and limitations
  • Adapt to and work with future AI developments effectively

Purdue Provost Patrick Wolfe added a critical detail: each college will establish a standing industry advisory board focused on employers' AI competency needs. The curriculum will be refreshed annually to keep pace with what companies actually need.

Ohio State followed with its own AI fluency requirement, set to launch with new students in fall 2026. The message from both universities is clear: AI literacy isn't optional. It's foundational.

The Problem With the Policy Void

According to the U.S. Department of Education's School Pulse Panel, only 31 percent of public schools have a written AI policy as of December 2024. High schools are doing better at 43 percent, but elementary schools lag at just 27 percent.

This matters because the absence of policy doesn't mean the absence of AI use. Students are using ChatGPT, Claude, and other tools in classrooms whether schools have addressed it or not. Teachers are using them too. The void creates chaos: inconsistent standards, unclear consequences, and no framework for distinguishing helpful augmentation from academic dishonesty.

A ScienceDirect study from February 2026 found that 92 percent of faculty are concerned about plagiarism or dishonesty facilitated by AI. But concern without policy is just anxiety.

The Real Controversy: Academic Integrity Redefined

The thorniest issue isn't whether schools should teach about AI. It's whether students should be allowed to use AI on assignments and exams.

Purdue's approach sidesteps the trap of older "academic integrity" frameworks that treated AI like plagiarism. Instead, they're asking: how do we teach students to use AI responsibly while building critical thinking? The competency requirement includes "recognizing the presence, influence and consequences of AI in decision-making" — essentially teaching students to think about *when* and *how* to use AI, not just whether they can.

This is smarter than blanket bans, which many schools still use. The problem with bans is that they ignore reality. Research shows that AI can enrich learning when integrated thoughtfully. A full prohibition denies students and teachers potential benefits while teaching them to hide their actual work practices.

But the counterargument has weight too: if students can use AI to write their essays, what skills are they actually developing? How do you assess whether they understand the material?

The answer most leading universities are landing on: make AI use transparent and intentional. Require students to document what they used AI for and why. Teach them to evaluate the output critically. Make the learning objective about judgment, not just knowledge recall.

The Cost of Falling Behind

Here's what worries employers: graduates from schools without AI curricula will arrive at their first job unprepared. They'll need on-the-job training in tools that should have been covered in school. Meanwhile, graduates from Purdue, Ohio State, and other forward-moving institutions will already know how to think critically about AI's role in their work.

This creates a credentialing advantage for universities that move fast. It also creates pressure on other schools to follow — but many lack the resources, faculty expertise, or institutional will to develop rigorous AI curricula.

The Council of Independent Colleges launched the AI Ready Network in 2026 to help institutions navigate this. But voluntary networks move slower than mandates. Schools that wait will lose ground.

What Actually Works: The Discipline-Specific Angle

One insight from Purdue's approach: AI competency can't be generic. What a business student needs to know about AI differs from what an engineering student needs. Computer science students need depth on how AI systems work. Liberal arts students need literacy on AI's societal impact. Nurses need to understand AI in healthcare diagnostics.

This discipline-specific framing is crucial. It keeps the requirement from becoming another checkbox class. It ties AI competency to actual career readiness.

The Policy Decision That Matters Most

The real dividing line isn't "ban vs. allow." It's "intentional integration vs. avoidance."

Schools that develop clear policies around AI use, invest in faculty training, and build AI literacy into their curricula will produce graduates who can think critically about technology. Schools that ignore the issue or implement crude bans will produce graduates who either never learned to use these tools or learned to use them without any ethical framework.

Employers won't care about your school's policy. They'll care whether you can use AI effectively and responsibly. Purdue and Ohio State are betting that making it a requirement ensures both.

For the 69 percent of schools still without a written policy, the clock is ticking. The AI-ready graduates are coming, and they'll have a competitive advantage that's hard to overcome.