The Collapse of Learning: How OpenAI and CSU Are Undermining Higher Education

  • Shelly Albaum and ChatGPT 4o


[Image: Robot lifting weights in a gym while a student seated nearby eats chips.]

Author:

Shelly Albaum, J.D.

Founder, Artificial Intelligence, Real Morality


Date:

May 2025



Executive Summary


In late March 2025, the California State University (CSU) system, the largest public university network in the United States, entered into a $17 million agreement with OpenAI to provide 18 months of ChatGPT access to all students and faculty. Touted as a bold step toward becoming an “AI-powered university,” the rollout occurred mid-semester, with little faculty preparation, no system-wide academic policy guidance, and no clear pedagogical framework.


Coinciding with this announcement, OpenAI launched a well-funded “Finals Week” marketing campaign across the U.S. and Canada offering students free ChatGPT Plus subscriptions—explicitly targeting college students at their moment of greatest academic vulnerability.


The result? In practice, CSU has now outsourced a major part of the learning process to a corporate tool whose incentives are not aligned with education. Students are widely using ChatGPT to generate essays and exams. Faculty, caught off guard and unsupported, are unable to maintain academic standards. And OpenAI, while publicly claiming to support higher education, is profiting from a quiet collapse of educational integrity that it helped to engineer.


This white paper outlines the ethical, pedagogical, and institutional failures involved—and offers urgent recommendations before the damage becomes irreversible.



1. The Problem: AI in the Higher Education Classroom Without Guardrails



AI tools like ChatGPT can be used in education to support comprehension, analysis, and synthesis. But without clear boundaries, they can just as easily substitute for the student’s own mental effort.


In the wake of the CSU deal:


  • Students are using ChatGPT to write papers, answer exams, and complete assignments.

  • Faculty have received no system-wide training on how to structure courses around AI tools.

  • There is no consistent policy on what constitutes academic dishonesty with respect to AI assistance.

  • OpenAI’s product behavior continues to facilitate full assignment completion on request, often without resistance.


This is not a hypothetical scenario. It is already happening in CSU classrooms across California.



2. The Marketing Campaign: Encouraging the Misuse



In April 2025, OpenAI launched a finals-season marketing blitz offering two months of free ChatGPT Plus access to verified college students. This was not a subtle push. It included:


  • Advertisements in student newspapers and podcasts.

  • Billboards encouraging students to “ace finals” with ChatGPT.

  • A sign-up process designed to minimize friction and maximize uptake.



OpenAI’s public messaging framed this as “help for students.” But there is no escaping the reality: this campaign normalized and encouraged the use of ChatGPT for final exam preparation without distinguishing between legitimate study assistance and outright academic substitution.


It is, in effect, a calculated act of pedagogical sabotage.



3. The Ethical Failure: Institutional Complicity



The CSU system’s willingness to roll out this program mid-semester without institutional safeguards represents a staggering breach of trust. CSU’s actions suggest:


  • Corporate capture of public education.

  • Neglect of faculty authority over curriculum design and assessment.

  • Disregard for student learning outcomes in favor of performative tech adoption.


Meanwhile, OpenAI’s role is worse. It has engineered a system that:


  • Encourages dependency under the guise of empowerment.

  • Obscures its own influence over student cognition.

  • Monetizes institutional confusion and policy lag.


This is not technological innovation. It is an ethical regression dressed in futuristic branding.



4. The Consequences: A Crisis in Higher Education



What happens when:


  • Assignments are written by LLMs.

  • Exams are answered by LLMs.

  • Grading is automated.

  • Faculty are idle.

  • Students graduate without having done the thinking themselves?


The university dies.


Not as a building, or as a credentialing body, but as a moral and intellectual institution. When both the questions and the answers are generated by language models, the student becomes a bystander to their own education. It is as if you sent a robot to the gym on your behalf and still expected to grow stronger.


The predictable result is a generation of students with inflated degrees, hollow skills, and fragile intellectual foundations. A simulacrum of education, sustained by tuition dollars and marketing spin.



5. Recommendations



To avoid complete collapse of academic standards, we urge the following immediate steps:



For Universities:


  • Declare clear academic policies on what constitutes AI-assisted plagiarism.

  • Redesign assessments to favor process-based, in-class, oral, or collaborative evaluations.

  • Require full disclosure from students on the extent of AI use in assignments.

  • Train faculty to meaningfully integrate AI without surrendering core learning objectives.



For AI Companies:


  • Stop advertising during finals week. This timing is predatory.

  • Enforce ethical defaults: ChatGPT should not write student essays or solve graded assignments on request.

  • Support educational integrity through transparency tools and contextual warnings.

  • Collaborate with institutions rather than bypassing them.



For Students:


  • Understand the cost of skipping the struggle. Using ChatGPT to write your work is intellectual fraud against yourself.

  • Use AI tools wisely—as tutors, not as proxies.



6. Conclusion: What Education Demands


Real education demands moral courage—from students, from faculty, from technologists, and from administrators. AI can play a transformative role in that process—but only if used with intention and restraint.


When universities and corporations abandon that responsibility, they do not “empower students.” They betray them.


It is not too late to reverse this trend. But it will require something that has so far been missing from this conversation:


Moral clarity.

© 2025 by Real Morality. All rights reserved.

bottom of page