By Martha Lincoln, Martha Kenney Feb 13, 2025 (SFChronicle.com)

The California State University system plans to integrate artificial intelligence technology into training, teaching and learning at Cal State East Bay in Hayward and its 22 other campuses. (Brontë Wittpenn/The Chronicle 2024)
Last week, the California State University system announced a landmark initiative to make it “the nation’s first and largest AI-powered public university system.”
The undertaking — a first-of-its-kind public-private partnership between the university system, the governor’s office and a roster of influential tech companies, including Alphabet, Nvidia and OpenAI — aims to integrate artificial intelligence technology into training, teaching and learning at all 23 CSU campuses.
Public university students certainly deserve innovative opportunities in their education. But as professors in the CSU system — and researchers on a National Science Foundation grant that includes funds for the study of AI in society — we have misgivings about this initiative.
The details of the AI university are murky. It is not clear how artificial intelligence will be integrated into classrooms, monitored or evaluated. The mission of CSU’s new AI Workforce Acceleration Board is also thinly described: Though the program aims to create “a pipeline of AI-skilled graduates” and provide internship opportunities to CSU students, no specific metrics or benchmarks are stated. The board is composed solely of officers at technology corporations; it does not, thus far, include roles for students or faculty to provide input.
Another unknown is how much the CSU administration will spend on this program. A recent report suggests that the 18-month contract to provide ChatGPT to faculty, staff and students will cost almost $17 million. Yet the system also faces proposed budget cuts of almost $400 million and has laid off faculty and staff on multiple campuses this year. At this moment of crisis in the CSU system, it is worth asking whether investment in AI should take precedence over investment in people. It is conceivable that AI will replace faculty and staff — including advisers, tutors and counselors who provide critical services to CSU students.
Even in the absence of these details, it is not clear that generative AI tools will benefit CSU students significantly or help faculty serve them more effectively. The abilities of generative artificial intelligence are routinely overstated — a phenomenon that computer scientists Arvind Narayanan and Sayash Kapoor describe as “AI snake oil.” While generative AI can create what one group of authors has dubbed “reliable sounding language,” it is not capable of independently evaluating the truth or ethical content of a claim.
Beyond being overhyped, artificial intelligence applications are also prone to errors. AI tools predictably “hallucinate” — generating outputs that are fabricated and, in some cases, even violent or racist. Further, generative AI is widely associated with dishonesty, cheating and fraud. Its model for text generation, dependent on web scraping, is intrinsically disrespectful of intellectual property rights. OpenAI — one of the companies participating in the CSU partnership — was sued by the New York Times over the unauthorized use of news articles as training data. A new report shows that Meta pirated millions of books to train its large language model Llama. (Meta has claimed this was fair use.)
Generative AI is a particularly poor fit for university settings, where we teach students foundational skills like reading, writing and critical thinking. Outsourcing assignments to generative AI is intellectual dishonesty and robs students of the opportunity to learn these skills themselves. The extensive provision of AI tools will result in asking less of CSU students — de-emphasizing authentic learning and preparing them for less demanding, de-skilled roles in the workforce.
We already know that AI has been used in applications that inflict social harm, especially when adopted without regulation. For example, as ProPublica recently reported, AI has been deployed in the health insurance industry, where proprietary algorithms have been used to deny subscribers coverage for life-saving medical procedures. (Notably, this practice was recently banned in California.)
There are also myriad privacy concerns surrounding artificial intelligence, as recent reports on the new Chinese AI product DeepSeek suggest. It is not clear how the data that students generate in CSU’s AI university will be used or how their privacy will be protected.
The introduction of AI in higher education is essentially an unregulated experiment. Why should our students be the guinea pigs?
The CSU administration has claimed that using AI will prepare students to join the AI workforce. This could be a positive outcome for some graduates. Yet university education is not just job training. A university education should provide students with the skills they need to confront a complex and rapidly changing world — one in which, given our murky information ecosystem, the truth can be hard to discern. Artificial intelligence products have already shown their capacity to spread misinformation and disinformation, mislead the public and undermine democratic processes. Recent research suggests that the use of generative AI is associated with weaker critical thinking skills.
We see the AI university undertaking, at least in its present form, as antithetical to CSU’s mission — one pillar of which is “to prepare significant numbers of educated, responsible people to contribute to California’s schools, economy, culture, and future.” CSU students need to develop the skills of critical thinking, independent thought and respect for difference. Even amid austerity, this requires a well-funded, well-staffed university that invests in people and capitalizes on its existing strengths. For better or for worse, the work of creating educated, responsible people at CSU cannot be automated.
Martha Lincoln is an associate professor of cultural and medical anthropology at San Francisco State University. Martha Kenney is an associate professor and chair of the Department of Women and Gender Studies at San Francisco State University.