The Biggest Debate Since the Calculator
“Move fast and break things” is a motto coined by Meta CEO and founder Mark Zuckerberg to describe the contemporary approach in which software developers push to implement new features as fast as possible to stay ahead of the curve, despite the potential consequences of moving too quickly. Since large language models came onto the global scene in 2023, the race to develop the most advanced artificial intelligence models and integrate them into our daily lives has extended to almost every industry, including higher education. The College of William and Mary’s School of Computing, Data Sciences, and Physics, which opened this fall, has adopted a similar techno-optimist approach.
The school’s philosophy, according to Dean of the School of Computing, Data Sciences, and Physics Douglas Schmidt ’84, M.A. ’86, is that higher education should not reflexively restrict the use of AI; instead, AI should supplement how students learn and complete tasks, even amid worries that leaning on AI as a crutch could inhibit students’ ability to learn fundamental concepts in their disciplines. Data Science Department Chair Dan Runfola announced in an Aug. 22 email that the department is shifting from “programming” to “programming with generative tools.” Course curricula have already started reflecting this change: Applied Machine Learning, widely known as DATA 305, has officially been renamed Problem Solving with Generative AI.
In an interview with Flat Hat Magazine, Schmidt explained the CDSP school’s tech-first approach in further detail. He shared his belief that fundamental coding skills have been relegated to the “syntactic” realm in the AI era. In other words, they remain the basic building blocks of knowledge, but they are no longer the richest knowledge for students to focus on acquiring.
“I no longer have to be aware of some of the low-level syntactic quirks of Java, or C++, or Python,” Schmidt said. “I can work at a level that's much more intentional. I want a program that will do X, Y, Z. So it's not about forgetting, it's about no longer needing to know.”
According to some faculty of the CDSP school, generative AI has now democratized programming to such an extent that anyone can build software tools with some guidance. Runfola spoke about the potential merits of this approach, such as making creative software development more accessible across disciplines.
“We had a government student that had never coded before in his life, like he literally had never touched Python before,” Runfola said. “And with almost no guidance, we were able to give him some quick tips and tricks to build an entire website from generative AI techniques. And so we're really leaning very heavily on this and how quickly we can teach students so they can get their ideas executed without the normal slog of four years of learning how to be a software engineer.”
Schmidt clarified that the CDSP school’s push to expand AI course offerings, including the College’s new AI minor and future major, is a faculty-driven initiative more than an operational imperative from the administration. He explained that this approach to augmenting learning with AI tools might be shared across other departments, especially with the College’s October launch of the ChatGPT Edu program, which provides select departments with an enterprise-level subscription to OpenAI’s premium services.
“We're already working with other departments on campus to build AI courses in their departments that students could take for credit towards this degree,” Schmidt said. “So, for example, we're working with the music department right now. I think it's called Musical AI, what they're developing, and we're hopeful that the course could actually contribute as an elective towards this degree program too.”
Schmidt explained that the College has been able to fast-track its addition of AI programs due to a lack of pre-approval requirements from the Commonwealth of Virginia, allowing the CDSP school to develop new course curricula and degree programs in an on-the-go, experimental fashion.
“The reason why the AI minor came first was because we could do that without having to get approval from the [State Council of Higher Education for Virginia],” he said. Schmidt also shared that there are plans to introduce a Bachelor of Arts in AI focused on teaching students how to augment their learning at the College. However, it is unclear if the commonwealth will approve such a major.
In an interview with Flat Hat Magazine, Provost Peggy Agouris revealed that the College, like countless other institutions nationwide, has been largely overwhelmed by the speed at which AI has woven its way into higher education. She shared that past technological advancements were much easier to navigate because they did not threaten to reshape the long-standing processes of knowledge acquisition on which higher education models are fundamentally based.
“I was [teaching] a certain way throughout my career, and I was confident in what I was able to offer my students and how I was able to help them because understanding and assessing knowledge is to help you be better at what you do,” Agouris said. “Now, I don’t know anymore.”
Professor of History Lu Ann Homza expressed concern about the inhibitory effects that she and several of her colleagues believe AI is having on learning outcomes in classrooms, not only at the college level, but also for K-12 students.
“I don’t know if you’ve seen the latest statistics nationwide, that 12th-grade reading and mathematical scores have bottomed out,” Homza said. “They are really, really low, and we can’t tie that to COVID because COVID’s been over for a little while now. I think it is fair to say that many of my colleagues and I are concerned that students are turning to AI for easy answers and thus are not developing intellectually.”
Agouris echoed Homza’s concerns that AI will lead to a loss of vital cognitive and critical thinking skills that were previously acquired without heavy technological assistance.
“I love maps [and] I was very good at navigating before GPS came about,” Agouris said. “And so after that, I punch in things and I lost my ability to have this very great spatial cognition in how I do this. It’s a similar thing [with AI]. We’re losing some muscles that are very important.”
At the same time, Agouris emphasized her desire to not disturb the core tenets of academic freedom that have long been central to the College’s pedagogy. In other words, faculty members who wish to implement AI in their lessons should not be prohibited from doing so.
“I don’t want to interfere with the way our faculty are teaching their courses because I’m a firm believer in academic freedom,” she said.
Agouris addressed the rapidly evolving higher education landscape, which she said can no longer be taken for granted. She recognized the College’s need to remain flexible and adaptive in the face of external technological advancements while also safeguarding the core educational objectives of the Alma Mater of the Nation.
“[It is] a difficult time in terms of transitioning from a classical, ‘we know how it was’ kind of world, to a world that is changing on a daily basis,” Agouris said, concluding that AI is “not gonna go away.”
Homza voiced concern about not receiving clear disciplinary guidelines from the College on how to handle students’ use of AI in their assignments, which she said has put professors in a difficult position when managing individual cases.
“We as professors have not been given clear guidelines, in my opinion, as to how we can gauge whether or not AI is in play in our students’ assignments,” Homza said.
Homza commended the Studio for Teaching and Learning Excellence’s proactive guidance in proposing AI-related syllabus language, but she still feels that more needs to be done.
“[The Studio for Teaching and Learning Excellence has been] very proactive about workshopping for us and sending out proposed syllabus language for us about the use of AI and trying to establish really clear boundaries about when it’s useful for our classes and when it isn’t useful for our classes,” she said. “But many of my colleagues and I too often feel lost in this new environment.”
Homza pointed to what she sees as another major blind spot of the AI era.
“[I do not know] whether my students have read the articles or whether they’re simply asking summaries from ChatGPT about the articles,” Homza said.
Agouris acknowledged that such concerns are widespread among professors, but she also warned against hastily disciplining students for “AI-generated work” that might actually be their own.
“It might risk the situation where we are really penalizing something that was the student’s work,” Agouris said.
As the AI wave transforms society at breakneck speed, Runfola underscored the significance of this pedagogical debate by comparing it to major technological innovations of the past.
“[AI is] the most critical debate we’ve had in higher ed since the advent of the calculator,” Runfola said.