Universities are rethinking computer science curriculum in response to AI tools

Skye Jacobs

The big picture: The rapid rise of generative artificial intelligence is prompting a fundamental rethinking of computer science education in the US. As AI-powered tools become increasingly proficient at writing code and answering complex questions with human-like fluency, educators and students alike are grappling with which skills will matter most in the years ahead.

Generative AI is making its presence felt across academia, but its impact is most pronounced in computer science. The introduction of AI assistants by major tech companies and startups has accelerated this shift, with some industry leaders predicting that AI will soon rival the abilities of mid-level software engineers.

Universities are now reassessing their curricula. Some educators are considering moving away from an emphasis on mastering programming languages, instead exploring hybrid courses that blend computing skills with other disciplines. The goal is to prepare students for a future in which AI is deeply embedded across all professions.

This sense of urgency is heightened by a tightening tech job market. Graduates who once counted on abundant opportunities now face stiffer competition, as companies automate more entry-level coding tasks with AI.

Some experts suggest that computer science may increasingly take on the qualities of a liberal arts degree, placing greater value on critical thinking and communication skills alongside technical expertise. Mary Lou Maher, a computer scientist and director at the Computing Research Association, told The New York Times that the future of computer science education is likely to shift away from coding and instead emphasize computational thinking and AI literacy.

In response, the National Science Foundation has launched an initiative called Level Up AI, led by the Computing Research Association in partnership with New Mexico State University. The 18-month project brings together educators and researchers to define the essentials of AI education and share best practices. "A sense of urgency that we need a lot more computing students – and more people – who know about AI in the work force," is driving the project, Maher said.

Carnegie Mellon University, a longtime leader in computer science, is among the institutions reimagining their approach. This summer, faculty in its computer science department will meet to consider how best to adapt to the new landscape.

AI has "really shaken computer science education," said Thomas Cortina, a professor and associate dean for undergraduate programs. He supports a curriculum that combines foundational computing and AI principles with hands-on experience using new tools. "We think that's where it's going," he added. "But do we need a more profound change in the curriculum?"

At Carnegie Mellon, professors decide individually whether to allow AI in their classes. Last year, the university approved the use of AI tools in introductory courses. Dr. Cortina noted that many students initially saw AI as a "magic bullet" for completing programming assignments, but often "didn't understand half of what the code was." This realization, he said, has led many to refocus on learning to write and debug code themselves. "The students are resetting."

Across the country, students are adapting to these new realities with caution. Many use AI tools to build prototypes, check for errors, or answer technical questions, but worry that overreliance could dull their skills.

The job search has also become more challenging. Connor Drake, a senior at the University of North Carolina at Charlotte, said he felt lucky to get an interview after submitting 30 applications, eventually landing a cybersecurity internship at Duke Energy. "A computer science degree used to be a golden ticket to the promised land of jobs," Drake said. "That's no longer the case."

To stay competitive, Drake has broadened his studies with a minor in political science focused on security and intelligence, and he leads a university cybersecurity club. Like many of his peers, he's adjusting to a tougher job market. According to CompTIA, job listings for workers with two years of experience or less have fallen 65 percent over the past three years, while postings for all experience levels are down 58 percent.

Despite the uncertainty, many experts believe the market for AI-assisted software will continue to grow. Each wave of technological innovation – from personal computers to smartphones – has historically increased demand for software and programmers. This time, AI tools may enable people in many fields to build their own programs using industry-specific data.

"The growth in software engineering jobs may decline, but the total number of people involved in programming will increase," predicted Alex Aiken, a computer science professor at Stanford.

 
 
Dr. Cortina noted that many students initially saw AI as a "magic bullet" for completing programming assignments, but often "didn't understand half of what the code was."

LOL This! Applies to AI across all disciplines.

AI is like having an intern work for you. Sure, it can do work, but it can't be trusted alone with anything but the most basic tasks (and sometimes not even those).

But students do assume it will magically do the work - which, if it actually did, would leave said student with no skills to get hired.
 
I see this as a good thing. When I finished my Bachelor's and Master's almost a decade ago in an accelerated 5-year computer science program (double majored in math as an undergrad and got a minor in physics), several items that really needed to be in the curriculum were not. For example, the software engineering class didn't even touch on source control or testing, and it irked me to no end that CS1 and 2 were really just C++ Programming 1 and 2 - computer science isn't (just) programming!

There were plenty of classes that taught more important concepts, like data structures, algorithms, and principles of programming languages, but there needed to be a lot more of that. So much of good software design (if we just stay in the realm of programming) is information management, organization, and choosing the right tool (i.e., data structure) for the job, and there wasn't nearly enough of that. I specialized in data science in both the math and comp sci programs, and fortunately those classes were quite informative, as the discipline requires deep thought, not just getting some code to work.

Perhaps I'm placing too high an expectation on universities to embed that deep best-practice expertise in their students, but focusing less on syntax and programming means more room for the higher-level thinking, in my mind. The risk is making sure the AI doesn't become so much of a crutch that the higher-level thinking can't be translated into real implementations.
 
Critical thinking has been out of vogue for over a decade. One of the courses that helped me the most with work, social life, and critical thinking was an elective I took as a blow-off: Logic. Not computer logic... just plain logic. It was amazing how looking at problems and issues changed through that lens. Of course, my programming courses were in FORTRAN, and I typed my papers using Jane on my C-128. Not sure if anyone still bothers with it.

Feelings seem to be the guiding principle of the current generation, with TikTok as their primary information source. It will be interesting to see how things go when AI becomes their crutch.
 
LOL This! Applies to AI across all disciplines.

AI is like having an intern work for you. Sure, it can do work, but it can't be trusted alone with anything but the most basic tasks (and sometimes not even those).

But students do assume it will magically do the work - which, if it actually did, would leave said student with no skills to get hired.

And the more information you feed it, the more confused it becomes and the more it forgets earlier instructions.

The only benefit to AI is being able to yell and berate it without HR breathing down your neck.
 
Given that LLMs only work by copying what somebody else has already sweated over, what happens when the real people who create all the code these LLMs steal lose their jobs?
I wonder when, if ever, the world is going to wake up to these AIs and the damage they will do. Huge swathes of unemployment and social unrest while a tiny set of corporations like Google and Meta make huge amounts of money at everybody else's expense? Do we really want to put more money in the pockets of morally bankrupt human-horror-shows like Zuck, Bezos, Musk and Cook?
 