Santa Clara University Bets on AI, and on Doing It Differently

Students walk past the Sobrato Campus for Discovery and Innovation, which houses Santa Clara University's School of Engineering. The School of Engineering will oversee the new Cunningham Shoquist Center for Applied AI and Human Potential. (Nina Glick/The Santa Clara)

On a campus surrounded by the world’s most powerful AI companies, Santa Clara University is betting that its greatest contribution won’t be speed—but restraint.

On April 14, 2026, the University officially announced the Cunningham Shoquist Center for Applied AI and Human Potential, an interdisciplinary hub designed to bring together students, faculty and industry partners to develop artificial intelligence technologies for the “common good.”

The center, funded by a nearly $25 million gift from Nvidia executive and alumna Debora Shoquist ’76 according to The Mercury News, marks the University’s fourth “Center of Distinction” and its most direct entry yet into the rapidly evolving AI landscape. Beyond the announcement, University leaders say the Center’s real promise lies in the opportunities it will create—for research, industry partnerships and hands-on student learning.

President Julie Sullivan first announced a future unnamed AI Center at her State of the University on Feb. 17, 2026. (Elaine Zhang/The Santa Clara)

The announcement comes as AI adoption accelerates rapidly. According to Stanford’s Artificial Intelligence Index Report 2025, 78% of organizations reported using AI in 2024, while global investment continues to surge. At the same time, public trust remains fragile, with growing concerns over bias, misuse and corporate responsibility.

University leaders say the center is designed to meet that moment.

“This is a huge opportunity for Santa Clara,” said School of Engineering dean Kendra Sharp, pointing to plans to expand research, strengthen graduate programs and deepen connections with Silicon Valley companies. She described the center as a “focal point” that will bring together faculty, students and industry partners, with projects ranging from healthcare applications to transportation safety and education.

Sharp envisions students working directly with companies through capstone projects, research collaborations and industry-sponsored initiatives. The goal is to make the center a “go-to hub” where companies can bring real-world problems for students to solve, while also giving students access to cutting-edge tools and data.

President Julie Sullivan echoed that vision, emphasizing collaboration as central to the center’s mission. “AI is advancing so quickly,” she said, noting that partnerships between students, faculty and industry will help ensure the University remains connected to the “frontiers of AI innovation.” 

Pages from the Anthropic website and the company's logos are displayed on a computer screen in New York on Thursday, Feb. 26, 2026. Earlier this year, President Donald Trump’s administration denounced AI developer Anthropic as a security threat after the company’s attempt to prevent its AI technology from being deployed in fully autonomous weapons or surveillance of Americans. (AP Photo/Patrick Sison).

Yet the same industry partnerships that create opportunity also raise ethical questions.

AI companies have faced increasing scrutiny for their work in defense, surveillance and global security. Firms like Palantir, which holds major government contracts, have drawn criticism from scholars who warn of AI’s growing role in military strategy and geopolitical power.

At the same time, both Sullivan and Sharp stressed that the center will be shaped by Santa Clara University’s Jesuit values, particularly around ethics and human dignity.

“We want to make sure it’s helping augment human potential and human flourishing, and not substituting for that human connection,” Sullivan said. 

Sharp similarly emphasized that technical expertise alone is not enough. As AI becomes more integrated into daily life, she said, students will need strong critical thinking and communication skills to use the technology responsibly. “People need to use AI as a tool for the common good,” she said, underscoring the importance of ethical decision-making alongside technical training.

That ethical focus is built into the Center’s design. According to the University’s announcement, the center will address issues such as fairness, transparency, privacy and safety while developing AI applications across fields like healthcare, robotics and information systems. 

For staff at the University working in ethics, that responsibility is central to the center’s potential impact.

Subbu Vincent, who leads media ethics initiatives at the Markkula Center for Applied Ethics, said the new AI center could expand work already underway—but only if it remains focused on human outcomes.

“The goal really is to define problems where the human aspect is the center,” Vincent said. “AI becomes a means to that end.”

That perspective echoes broader conversations within the Catholic Church about artificial intelligence. In a recent Vatican News article on AI, Pope Leo XIV warned against reducing human identity to technological systems, emphasizing the importance of preserving “human voices and faces” in an increasingly automated world.

Still, even as administrators highlight new opportunities, some students say the University must first address gaps in how AI is currently taught.

“Nothing I know now about AI came from SCU classes,” said Sean Wu ’27, a former AI Collaborate club president who has worked on real-world AI deployments. “A lot of it was through clubs and independent projects.”

Wu said one priority should be expanding AI literacy beyond technical majors. “Every major should have some form of AI integration,” he said, pointing to skills like prompt engineering and everyday tool use.

Michael Iwashima ’25, a bioengineering graduate and founder of AI Collaborate, emphasized the importance of accessibility across disciplines.

“Bringing in people from different majors that have nothing to do with computer science, and bringing them in to learn fundamentals and learn how to apply AI creates new innovation in society,” he said.

Wu and others described existing AI coursework as foundational but outdated, often focused on basic algorithms rather than modern tools or real-world applications. For them, the center’s success will depend on whether it improves not just research output, but everyday learning.

Lucas Amlicke ’25, a computer science graduate, said existing coursework often focuses on fundamentals but lacks exposure to more advanced or current applications.

“For a school that’s in Silicon Valley, it’s kind of disappointing,” he said, adding that students should be introduced earlier to more complex, real-world AI problems.

Wu also pointed to cost barriers. Many AI tools require subscriptions, which can add up quickly for students. “You add a couple of them up, and it’s like $100 a month,” Wu said, arguing that broader access could be just as transformative as new coursework. 

Faculty, too, see both opportunity and uncertainty. Michael Kevane, an economics professor in the Leavey School of Business and Faculty Senate president-elect, said there has been little formal discussion in the Faculty Senate so far, reflecting how early the center still is in its development. 

“A robust center would have student fellows at the forefront of understanding the technology and understanding the ethics,” Kevane said.

For now, many details about the center remain publicly undecided, including how it will choose partners and prioritize projects.

The University is attempting not just to build another AI hub, but to define a different model—one that combines technical innovation with ethical scrutiny and broad accessibility.

Whether that model succeeds may depend on how well the center can balance competing demands: industry and independence, speed and reflection, innovation and responsibility.

For now, Santa Clara University is aiming to strike all three balances—building a center that not only develops AI technologies, but shapes how they are used.
