
Integrating Artificial Intelligence into Animal Research Facilities

UNC Reimagines Cores, Data Infrastructure, and Facilities Strategy in the Face of Federal Funding Shifts and Scrutiny of Animal Research
Published 5/13/2026

At the University of North Carolina at Chapel Hill (UNC), which ranks 11th in the nation in research expenditures, the question of how to integrate artificial intelligence (AI) into research facilities is no longer theoretical—it is urgent. The institution operates a vast and diverse research enterprise across multiple buildings. That scale, combined with rapid shifts in federal science policy, has prompted a fundamental reassessment of how research infrastructure is designed, operated, and managed.

Craig A. Fletcher, DVM, Ph.D., UNC’s director of the Division of Comparative Medicine, is leading a pan-campus task force to navigate that reassessment. His experience offers a candid look at the structural challenges facing large biomedical research programs—and a detailed vision for how AI and machine learning can help address them.

“We’re looking at how we can use AI and machine learning to improve our understanding in terms of our animal models, but also how technology for models—novel alternative models (NAMs)—could help improve human relevancy,” says Fletcher.

A Changing Funding Landscape

The university had been planning a major Translational Research Building (TRB)—at one point envisioned as a seven-floor, 140,000-sf structure—to accommodate its growing research program. That plan is now on hold. Reductions in federal agency funding have disrupted the financial model that was expected to fund the building's operations.

The broader funding environment has also become more competitive. Grant success rates, which once hovered around 11% to 12%, have fallen to roughly 3% to 4%, and the trend is not improving. Meanwhile, state investment in academic buildings—particularly in North Carolina—has been directed toward growing student enrollment, making standalone research buildings an unlikely near-term prospect.

Compounding the financial pressure is a policy shift at the federal level. Both the NIH and the FDA have signaled movement away from mandatory reliance on animal studies, with the FDA beginning to phase out certain animal research requirements. The university, which operates approximately 140 research cores—about one-third of them animal-based—has had to confront what that shift means for its research strategy and its physical plant.

Confronting the Limitations of Animal Models

Part of Fletcher’s task force work has involved a candid internal reckoning with the limitations of animal research—not to abandon it, but to conduct it more rigorously and communicate its value more effectively. 

There is also a communication problem. Fletcher points to a recent Nobel Prize-winning discovery about regulatory T cells and immunotherapy in which the research was conducted in animals, but that fact received no public mention. “We do a very poor job of communicating the value of animals in studies,” says Fletcher. Conversely, researchers sometimes overstate the translational significance of animal findings—claiming to have “cured” a cancer in a mouse without adequately addressing whether the result can be extrapolated to humans. Both tendencies, he argues, have eroded public trust and left the field poorly positioned to defend itself. Furthermore, animal research is often blamed when a drug fails in human trials. In reality, most drugs fail to reach the market because of a lack of effectiveness in humans, or because of commercial and strategic decisions, rather than safety issues missed by animal studies.

The AI Integration Imperative

Fundamental research and drug discovery are inherently difficult. Pivoting to AI and machine learning raises the question of how to leverage sophisticated analytics to improve rigor and reproducibility in research studies. Institutions’ traditional research cores often lack a shared database infrastructure; what is needed is a shared informatics infrastructure that unifies the data generated by animal model and NAM cores into a single, queryable knowledge system and allows data to move across cores. Fletcher describes a scenario in which a decade’s worth of imaging data from cancer studies—currently siloed—could be pooled into a single accessible repository. A researcher conducting a 20-mouse study could effectively draw on data equivalent to a 2,000-mouse study, improving the rigor of the work while reducing the number of animals each study requires.
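As a minimal sketch of what such a pooled, queryable repository might look like—the article specifies no schema, so the table, study names, and the tumor-volume endpoint here are all hypothetical—measurements from several formerly siloed studies can be combined into one table and queried together:

```python
import sqlite3

# In-memory database standing in for a shared, cross-core repository.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE imaging_measurements (
        study_id   TEXT,   -- originating study or core (hypothetical)
        subject_id TEXT,   -- individual animal
        tumor_mm3  REAL    -- hypothetical imaging-derived endpoint
    )
""")

# Data from three formerly siloed studies, pooled into one table.
rows = [
    ("study_A", "m01", 120.5), ("study_A", "m02", 98.2),
    ("study_B", "m01", 140.1), ("study_B", "m02", 133.7),
    ("study_C", "m01", 101.9),
]
conn.executemany("INSERT INTO imaging_measurements VALUES (?, ?, ?)", rows)

# A new small study can now be contextualized against every prior
# study's data with a single query instead of one silo at a time.
n, mean = conn.execute(
    "SELECT COUNT(*), AVG(tumor_mm3) FROM imaging_measurements"
).fetchone()
print(n, round(mean, 2))  # prints: 5 118.88
```

In practice the repository would hold far richer metadata (protocols, instruments, units), but the principle is the same: one schema, one query surface, every core's data in scope.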

AI and machine learning can improve animal and even non-animal modeling by classifying data, supporting predictive modeling in growth studies, and enabling home-cage monitoring to identify mouse behaviors that serve as surrogates for human disease markers. The technology can replace subjective assessments—manual measurements with calipers, for example, which introduce significant variability across researchers—with objective, automated analysis that removes human bias from the evaluation of study outcomes.
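As an illustrative sketch of the principle (not UNC's actual pipeline—the function name, mask, and pixel scale below are invented for the example), an imaging-derived measurement computed from a segmentation mask is deterministic: every researcher gets the same number from the same image, unlike caliper readings:

```python
import numpy as np

def lesion_area_mm2(mask: np.ndarray, mm_per_pixel: float) -> float:
    """Area of a segmented region, computed identically for every rater.

    mask: boolean array where True marks the segmented region.
    mm_per_pixel: physical size of one pixel edge.
    """
    return float(mask.sum()) * mm_per_pixel ** 2

# Toy 6x6 mask standing in for a model-generated segmentation.
mask = np.zeros((6, 6), dtype=bool)
mask[2:5, 1:5] = True  # a 3 x 4 = 12-pixel region

print(lesion_area_mm2(mask, mm_per_pixel=0.5))  # prints: 3.0
```

The measurement itself carries no rater judgment; variability is pushed upstream into the segmentation model, where it can be validated once rather than per researcher.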

The university is also looking at how agentic AI can identify gaps where current models fail to capture the experimental and biological details needed to improve discovery. Fletcher describes this as a two-way street: animal model data can train AI systems that are then applied in new contexts, such as NAMs, improving both animal and non-animal model development, while AI tools can help identify which animal model systems are most appropriate for a given research question. For example, by mining large, high-dimensional datasets, deep learning could help generate new mechanistic hypotheses about biological circuits and behavior.

The Data Infrastructure Challenge

The strategic opportunity is clear, but the operational path is not straightforward. One of the central challenges is that researchers often lack the tools and training to leverage the data they generate. Researchers building custom instrumentation—tracking calcium signaling in the brain, pupil reflex, and motion simultaneously while dosing an animal—often rely on research staff to write custom software for data capture and analysis. The result is a highly fragmented data landscape that makes systematic analysis across studies nearly impossible, with each graduate student maintaining a separate Excel spreadsheet or paper notebook.

“One of the things we’re considering is encouraging researchers to digitize their lab notebooks,” says Fletcher. The goal is to centralize that digital data, creating a repository that can be queried, shared, and built upon.

The university is also confronting a technological workforce gap. As research cores integrate new technology, staff will need training to support it and the data it generates. The university has created a Research Data Management Core (RDMC) to provide the resources and support needed to harness the full potential of research data. Data engineers and software engineers are embedded in research support operations to help investigators use the information being generated, and the RDMC offers consultations and training, data curation and archiving, and shared tools and infrastructure for data management and sharing.

Facilities Design in an AI-Enabled Research Environment

The implications for building design are significant. “How do you design a building that’s going to need and hold and be accessible to all this data?” asks Fletcher. The university’s evolving facilities strategy reflects a broader redefinition of what a vivarium is. Traditional animal research buildings were organized around species—rodents, large animals, aquatic species. The new model encompasses non-traditional models such as invertebrates, marine organisms, tissue-on-a-chip platforms, organoids, and in silico modeling alongside conventional vivaria.

The task force’s broader mandate is to create a coordinated core system—one that responds to funding pressures and the need to leverage applied physical sciences, advanced analytics, and AI. The goal is a unified front door to all the institution’s cores, enabling scientists to access the full spectrum of available models, from conventional rodent models to organoid platforms to computational approaches, through a single entry point.

“The idea would be that you come in a front door, and now all the cores and their data are accessible to you,” says Fletcher. The data generated could be used not only for internal research but potentially to attract industry partnerships—pharmaceutical companies and contract research organizations that currently outsource preclinical work, unaware of the full scope of the university’s research ecosystem.

Engagement would happen earlier in the research process—before tissue is harvested, before a model is chosen—so that researchers receive guidance on selecting the most appropriate and rigorous approach for their question. The task force is also developing a repository of standard operating procedures and training materials for researchers.

Animal Research Is Not Going Away—But It Is Changing

Asked directly whether regulatory agencies will accept research packages without animal data, Fletcher is measured. Federal language from the FDA, NIH, and even the U.S. Navy indicates that animal research may not be required for approval—not that it is prohibited. The institution has no plans to exit animal research. What it anticipates is a growing role for non-animal models alongside traditional approaches, and recognizes that the transition to NAMs will bring its own challenges. “The reality is that, with (NAMs), you’re going to have the same issues with rigor and reproducibility. It’s going to be the Wild West,” says Fletcher.

Some phenomena—studying behavior, interrogating systemic immune function, assessing off-target effects—cannot be adequately addressed in cell culture or computational models. Animal models may be needed to validate and refine novel alternative models before those alternatives are ready for broader use. The path forward is not a binary choice between animal and non-animal research, but an integrated, data-connected ecosystem in which each model type informs and strengthens the others—and in which AI provides the analytical backbone to make that integration possible.

***

To learn more about how researchers are using artificial intelligence to maximize their outcomes, plan to attend the Tradeline Animal Research Facilities 2026 conference. Join us in Boston in August!