Assessing Research Security Efforts in Higher Education: Proceedings of a Workshop (2025)


Suggested Citation: "7 Concluding Thoughts on Metrics and Data for Assessing Research Security Efforts in Higher Education." National Academies of Sciences, Engineering, and Medicine. 2025. Assessing Research Security Efforts in Higher Education: Proceedings of a Workshop. Washington, DC: The National Academies Press. doi: 10.17226/29241.

7

Concluding Thoughts on Metrics and Data for Assessing Research Security Efforts in Higher Education

During the final session, members of the workshop planning committee identified their key takeaways from the 2 days of the workshop; these appear below. Also included in this chapter are suggestions from workshop participants of potential metrics and data that could be used to assess research security efforts.

KEY TAKEAWAYS FROM WORKSHOP PLANNING COMMITTEE MEMBERS

Murdick identified four key takeaways. First, he said, “people are the most important part of anything we’re doing here—they’re the knowledge transfer mechanism.” Second, “we need a balanced set of measures that look for not only the success of any process that was going forward, but also the cost of it.” Third, we “need a robust national security emerging technology analyst capability that really does become a reservoir of knowledge that allows prioritization information” and “core infrastructure for some of the monitoring.” Fourth, it is necessary to adopt a cost-sustainable model to pay for research security.

Caputo said there is an ongoing need for active data collection; passive data that measure whether research security initiatives are successful do not exist. Broader data-sharing is needed to facilitate holistic measurement of the effects of research security efforts. NSF has useful data on proposals, but DOD does not have access to that material. DOD has other metrics that can be useful, however.

  • Recognize that people are the most critical part of the discussion.
  • Create a balanced set of measures that assess both success and cost of research security.
  • Develop a robust national security emerging technology capability, with core infrastructure for monitoring, to support the prioritization of issues.
  • Adopt a cost-sustainable model for research security activities.

Dewey Murdick

The NSF SECURE Center will have data to support evaluation, but decisions will need to be made about data-sharing, Caputo continued. Research security is complicated to measure because it is related to so many other things. “The goal is to measure effectiveness of the programs, the know-how, the training, the awareness, the trust frameworks,” she said. Challenges in measuring the effectiveness of research security efforts, she added, mirror those faced in improving cybersecurity.

Humphrey said that, in considering what data to collect and what to measure, there is also a need to consider the issue of trust—specifically, how to build trust into these metrics. She suggested that public trust in science is currently at an all-time low and that the public needs to trust the research in order to trust their investment in that research. “In thinking about research security and what we are measuring, we also have to tie it back to how can we also convey to the public that we are taking this issue seriously, that it matters to us, and that the data that we’re putting out is quality data.” Personal relationships are critical. “When we go out to talk to faculty about research security, we have to make it personal,” she said, by, for example, demonstrating how research security is important to their field or discipline.

Jones said, “We’re in a race. So, when you’re in a race, that is directive to what we should measure.” But it is also “a series of races,” he said. “We’re racing on AI. We’re racing on quantum computing. . . . So, the question is, are we ahead of our competitors and how fast are we moving? How fast are we moving and how fast are they moving?”

Jones said measures on the output side (e.g., numbers of papers and patents) are needed. Another measure would relate to technological capability: what, for example, is the state of the art for a particular technology? Emerging critical technologies are another area to consider; these could be assessed using counts of patents and papers as well as measures of technological capability and industrial leadership.


On the input side, Jones suggested measures related to effort, including

  • How many people are working on research?
  • How much are we spending on it?
  • How fast are we moving?
  • How fast are our adversaries moving?
  • How much are we drafting or bleeding ideas to competitors?

“People are a key input to going faster in our race, but they’re also the place where we have leakage,” Jones said. He suggested that measures of people flow offer an opportunity to assess how much talent is being lost.
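
Jones’s race framing reduces to simple comparative arithmetic. The following minimal sketch, with entirely hypothetical counts (no data from the workshop), shows how two of his measures, output growth and net people flow, might be computed side by side:

```python
# Illustrative sketch of the "race" measures described above.
# All counts are hypothetical; real inputs would come from publication,
# patent, and personnel records.

def yoy_growth(series):
    """Year-over-year growth rates for a list of annual counts."""
    return [(curr - prev) / prev for prev, curr in zip(series, series[1:])]

# Hypothetical annual publication counts in one emerging-technology area.
us_papers = [1200, 1350, 1500, 1620]
competitor_papers = [900, 1100, 1400, 1750]

print("U.S. growth:      ", [f"{g:.0%}" for g in yoy_growth(us_papers)])
print("Competitor growth:", [f"{g:.0%}" for g in yoy_growth(competitor_papers)])

# Hypothetical people flow into and out of U.S. research institutions.
inflow, outflow = 450, 380
print("Net talent flow:", inflow - outflow)  # positive = net gain; negative = leakage
```

In this invented example the competitor’s growth rate exceeds the U.S. rate even though its absolute output is lower, which is precisely the “how fast are they moving” signal Jones described.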

Jones suggested that “the key thing is to do research security and avoid unintended consequences because remember, the race is going fast. . . . What we’re trying to win is the race. How fast are we going?” There is a need, he said, for the best talent in the world to come to the United States, and then a need to hold on to that talent: “Any policy that dissuades people from coming to the U.S. is a bad policy unless it has some other strong features.”

Regarding compliance efforts, Jones advocated for choosing “what areas we’re going to do this on and then really go for openness otherwise.” Closed systems “will slow our adversaries, but [also] . . . slow ourselves.”

Jones suggested that, as regulators move forward, agencies need to think about design partnerships. Policies need to be implemented in a nimble and cooperative way with an awareness of unintended consequences. “If I look behind in the race, you might see some drafting. I see a lot of potential. We’re going to win that race because we pull in the best and the brightest in the world.” Greater efficiency is needed “not because we’re going to have more savings dollars, . . . not because we have more people,” but because the U.S. research community has better people and does it better.

Kohler agreed that people are the most vital component. The research community needs, he said, “to find ways to get the smartest people to come to the United States and stay in the United States and contribute to society, and our innovation here. That’s just the bottom line.” Universities and the research environment must, however, adapt to the way the world really is today as opposed to where the world was 20 years ago. Policies by China and other countries have created friction and competition with the United States and an uneven playing field. The U.S. government has reacted to that. Those who are “key to innovation are between a rock and a hard place and they’re getting ground up by these two superpowers that can’t figure out how to get along and how to make this work.”


It is up to us to educate others on how to better manage the situation so that universities remain innovative powerhouses, Kohler said.

McQuade said that the United States ran the world’s best foreign talent acquisition program for 75 years. The illicit extraction of technology is a real issue, he said. But the bigger issue is the creation of a talent base in China. Thus, he continued, it should come as no surprise that China is following the U.S. model of trying to get the best people to come and work in China. “We will pay a significant price for no longer being the world’s best foreign-talent acquisition program.”

McQuade said that, while there is a need for what controlled unclassified information (CUI) designations do, CUI is unregulated and lacks specifications; as a result, it is ultimately arbitrary and ineffective.

From a measurement point of view, McQuade called for a focus on measuring the United States’ competitive position in the technologies that matter. Is the United States competitive in the places it needs to be competitive? While the country can also measure people flow, collaborations, near misses, and events, “if we do not have an assessment of where we are competitively and have a set of goals to change [our position], none of those other measurements really matter to us.”

According to McQuade, the world is “vastly different than 1945. We have a peer competitor. We’re not in front of everybody else. . . . We need to start to say where we want to be competitive. We, as a society, will determine how much money we want to spend on research to be competitive.”

Fox said that, while effective ways to collaborate need to be developed so that the United States wins the race, we need to understand our current national goal. Do we want to collaborate and allow for good collaboration, or is the goal to start to pull away from collaboration? Measures of effectiveness are difficult to establish, she said, without really understanding that alignment question.

SUGGESTIONS FROM WORKSHOP PARTICIPANTS OF POTENTIAL METRICS AND DATA FOR ASSESSING RESEARCH SECURITY EFFORTS

Throughout the workshop, participants suggested metrics and data that might be used to assess research security efforts in higher education. Box 7-1 summarizes their suggestions of potential metrics, and Box 7-2 summarizes their suggestions of potential data.


BOX 7-1
Workshop Participant Suggestions on Potential Metrics for Assessing Research Security Efforts in Higher Education

Core Principles for Metrics and Evaluation

  • Effective assessment should include both positive outcomes (e.g., reduced risk) and unintended consequences (e.g., talent loss, collaboration decline).
  • Metrics should consider protection (e.g., preventing foreign interference) and preservation (e.g., maintaining openness and innovation).
  • Metrics should reflect variation by field, institution type, risk profile, and national strategy (e.g., whether to lead collaboratively or decouple strategically).
  • Metrics should avoid over-reliance on proxy indicators (e.g., co-authorship, nationality).
  • Metrics should be calibrated, meaningful, and minimally distortionary.
  • Metrics should be aligned with strategic national goals, such as competitiveness in emerging technologies.

A coherent, effective research security evaluation strategy should

  • be people centered, risk informed, and evidence based;
  • integrate behavioral, operational, and strategic dimensions; and
  • promote national competitiveness without stifling the innovation ecosystem.

Categories of Metrics and Indicators

  1. People and Talent Flows
    • Who enters, stays, or exits U.S. research institutions
    • Retention of international postdocs and graduate students
    • Shifts in proposal submissions
    • Participation in or withdrawal from federal funding
  2. Behavioral and Cultural Indicators
    • Faculty-initiated disclosures, consultations, or training engagement
    • Conference and travel behavior, risk perception, and trust
    • Uptake of institutional training and follow-up awareness
    • Near-miss reporting as evidence of embedded awareness
  3. Operational and Institutional Metrics
    • Research time spent on compliance
    • Costs associated with compliance (e.g., costs associated with research security program staff hires)
    • Case tracking, emergency outreach
    • Effectiveness of scenario-based planning and vetting tools
    • Use of risk matrices, technology readiness levels, or technology transfer safeguards
  4. National Security and Innovation Impact
    • Adversary access to or replication of U.S. research
    • Slowing of innovation, publication rates, or technology transfer
    • Changes in collaboration with flagged institutions/entities
    • Long-term reduction in verified security breaches

Tools, Frameworks, and Evaluation Approaches

  1. Methodological Frameworks
    • Probabilistic risk analysis (estimate technology loss, adversary intent, and consequences of loss; a minimal illustrative calculation appears after this list)
    • Triage/tiered risk models (by technological readiness level, technology domain, or researcher status)
    • Pilot programs (phased rollouts with review checkpoints and sunset clauses)
    • Just culture models (focus is on proactive, non-punitive reporting and cultural learning)
  2. Technology-Enhanced Tools
    • AI-supported proposal screening and decision tools
    • Bibliometric and network analysis
    • Platforms such as IRIS and NSF SECURE Analytics for federated data-sharing
  3. Best Practice Models
    • DARPA Security Checklists
    • NIST Risk-Balanced Culture Model
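
To make the probabilistic risk analysis entry above concrete, the sketch below works through a probability-times-consequence calculation for a single hypothetical project. Every number is an assumption chosen for illustration, not an estimate offered at the workshop:

```python
# Illustrative probability-times-consequence calculation for one project.
# All inputs are hypothetical assumptions.

p_intent = 0.30             # assumed probability an adversary targets the technology
p_loss_given_intent = 0.10  # assumed probability of loss, if targeted
consequence = 50_000_000    # assumed cost of a loss, in dollars

expected_loss = p_intent * p_loss_given_intent * consequence
print(f"Expected loss: ${expected_loss:,.0f}")  # $1,500,000

# A mitigation that halves the loss probability can be weighed against its
# own cost (staff time, compliance burden, slowed collaboration).
mitigation_cost = 400_000
residual_loss = p_intent * (p_loss_given_intent / 2) * consequence
print(f"Net benefit of mitigation: ${expected_loss - residual_loss - mitigation_cost:,.0f}")
```

Framed this way, a mitigation is defensible only when the reduction in expected loss exceeds its own cost, which connects directly to the cost-benefit items in Box 7-2.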

Challenges and Gaps

  1. Measurement Difficulties
    • Theft of fundamental research is often invisible and unquantifiable
    • Case counts do not reflect actual risk or prevention success
    • Training metrics do not equal awareness or behavior change
    • Inconsistent agency guidance and classification systems
    • Data collection can drive behavior (e.g., lead to the minimization or maximization of numbers to align with a desired outcome)
  2. Systemic and Infrastructure Barriers
    • No unified or standardized national evaluation framework
    • Lack of tools for scalable institutional risk categorization and monitoring
    • Poor visibility into
      • Compliance cost trade-offs
      • Cultural resistance or disengagement
      • Impact of policy on research productivity

BOX 7-2
Workshop Participant Suggestions on Potential Data for Assessments of Research Security Efforts in Higher Education

People and Talent Flows

Tracking human capital is essential, as people are vectors of both innovation and potential risk.

  • Mobility and Retention
    • People flows in high-talent STEM areas, including flows between other countries
    • Where PIs, postdocs, and graduate students go after federal funding ends
    • Numbers of students from other countries seeking to remain in or leave the United States
    • Whether researchers are remaining in the United States, moving abroad, or moving to the private sector and startups
    • Whether researchers are avoiding federal funding because of compliance burdens
  • Demographics and Participation
    • Who is applying for funding (nationality, career stage, discipline)
    • Changes in international student and faculty retention rates
    • Participation in and withdrawal from foreign talent programs
  • Security Incidents Involving People
    • Disclosures of affiliations or conflicts of interest
    • Investigations, self-reported concerns, or “near misses”
    • Pre- and post-travel briefings and their outcomes

Research Activity and Behavior

Understanding how research behavior shifts under security policy pressure.

  • Proposal and Funding Trends
    • Changes in submission rates to federal agencies
    • Shifts in collaboration patterns (international co-authorship, subcontracting)
    • Funding streams
  • Collaboration Dynamics
    • Institutional tracking of international engagements, contracts, memorandums of understanding
    • Engagement with entities on proscribed or sensitive lists
  • Compliance Indicators
    • Timeliness and accuracy of disclosure forms
    • Number of faculty-initiated consultations or requests for clarification
    • Institutional response rates to new federal security protocols
    • Number of risk mitigation plans required by funding agencies

Institutional Practices and Culture

Evaluating infrastructure, awareness, and adaptability within institutions.

  • Training and Awareness
    • Completion rates and effectiveness of research security training
    • Number and quality of scenario-based or peer-led educational events
    • Culture assessments (e.g., surveys on trust, awareness, engagement)
    • Faculty understanding of research security, risk reduction, and foreign interference
    • Number of queries about research security issues
  • Administrative Capacity
    • Staffing levels in research security offices
    • Rates at which research security staff engage with faculty or administrators
    • Cross-functional coordination (IT, legal, travel, international offices)
  • Policy Integration
    • Implementation of risk-tiered frameworks (e.g., technology readiness level-based controls)
    • Use of tools such as risk matrices, export control flags, and vetting procedures
    • Application of “light touch” vs. restrictive approaches based on project risk

Outcomes and Impact on Innovation

Measuring security effectiveness and unintended consequences.

  • Research Outputs and Spillovers
    • Changes in publication rates, citation impacts, and patent filings
    • Delay or redirection of research due to security restrictions
    • Migration of research to private sector or foreign institutions
    • Scientific impacts, international research impact, and impact on disciplines
    • Research areas where the United States is not in the lead
    • Numbers of patents, papers, and technology capabilities in emerging technology areas
    • Return applications for funding, including federal funding
    • Award amounts, research characteristics, and technology areas
  • Loss Prevention and Detection
    • FBI, DOD, and CIA reports of known leaks, intellectual property theft, or foreign exploitation
    • Faculty reporting suspected knowledge misappropriation
    • Use of bibliometric tools to detect “excess simultaneity” (duplicate discoveries; see the illustrative sketch at the end of this section)
    • Rates of drafting or bleeding ideas to competitors
  • Innovation Health Indicators
    • Retention of global talent in sensitive fields
    • Comparative metrics versus competitors (e.g., China, European Union)
    • University participation in dual-use and critical technology areas
    • Number of proposals requiring (or not requiring) risk mitigation
    • Number of patents filed by academic institutions
    • Number of patent licenses issued by academic institutions
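
As a toy illustration of the “excess simultaneity” idea above, the sketch below flags pairs of papers from different countries whose abstracts are unusually similar and appear close together in time. The papers, threshold, and similarity measure are all invented for illustration; production bibliometric tools rely on far richer features:

```python
# Toy "excess simultaneity" detector: flag cross-country paper pairs with
# highly similar abstracts published within a year of each other.
# All records below are invented for illustration.

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two texts."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

papers = [
    {"id": "US-1", "country": "US", "year": 2023,
     "abstract": "novel lattice surgery scheme for fault tolerant qubits"},
    {"id": "CN-1", "country": "CN", "year": 2023,
     "abstract": "a novel lattice surgery scheme for fault tolerant qubits"},
    {"id": "EU-1", "country": "EU", "year": 2021,
     "abstract": "survey of classical error correction methods"},
]

THRESHOLD = 0.8  # assumed similarity cutoff for review
for i, p in enumerate(papers):
    for q in papers[i + 1:]:
        if (p["country"] != q["country"]
                and abs(p["year"] - q["year"]) <= 1
                and jaccard(p["abstract"], q["abstract"]) >= THRESHOLD):
            print(f"Flag for review: {p['id']} vs {q['id']}")
```

On this invented data only the US-1/CN-1 pair is flagged; a flag is a prompt for human review, not evidence of misappropriation.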

Strategic Alignment and Policy Evaluation

Assessing policy coherence, effectiveness, and alignment with national goals.

  • Policy Consistency
    • Harmonization of guidance across federal agencies
    • Clarity of fundamental versus restricted research boundaries
  • Metrics of Effectiveness
    • Reduction in incidents or vulnerabilities
    • Case studies of successful mitigation (qualitative and quantitative)
    • Surveys on perception of policy fairness, fear, and engagement
  • Cost-Benefit Analysis
    • Costs associated with implementation of research security initiatives
    • Administrative burden versus security gains
    • Opportunity costs (e.g., lost collaborations, innovation slowdown)
    • Funding overhead adequacy and sustainability