Caputo moderated the final workshop panel session. The panel considered data that DOD and other funding agencies may need to collect to measure the effectiveness and performance of research security measures.
Caputo said that relevant data might include data on
With regard to the effectiveness of research security measures, data could be collected on
The session’s first panelist, Gregory F. Strouse (NIST), discussed the agency’s approach to research security and its research security publication
Safeguarding International Science: A Research Security Framework.1 He said that stakeholders need to work together to create a clear, data-driven vision and strategy for research security. Data on research security are critical for institutional leaders and bench scientists.
Strouse said that NIST conducts a security review of everyone who works at the agency. This includes research associates, members of the research security team, subject matter experts, and agency leadership.
Strouse discussed National Security Decision Directive 189 (NSDD-189), a presidential directive that states, “To the maximum extent possible, the products of fundamental research [shall] remain unrestricted.” Furthermore, the directive states, “Where the national security requires control, the mechanism for control of information generated during federally funded fundamental research in science, technology and engineering at colleges, universities and laboratories is classification.”2 Strouse said that both fundamental research and IP are targeted by countries of concern, noting that the U.S. critical emerging technology list3 includes 18 technologies with 136 subcategories. “If you’re doing research in [the emerging technology] space, you’re a target,” he said.
Strouse referenced a recent Chinese news article about a Chinese citizen (a professor who was educated and worked in the United States) who was recruited back to China. The professor was quoted as saying: “I’m sending all of my students to the U.S. to collect all the intellectual property and bring it home to the motherland.” Strouse asked whether the exchange of fundamental and applied research is as benign as it was when NSDD-189 was published in 1985. Benefits, he said, must outweigh risks. He noted that NIST’s chief counsel has said that balkanizing federally funded research conduct and reporting could adversely affect R&D in the United States.
___________________
1 The framework is “designed to enable organizations to implement a mission-focused, integrated, risk-balanced program through the application of research security principles and best practices that fosters the safeguarding of international science while mitigating risks to the integrity of the open collaborative environment.” See Strouse, G. F., Saundry, C., Wood, T., Bennett, P., and Bedner, M. 2023. Safeguarding International Science: Research Security Framework. NIST Internal Report 8484, https://doi.org/10.6028/NIST.IR.8484, p. ii.
2 See White House. 1985, September 21. National Security Decision Directive 189: National Policy on the Transfer of Scientific, Technical and Engineering Information, https://irp.fas.org/offdocs/nsdd/nsdd-189.pdf.
3 See National Science and Technology Council. 2024, February. Critical and Emerging Technologies List Update. Executive Office of the President, https://bidenwhitehouse.archives.gov/wp-content/uploads/2024/02/Critical-and-Emerging-Technologies-List-2024-Update.pdf.
Strouse said he sees NSDD-189, NSPM-33, and NIST Internal Reports 84844 and 84815 as working together in a positive way to accelerate science while safeguarding the ability to conduct international science.
From a cost perspective, Strouse continued, the research community must never lose sight of the fact that the recipe for the highest-end semiconductor costs about half a billion dollars. The loss of that recipe before it goes into production is significant. Mitigating risk for researchers includes protecting knowledge prior to publication. Fundamental research, Strouse said, is being used to decide what standards to write. He said that competitor nations review fundamental research conducted in the United States, bet on which research will result in commercial applications, and go to standards organizations to create “pre-monopolies in those technologies if they come to fruition.” Strouse suggested that such efforts do not cost competitor nations much, but the endgame has a very important commercial benefit.
Strouse said that NIST’s culture supports research security. The institute has operationalized the implementation of research security measures across five primary areas: foreign and domestic guest researchers, funding opportunities, collaborations and publications, “things that we sell,” and foreign travel. Strouse said that nearly every NIST staffer understands that they are an important part of the research security team as they have learned from the agency about the importance of research security for their work.
Strouse measures the success of research security efforts by the number of researchers who come to him with questions or concerns about a potential research security issue. He said the volume of interactions indicates that those conducting research for NIST are aware of research security issues and comfortable asking questions about them. “We’re there to provide solutions,” he said. When individuals come forward and accept that “this is a cultural change, not a compliance-driven change,” the effort has been successful.
Session panelist Jason Owen-Smith (University of Michigan) said that to understand and mitigate research security threats it is necessary to start with a laser focus on people. Innovation flows through social networks and moves with researchers who carry know-how and expertise across boundaries.
___________________
4 See Strouse, G. F., Saundry, C., Wood, T., Bennett, P., and Bedner, M. 2023. Safeguarding International Science: Research Security Framework. NIST Internal Report 8484, https://doi.org/10.6028/NIST.IR.8484.
5 See LaSalle, C., Howell, G., and Martinez, L. 2023, August 31. Cybersecurity for Research: Findings and Possible Paths Forward, NIST Interagency Report 8481, https://doi.org/10.6028/NIST.IR.8481.ipd.
Owen-Smith noted that discussions around research security frequently center around critical and emerging technologies. Such technologies evolve rapidly, and their development is challenging to monitor and track.
Owen-Smith said that it is challenging to identify the data needed to assess the effectiveness of research security measures, as “they’re simultaneously everywhere and nowhere.” This means that new measurement methods are needed. Furthermore, it is difficult to bound data, as data impact and complement each other: some data are upstream and other data are downstream.
Meaningful data may be collected from analysis of university technology transfer invention disclosures or patent applications filed by companies, government agencies, or universities. Data may also be mined from research citations. Owen-Smith noted, however, that such data are imperfect. He suggested that measurements should focus on people and networks, examining, for instance, stocks and flows of human and social capital. People carry frontier knowledge, he said, and they do it more effectively than documents. Novel findings travel through networks because, at the frontiers of knowledge, much that is important is intangible (e.g., physical skills). Oppenheimer, Owen-Smith recalled, famously said the best way to send knowledge is to wrap it up in a person. Owen-Smith suggested that shifting from talking about tracking ideas to talking about tracking people changes the meaning of privacy, confidentiality, and trust—particularly for academic communities. Risks follow people, he said, but so do opportunities.
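As a purely illustrative sketch of the kind of people-centered measurement Owen-Smith describes, the snippet below counts directed flows of researchers between institutions from career-history records. All names, institutions, and data are invented, and real mobility analysis would draw on linked administrative data rather than a toy list.

```python
from collections import Counter, defaultdict

# Hypothetical career records: (researcher, year, institution).
# These names and affiliations are illustrative, not real data.
records = [
    ("a", 2018, "Univ X"), ("a", 2021, "Lab Y"),
    ("b", 2019, "Univ X"), ("b", 2022, "Univ Z"),
    ("c", 2020, "Lab Y"),  ("c", 2023, "Univ Z"),
]

def mobility_flows(records):
    """Count researcher moves between institutions as directed edges."""
    by_person = defaultdict(list)
    for person, year, inst in records:
        by_person[person].append((year, inst))
    flows = Counter()
    for stints in by_person.values():
        stints.sort()  # order each career chronologically
        for (_, src), (_, dst) in zip(stints, stints[1:]):
            if src != dst:
                flows[(src, dst)] += 1
    return flows

flows = mobility_flows(records)
# Aggregating edges like these yields the "stocks and flows" view:
# here, two researchers leave "Univ X" and two arrive at "Univ Z".
```

The directed-edge representation is the key design choice: it treats people, not documents, as the carriers of know-how, so inflows and outflows of expertise by institution and field fall out of a simple aggregation.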
Owen-Smith suggested that the research community be engaged in discussions about measurement and approach it as a research problem. Engagement with the research community will “help make sure you get it right. As you do that, you build trust because there is some transparency.”
Owen-Smith said there are mechanisms and model infrastructure to support assessment efforts. For example, the Institute for Research on Innovation and Science6 (IRIS) has gathered restricted transaction-level data from universities around the country on direct cost expenditures of every sponsored project from every funder (i.e., all federal agencies and foreign and domestic foundations and corporations). These data are linked to outcome information and to data from usaspending.gov that describe grants
___________________
6 IRIS is “a member consortium of universities anchored by an IRB [Institutional Review Board]-approved data repository hosted at the University of Michigan’s Institute for Social Research. . . . IRIS collects record-level administrative data from its members to produce a de-identified dataset for research and reporting that will improve our ability to understand, explain, and improve the public value of research.” See https://iris.isr.umich.edu/about.
funded by federal agencies. IRIS also utilizes partnerships with statistical agencies like the U.S. Census Bureau and the National Center for Science and Engineering Statistics and links to resources such as the Survey of Earned Doctorates7 and the Longitudinal Employer-Household Dynamics Dataset.8 Owen-Smith suggested that these data could be used to evaluate research security activities at universities. Using such a mechanism, he said, it would be possible to examine risk assessments conducted on past proposals, projects, or employees and conduct a retrospective assessment. Such an assessment could help elucidate how various levels of risk played out under different policies, in different fields, and in different conditions. The Federal Statistical Research Data Center system, operated by the U.S. Census Bureau, could support evaluations of the effectiveness of research security measures.
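The kind of record linkage IRIS performs can be illustrated, in heavily simplified form, as a join of transaction-level expenditure rows to grant metadata on a shared award identifier. The field names, identifiers, and records below are hypothetical and are not IRIS’s actual schema.

```python
# Hypothetical expenditure rows keyed by an award identifier.
expenditures = [
    {"award_id": "FAIN-001", "category": "personnel", "amount": 120_000},
    {"award_id": "FAIN-001", "category": "equipment", "amount": 45_000},
    {"award_id": "FAIN-002", "category": "personnel", "amount": 80_000},
]

# Hypothetical grant metadata, as might be drawn from a public awards source.
grants = {
    "FAIN-001": {"agency": "NSF", "field": "quantum sensing"},
    "FAIN-002": {"agency": "DOE", "field": "materials"},
}

def link(expenditures, grants):
    """Attach grant metadata to each expenditure row (an inner join)."""
    linked = []
    for row in expenditures:
        meta = grants.get(row["award_id"])
        if meta is not None:  # drop rows with no matching award
            linked.append({**row, **meta})
    return linked

linked = link(expenditures, grants)
```

Once spending rows carry funder and field labels, outcome data (publications, patents, employment records) can be joined the same way, which is what makes retrospective evaluation of research security policies conceivable.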
Owen-Smith proposed a framework for evaluation. The first step would be mining bibliometric and proposal text from universities or from agencies. He noted that the CHIPS and Science Act includes a provision that requires all federal science funding agencies to send proposal data to the National Center for Science and Engineering Statistics. Owen-Smith suggested that if that information could be sent with, for instance, risk assessments, the material could be accessed for evaluative purposes.
Owen-Smith suggested that it might be possible to track leakage of IP by, for example, monitoring tacit knowledge flow from a leading group to others. While he acknowledged that there may be instances where discoveries are made simultaneously in different places, “if you can establish a baseline for a field, you can see when you have something like excess simultaneity and you may see more simultaneous discoveries from new places and . . . could validate that retrospectively against known risks or investigations in partnerships” with, for example, federal agencies. It may also be possible to track PI career paths or measure shifts in knowledge from academic to commercial spaces. He also noted that research data could be made available and their use incentivized via partnerships and mechanisms such as the NSF RoRS program.
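Owen-Smith’s notion of “excess simultaneity” against a field baseline could, under many simplifying assumptions, be operationalized as a simple anomaly test: flag periods in which the count of near-simultaneous discoveries is improbably high relative to the field’s historical rate. The sketch below assumes event counts follow a Poisson distribution; the baseline rate, counts, and significance threshold are all hypothetical.

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam), via the complement of the CDF."""
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

def excess_simultaneity(observed, baseline_rate, alpha=0.01):
    """Flag years whose count of near-simultaneous discoveries is
    improbably high under the field's historical baseline rate."""
    return {year: count for year, count in observed.items()
            if poisson_sf(count, baseline_rate) < alpha}

# Hypothetical counts of simultaneous-discovery events per year in one
# field, measured against a historical baseline of ~1 event per year.
observed = {2020: 1, 2021: 0, 2022: 2, 2023: 7}
flagged = excess_simultaneity(observed, baseline_rate=1.0)
# Only 2023 is flagged: seven events is far outside a rate-1 baseline.
```

A flagged year would be a prompt for retrospective validation against known investigations, as Owen-Smith suggests, not evidence of leakage by itself.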
Panelist Amanda Ferguson (Huron Consulting) discussed the work of her consulting firm on research administration, compliance, and research security issues. She observed that effectiveness and success in research security “have different, though not entirely conflicting definitions across the perspectives of federal funders, researchers, academic institutions, and the
overall R&D enterprise.” She suggested that it is important to consider the perspectives of each when considering how to assess the effectiveness of research security efforts.
“We have at least four perspectives to consider: we have researchers; we have institutions that conduct research; we have federal funding agencies and the U.S. R&D enterprise. Effectiveness and success to me have different, though not entirely conflicting, definitions across the perspectives.”
Amanda Ferguson
Ferguson said that researchers want to do the right thing but do not want to be burdened by compliance requirements, including those that may impact their ability to collaborate with global peers and generate high-impact research. Institutions, she said, want to understand where risks are most likely to be present so they can take a risk-based approach to resource allocation and compliance. They also wish to be agile in securing funding for research.
Ferguson said that funding agencies are looking to steward taxpayer dollars, safeguard investments in research, promote U.S. military and economic competitiveness, and promote and protect science and innovation. The U.S. R&D enterprise does not want to be left behind, become isolated, or lose its ability to pursue and attract the brightest global minds. There is, however, a need to balance openness and security. Ferguson said that, while many policies support openness and transparency, other policies have created compliance burdens that do not have the desired protective effect. Some might put a chill on global collaboration to the point where the United States is no longer the premier destination for innovative science.
Ferguson said investment in U.S. R&D has decreased relative to the investments of some foreign nations. For example, according to an April 2025 publication of the American Association for the Advancement of Science, China has outpaced the United States in highly cited publications—a metric that can be used to approximate innovative findings.9 Slippage is particularly evident in fields considered emerging technologies, such as AI, quantum, and semiconductors, which are critical components of maintaining a military and economic advantage. Larger factors at play in the
___________________
9 See Zimmermann, A. 2025, April 30. U.S. R&D and Innovation in a Global Context: The 2025 Data Update, https://www.aaas.org/sites/default/files/2025-04/AAAS%20Global%20RD%20Update%202025_final.pdf.
U.S. R&D environment will make understanding the impacts of research security requirements difficult.
Ferguson said that funding agencies often look at publications and affiliations as proxies for trustworthiness. Some affiliations and collaborations can be problematic, but the nature of academia is such that publication measures of collaboration (such as co-authorship) do not always reflect the types of collaborations that lead to licit or illicit transfers. Furthermore, analysis of collaborations requires such significant resources for institutions and funding agencies that it can dissuade researchers from principled collaborations and might dampen innovation.
Ferguson said that the 2024 JASON report offers solutions for safeguarding the research enterprise. The report states that “risk mitigation must consider the spectrum of risk and be adaptable to changing trends in research. Resources should be concentrated on areas of maximum risk to ensure that the benefits outweigh the costs.”10 Ferguson suggested that it would be helpful to maintain lists of emerging technologies and have experts within the government who can advise funding agencies on, for example, the importance of those technologies to adversaries. Ferguson suggested that universities could conduct proactive open-source due diligence if they had more detail about what programmatic areas they should focus on, rather than broad categories such as machine learning. This knowledge could also help researchers understand how their collaborations might be perceived by the government and stimulate them to provide information to funding agencies proactively; this would, in turn, facilitate discussions about risk.
Ferguson noted that the 2019 JASON report11 discussed technology readiness level (TRL) as an indicator by which to assess the necessity of implementing controls to effectuate national security.12 She suggested that
___________________
10 See JASON. 2024, March 21. Safeguarding the Research Enterprise. JSR-23-12. The MITRE Corporation, March 21, 2024, https://nsf-gov-resources.nsf.gov/files/JSR-23-12-Safeguarding-the-Research-Enterprise-Final.pdf.
11 See JASON. 2019, December. Fundamental Research Security. JSR-19-2I. The MITRE Corporation, https://nsf-gov-resources.nsf.gov/files/JSR-19-2IFundamentalResearchSecurity-12062019FINAL.pdf.
12 TRLs, developed by NASA, are a systematic method for assessing the maturity of a technology. They range from 1 to 9, with 1 representing the earliest stage of basic principles and 9 a fully developed technology, and are used to evaluate the progress of a technology as it moves from basic research to a marketable product. See https://www.nasa.gov/directorates/somd/space-communications-navigation-program/technology-readiness-levels/.
TRL could be a useful metric to consider when evaluating success across perspectives. TRL can be assessed when a proposal is considered and regularly reevaluated via a standard progress reporting mechanism.
The session’s final panelist, Kristin West (COGR), said that metrics on the negative, unintended consequences of research security measures are critical as the United States moves from an open system to a more closed system. She noted, however, that context matters as much as the metrics themselves and that metrics on the success of research security efforts are also important.
West said that the 2024 JASON report emphasized the importance of researcher mobility. She noted that a recent Economist article said that the United States is experiencing an academic brain drain. It reported a “32% increase in applications from U.S. researchers to jobs outside the U.S. and a 25% decrease in applications from researchers outside the U.S. for jobs in the U.S.”13 This, West said, is a problem.
West said that it is important to look at areas of science where the United States is not in the lead and “we have to quit looking at the easy metrics like collaborations and co-authorships and who got funding from who as always being surrogate end points or proxies for bad behavior. They’re not.” They can be proxies for the beneficial transfer of information in the United States. Both metrics and the context of the metrics matter, West said.
West said that the use of “gotcha” metrics can have a chilling effect on research. Avoiding security incidents is a common goal, but incidents happen. It is important to analyze incidents in order to understand “what we did right, what we did wrong, what we should do better,” and no one wants to face the unintended consequences of unreported incidents. West suggested that it may be advisable to focus on developing a fair and just reporting culture akin to that in medicine, where health care professionals feel safe and empowered to report errors, near misses, and hazardous conditions without fear of blame or punishment. That system empowers medical professionals, no matter what their rank, to acknowledge a problem and not face retribution or punishment. West suggested that training on (or awareness of) “near miss” incidents would be useful—as would education on collaboration with researchers in countries of concern.
___________________
13 See The Economist. 2025, May 21. “America Is in Danger of Experiencing an Academic Brain Drain.” https://www.economist.com/science-and-technology/2025/05/21/america-is-in-danger-of-experiencing-an-academic-brain-drain.
McQuade suggested that the development of metrics be approached as a research problem. Owen-Smith said that such an exercise could be designed as an iterative process that encourages engagement and collaboration. The research community could be engaged in validating tools or in bringing expertise into the initiative. Owen-Smith suggested that a tracking system for collecting evaluation data on, for example, people flow would likely need to build on a research base that the community was involved in.
Caputo said that there will be a need to provide tools to the research community to aid in identifying risk and performing due diligence in reviews. Caputo suggested that asking researchers to do due diligence implies that “the goal is to ensure or at least have some confidence that you are being trustworthy, that you deserve that trust and are reciprocating that trust.”
Fox noted that many have commented on the need for a rheostat, the use (or inappropriate use) of proxies as flags for concerns that do not often pan out, and the notion that we can use lists of critical technology to help us focus these rheostats and to get away from these proxies. “I don’t know,” Fox said, “of any list of critical technologies where we’re behind that could inform university thinking about where these kinds of collaborations are in the safe zone and where the dial should be set higher.” West said that the Trusted Research Using Safeguards and Transparency (TRUST) framework14 will guide this type of work. The framework focuses on novel science and offers a better alternative to reviewing static lists of critical and emerging technologies.
Fox asked how to communicate to universities about research areas where collaboration is encouraged. Strouse said that working one-on-one with researchers and asking key questions can be valuable.
Strouse said that it is important to emphasize that “we’re not just protecting national security. We’re not just protecting economic security. We’re protecting the economic security of the United States. And all those things put together are really critical.”
West said that she has been involved in several studies related to compliance with research security measures. One survey estimated that, in the first year, it would cost a mid- to large-sized university an additional $445,000
___________________
14 The TRUST framework was developed to guide NSF in assessing grant proposals for potential national security risks. See https://www.nsf.gov/news/nsf-enhances-research-security-new-trust-proposal.
to comply with the disclosure requirements enumerated in Section 2233 of the FY2021 National Defense Authorization Act and NSPM-33.15
Ferguson said that, given the movement toward multidisciplinary and transdisciplinary work, there is a need to raise awareness about research security across the board. She suggested starting with something simple and faculty facing, offered once a year in a way that feels genuinely educational rather than just “take the training,” and then narrowing down to discipline-specific levels that target the most basic administrative level.
Kohler said that the Chinese government produces 5-year plans that discuss critical and emerging technologies; these can inform researchers as they consider research security issues, he suggested. However, he said, most researchers have not read these plans. Strouse added that many resources are available to inform researchers about technologies that may be at higher risk (e.g., the Annual Threat Assessment [ATA] of the Office of the Director of National Intelligence [ODNI]).16 He said that NIST IR 8484 also includes a list of research security resources.17 Biggs said that “we lock our cars in a mall parking lot, not because we expect someone to come break into them, but just in case someone’s going to.” He suggested that, as the PRC has created an academic system that promotes the theft of research and fraudulent research, everyone should have the opportunity—even history professors—to check into the people they are working with to see if they have a history of stealing research. Setting that baseline, he said, makes everyone’s lives easier.
___________________
15 The Act outlines the following disclosure requirements: federal research agencies must require as part of any application for R&D award that each covered individual listed on the application disclose the amount, type, and source of all current and pending research support received by or expected to be received by the individual at the time of disclosure. Furthermore, any entity applying for an award must certify that each covered individual who is employed by the entity and listed on the application has been made aware of these requirements (William M. [Mac] Thornberry National Defense Authorization Act for Fiscal Year 2021, Public Law 116-283, https://www.congress.gov/bill/116th-congress/senate-bill/4049).
16 The 2025 ATA is available at https://www.dni.gov/files/ODNI/documents/assessments/ATA-2025-Unclassified-Report.pdf.
17 See Strouse, G. F., Saundry, C., Wood, T., Bennett, P., and Bedner, M. 2023. Safeguarding International Science: Research Security Framework. NIST Internal Report 8484, https://doi.org/10.6028/NIST.IR.8484, pp. 57–58.