Workshop planning committee member Benjamin F. Jones (Northwestern University) moderated a panel focused on the impact of research security policies and requirements on the research ecosystem, including potential hidden costs as policies are implemented. He acknowledged the seriousness of research security issues and the loss of ideas abroad, and asked whether our actions may be creating other challenges or slowing our own success.
The United States, Jones said, is in a race where we need to stay ahead in technology. He likened the current situation to a bike race peloton, where others draft behind the frontrunner to their advantage. While the United States has historically been in the lead, other countries “are kind of drafting and taking some advantage from the United States—pushing hard against the wind of discovery. . . . Maybe that’s not so great,” Jones said, because this is like handing “a free lunch . . . to those chasing us, particularly as they increase their geopolitical competition with the United States. On the other hand, if we put in policies that are trying to . . . prevent others from drafting on our leadership, we are also slowing down our own bike.” Furthermore, he said, if we put “all sorts of extra equipment on our bike [and make] it heavier and harder to pedal . . . we lose the race because we’re focused on the people behind us and forget to train and bike quickly ourselves.”
The first session panelist, Naomi Schrag (Columbia University), said that her institution has about 30,000 full- and part-time students and 4,800 faculty members. The university has students, faculty, and researchers representing 149 countries.
Schrag said that, over the past 5 years, Columbia has taken multiple actions to develop research security measures.
Communicating with the research community has been key, Schrag said. Columbia has communicated about research security through targeted or broadcast emails, town hall meetings, webinars, department chair meetings, departmental faculty meetings, and one-on-one consultations.
Schrag said that each piece of research security infrastructure created causes some friction for researchers and suggested that such friction may deter researchers from entering the kinds of international collaborations that are important for advancing innovation. As an example, she noted that collaborations may begin with an act as simple as finishing a manuscript, but that researchers must be attentive to whether, at that stage or some later stage, the collaboration may require disclosure to funding agencies or other steps. The administrative burden associated with such requirements may deter the kinds of collaborations that we need to keep our research ecosystem healthy and vibrant.
The second session panelist, Bhaven Sampat (Arizona State University), provided historical context for the workshop discussions. He said that World War II was the first time the United States invested in academic research other than agricultural research. This was necessitated by the need to confront the existential threat posed by Nazi Germany. In this period, scientific research for military purposes was coordinated by the Office of Scientific Research and Development (OSRD), led by Vannevar Bush. Although Bush and others were supportive of basic research and the free flow of knowledge, they understood the need to balance openness and security. While the free flow of information could speed the development of wartime technologies such as radar, access to information would also benefit U.S. adversaries.
To address security concerns, OSRD compartmentalized access to information. Scientists working on one project would not share information with scientists working on other projects. During this time, publication restrictions and patent secrecy orders were put in place.
Near the end of the war, President Roosevelt asked Bush to draft a blueprint for a postwar science and technology policy. In that document,1 Bush noted that even in a national security context, the free flow of knowledge, or open science, offers substantial benefits (e.g., facilitating scientific collaboration and advancing innovation).
Sampat discussed his recent work examining the impact of secrecy orders on innovation during the war. He has found that these orders successfully kept sensitive technologies out of public view for a period of time. However, they led firms to shift their research away from restricted categories. Sampat said that the follow-on effects on innovation in certain research areas persisted into the 1960s.
Panelist Theresa Mayer (Carnegie Mellon University) said that academic leaders need to do a better job engaging with faculty on research security and related challenges. In such discussions, she said, it is important to step back and recognize that this is a pivotal moment for the United States and the world. On top of the geopolitical competition, as the pace of innovation accelerates (in part because of AI), infrastructure needs are changing.
Mayer said the focus is often on the protection and distribution of sensitive information—but that we also need to consider the implications of restricting research to the point of hindering the efforts of the research community and science. The United States needs to both outpace and out-innovate adversaries.
Fostering an environment that enables and supports the inherent creativity and entrepreneurial nature of U.S. science is critical, Mayer said. Talent, structures, and culture play a significant role in this.
Mayer suggested that DOD has, over the past few decades, become increasingly risk averse in the research it funds. U.S. adversaries, on the other hand, are working quickly, taking risks, and learning from failure.
In considering the interplay between fundamental research, applied research, and controlled research, Mayer said that fundamental research
___________________
1 Bush, V. 1945. Science, the Endless Frontier: A Report to the President on a Program for Postwar Scientific Research. U.S. Government Printing Office.
provides the foundation for other types of research and sits at the frontier of learning and discovery. She pointed to the Defense Advanced Research Projects Agency’s (DARPA’s) research program as an example of a program that funds both controlled and uncontrolled research. Thinking about the coupling of different types of research may require different ways of teaming and different approaches to addressing scientific problems, Mayer said.
Mayer said that she has worked with faculty whose proposals to DARPA or the Advanced Research Projects Agency for Health have been flagged for security review. She said, “I think it is really important to say out loud that” security reviews create “extreme anxiety,” especially for international researchers. Faculty want to do the right thing, but “they are being flagged for joint publications or joint collaborations and simply do not understand going forward what is acceptable and what is not acceptable.”
The session’s final panelist, Susan A. Martinis (University of Illinois Urbana–Champaign [UIUC]), described her research security work and noted the importance of developing relationships and trust. She said that UIUC is working with faculty to create a culture of disclosure.
Martinis emphasized the importance of talent to innovation, national security, and competitiveness. She noted that talent has always been necessary for innovation, but that the U.S. talent pipeline is insufficient. “We are in a race,” she said. “We need a rapid increase in talent to meet that race.” Talent must be sophisticated and supported by investment—particularly in areas such as AI and hypersonics—and we must create a welcoming environment.
“We are in a race. . . . We need a rapid increase in talent to meet that race.”
Susan A. Martinis
Jones asked panelists to consider the issue of talent and performance measures in the context of national security objectives.
Martinis said that, beginning in the 1980s, students from China who came to the United States to study wanted to remain here; today they want to return home. She suggested that metrics could be developed to determine whether the U.S. research environment is welcoming and whether students from other countries wish to remain in the United States. Mayer said that there are discussions about providing green cards or other methods to support promising talent and provide a pathway to permanent residency in the United States. Martinis was not aware of any quantitative measures to assess talent.
Most international students come to the United States with the intention of staying to work in industry or academia—and we have been beneficiaries of this talent, Sampat said. He added that, in most fields today, research is collaborative and conducted with international partners. Almost every study he has seen suggests that interdisciplinary, cross-national collaboration has high impact. Measures to capture this type of collaboration could be useful, Sampat said.
Jones asked whether research security restrictions related to collaboration may dissuade talent from pursuing particular areas of research because they do not want to be subject to the additional scrutiny and whether this negatively impacts the research ecosystem. Schrag said that she has heard anecdotes to this effect, noting that anxiety around collaboration is high.
Martinis noted that NSF has developed algorithms that identify international collaborations. She suggested that collected data be mined to measure whether research collaborations have increased or decreased. Were these data aggregated at the university level, they could shed light on how collaboration has been affected by research security policies.
Sampat said there has been work examining the China Initiative’s effects on research and researchers.2 Philippe Aghion and others have suggested that the initiative negatively affected Chinese researchers with U.S. collaborators.3 Xie and colleagues have suggested that the China Initiative had a negative impact on U.S. researchers with Chinese collaborators.4 Scientific progress is cumulative, Sampat said. “If you close off the flow of information,” if you “throw too many wrenches in the chain—whether
___________________
2 The China Initiative was a U.S. Department of Justice program launched in November 2018. Its primary goal was to counter national security threats posed by the Chinese government, particularly with regard to economic espionage, trade secret theft, and violations of U.S. export controls and research integrity. The initiative was part of a broader U.S. strategy for addressing concerns about the Chinese Communist Party’s efforts to leverage open American institutions for technological and strategic gains.
3 See Aghion, P., Antonin, C., Paluskiewicz, L., Stromberg, D., Wargon, R., Westin, K., and Sun, X. 2023. Does Chinese Research Hinge on U.S. Co-Authors? Evidence From the China Initiative. CEP Discussion Papers (CEPDP1936). London School of Economics and Political Science. Centre for Economic Performance.
4 See Xie, Y., Lin, X., Li, J., He, Q., and Huang, J. 2023. Caught in the Crossfire: Fears of Chinese-American Scientists. Proceedings of the National Academy of Sciences of the United States of America 120(27):e2216248120. https://doi.org/10.1073/pnas.2216248120.
through patent or security restrictions—you’ll slow down the cumulative process of scientific progress.” He noted that effects may be disproportionate on some fields.
Schrag said that Columbia University conducts only fundamental research and its statutes do not allow researchers to enter into contracts that would grant a third party the right to censor or restrict the dissemination of research. A change from a more open to less open system could lead to further contraction of the university’s work. It would be a tremendous cultural change to revise university statutes that go back decades, Schrag said.
In response to a prompt about how to encourage researchers to comply with research security policies and procedures, Schrag said that expectations should be made clear. Transparency about what the agency does with disclosed information is incredibly helpful. DOD’s risk matrix clearly describes what the agency is looking for in reviewing current and pending support and biographical sketches. Schrag appreciated that the matrix illustrates to faculty when they need to take action to address potential risks and makes clear that they may not receive funding if action is not taken.
Sampat asked whether there is a way to roll out research security policies gradually in a manner that permits the evaluation of positive and negative effects, noting that doing so would require the objectives of research security efforts to be clear. He suggested that sunset clauses for research security requirements may be appropriate as these would trigger future reconsideration of whether the requirements are still necessary. Restrictions should be as narrow as possible, he said, focusing on specific funding streams or grants, for example. A one-size-fits-all approach to research security will not work for all institutions, Sampat said, given differences in environment, culture, research, and needs.
Mayer suggested that it is unrealistic to expect all faculty and students to remember all the regulatory details of research security policies. Practical examples of research security policies in action are useful, and one-on-one engagement on research security topics lowers anxiety. There is value in training videos, but they cannot take the place of one-on-one interactions.
Martinis emphasized that the research ecosystem needs to be dynamic and flexible, moving quickly if needed. We can move fast (as was shown during the pandemic), and there is a race. “We need,” she said, “to be able to really row really hard and pivot . . . very quickly” and we need a network to communicate with senior research security officers and faculty.
Humphrey said that other countries link research security and research integrity, which allows researchers to frame and consider both within the context of their own discipline. Martinis said that faculty understand research integrity and that much work has been done to develop policies and processes in the area. Schrag sees value in separating research integrity from research security, given research integrity’s traditionally narrower focus on potential research misconduct (i.e., falsification, fabrication, and plagiarism) and objectivity in research. She said that there is risk in expanding the definition of research security to include research integrity, as doing so suggests that the same infrastructure used, for example, to investigate a research misconduct allegation should be used in a research security scenario.
Mayer said that, conceptually, it could be useful to develop a community of practice for different research areas. Carnegie Mellon’s energetics program brought together faculty to discuss research security. The conversation was valuable because faculty were able to discuss feedback on grant proposals received from different parts of DOD.
Caputo asked about talent most at risk of leaving academia for venture capital opportunities. Martinis said that faculty are resilient and adaptable because they must constantly seek and apply for funding, but that graduate students, postdoctoral researchers, and assistant professors are most vulnerable given that their positions are typically less secure in the university system.
Jones suggested that universities are concerned about the physical sciences. He also suggested that the principal investigator (PI) model presents a challenge, given that a single PI is supported by postdoctoral and Ph.D. students and staff scientists. In such a structure, it is not possible for everyone to become a PI and, as a result, some leave academia.
Mayer said that, as we consider metrics, it may be difficult to isolate the effect of research security policies. There may be multiple factors influencing decisions to leave the U.S. academic system. We may be attributing the migration of academics to research security policies when, in fact, the movement represents the normal ebb and flow of early career researchers. She noted, however, that other countries are making investments in research that make them more attractive to researchers who historically would have come to the United States.
Discussion moved to research security in the context of universities that conduct research for industry. Schrag said that Columbia has a significant amount of industry-funded research. While agreements with industry acknowledge that Columbia is an academic institution with the mission to disseminate the results of research, the university agrees to the sponsor’s prepublication review of manuscripts for the sole purpose of enabling the sponsor to protect intellectual property (IP) it developed before the collaboration. If any of the sponsor’s preexisting IP is mistakenly included in a manuscript, the sponsor has the right to strike it. This is analogous to input versus output data in the export control context: the university might receive input data from outside the university that it protects with a technology control plan, but if the output of the research is not subject to export control, it would not be subject to restrictions.
Sampat said that universities recognize that narrow licensing of the results of research is not good for their institutions or the world, but that they have refined their review processes to improve their ability to distinguish between what should be broadly licensed and what should be narrowly licensed.
Murdick asked what DOD should prioritize when considering issues of research security, suggesting that award amount, research characteristics, and technology area might be considered. Schrag said that if a researcher intuitively understands that their research is important to, for example, national security, it is easier to move forward with compliance programming. Sampat said that funders should focus on particular funding streams.
Martinis and Mayer said that DOD’s risk matrix could be helpful in defining research security priorities. “The risk matrix has been valuable, bringing clarity and consistency and recognizing that [research security] is an evolving area,” Mayer said.
Martinis said that, when it became public that reviewers of National Institutes of Health (NIH) grant proposals were potentially inappropriately sharing proposal information with foreign entities, research security was elevated as an issue. Increased awareness of breaches could help to strengthen research security efforts and lead to the opening of communications channels between researchers and the government.
Mayer said that different offices within DOD have different definitions of research security. She called for consistency, particularly as there are researchers who may have research projects in several DOD program offices. It would be valuable to have DOD contracting officers and program managers available to discuss the scope of a research program and understand what part of the program carries restrictions.
Nichols asked how to assess the impact of research security initiatives and compliance requirements given a lack of metrics and long-term tracking data. Are conversations and convenings the right approach? Sampat said
that there is both a need for data and a need to invest in the development of measures of effectiveness. “The funders or society needs to tell us what we mean when we say national security. And then we can think about the right surrogate outcomes for it, the right long-run outcomes for it,” he said.
“The funders or society needs to tell us what we mean when we say national security. And then we can think about the right surrogate outcomes for it, the right long-run outcomes for it.”
Bhaven Sampat
Martinis added that pilot programs could be useful to support areas where there is little information about the impact of research security policies and processes. Pilot programs allow for quick reflection and can be redirected if they are not producing the desired result. Mayer said that regular convenings to collect feedback are useful for collecting in-depth information.
Mayer said that questions often come up about foreign influence and international engagement. It is often unclear, she said, who at DOD should address such questions.
Fox recounted that she saw a demonstration of an AI-enabled program, developed by a military command, to expedite foreign disclosure review. The tool allowed researchers to identify problematic proposal elements. Fox asked whether there is a role for technology in flagging risks. Martinis said that technology is a valuable tool: PIs spend about 40 percent of their time on compliance, and using technology to reduce that burden would be beneficial.
Jones added that DOD should communicate research security requirements to universities prior to issuing them. “If you end up in a situation where PIs spend 40 percent of their time on regs and compliance, that tells me that we have to work with them and think efficient compliance.”
Schrag said that Columbia has established cross-disciplinary research security working groups. The working groups meet regularly to discuss research compliance, technology transfer, sponsored projects, global travel, and international programs.
A member of the online audience asked about open versus closed science, noting that whether a system is open or closed has numerous implications for how universities evaluate faculty, scholarship, scientific credibility, research validation, research integration, and IP. Jones noted that an open system invites peer review and enables skepticism to weed out and challenge bad ideas. Schrag said that “the more open we are, the better prospect we have for regaining, promoting, and enhancing the public trust in what we do.”