Quantum Seed Grants Are Funding Solutions to Real-World Problems

'We are realizing a faster pace of quantum innovation—and advancing our state’s role as a leader in quantum science and technology'


What do underwater navigation, drug safety, and air traffic control have in common? Each creates challenges that quantum science and technology could solve.  

In Connecticut, the unique public-private partnership QuantumCT is accelerating research to meet those challenges head-on—and to position Connecticut as a global quantum technology hub.   

As of this spring, nine Connecticut-based research groups have received one-year seed grants for exploratory quantum projects. Each project aims to tackle a “challenge problem” issued by corporate partners in the state, like the need to develop algorithms that simulate molecular drug actions in the body, or to invent exquisitely accurate but hardy sensors that work in extreme environments with little power.  

In other words, the projects are directly relevant to Connecticut industries, including aerospace, biotech, and life sciences. This practical approach to science is called use-inspired research.  

“These grants are fertilizing creative, potentially transformative projects in quantum science and technology across several key industries, all of which are central to Connecticut’s present and future economy,” says Michael Crair, Vice Provost for Research and William Ziegler III Professor of Neuroscience and Professor of Ophthalmology and Visual Science at Yale University.  


The seed grants are funded by the University of Connecticut and Yale University and distributed via QuantumCT. Research results will help QuantumCT plan long-term research eligible for competitive funding through the National Science Foundation’s Regional Innovation Engines (a program established through the 2022 CHIPS and Science Act).  

The aim of QuantumCT is to make Connecticut a global destination for quantum education, job training and equitable job growth, research innovation, and industry excellence. 

“Because large universities and industry in Connecticut have joined forces, sharing resources and expertise under the QuantumCT umbrella, we are realizing a faster pace of quantum innovation—and advancing our state’s role as a leader in quantum science and technology,” says Pamir Alpay, Vice President for Research, Innovation, and Entrepreneurship and Board of Trustees Distinguished Professor of Materials Science and Engineering at the University of Connecticut. “The seed grants will fuel not only quantum discovery but also career opportunities in a high-demand STEM field.”  

Each project is collaborative, bringing together researchers from UConn, Yale, and industry partners.  

“The projects foster interactions among a range of researchers from faculty to students to industry scientists, allowing them to pool their knowledge and creativity at top-of-the-line laboratory facilities in Connecticut,” Alpay notes. “These collaborations also offer rising quantum scientists a look at potential career paths in industry.” 

Advanced sensing 

Airplanes, ships, and other vehicles rely on sensors for accurate navigation. But current sensor technologies have important limitations, and five of the project teams are working to develop better ones.  

In one, led by Charles Ahn and Alexander Balatsky—physics professors at Yale and UConn, respectively—the aim is to develop a robust, highly sensitive radiofrequency (RF) sensor that outperforms state-of-the-art directional sensors. To do this, the team is studying how electromagnetic waves interact with atom-sized magnets.  

“We incorporate magnetic atoms on thin films on the nanoscale,” says Dung Vu, a Yale postdoctoral associate on the team, which also includes collaborators from RTX Technology Research Center (RTRC), the research arm of RTX and its three businesses Collins Aerospace, Pratt & Whitney, and Raytheon.  

“By changing properties such as the energy and polarization of the light we shine onto the film, we can manipulate the quantum magnets’ properties, then measure the change in the magnetic field around them when they interact with light,” Vu says.  

The devices, Vu explains, can be used to make RF sensors that may be useful for airborne and autonomous vehicles. 

Another team is developing innovative fiber sensors for a magnetically aided inertial navigation unit that can complement a global navigation satellite system (GNSS). With its extraordinary sensitivity, this technology is designed to operate in environments like the deep ocean and underground, where GNSS signals are unavailable or can be jammed, spoofed, or otherwise unreliable. Electrical engineers Faquir Jain and John Chandy of UConn and Fengnian Xia of Yale are behind that effort. 

“The sensor can detect ultra-low magnetic fields that help with navigation with very low power consumption and cost,” Jain says. 

A key challenge for next-generation magnetic sensors is to limit the devices’ SWaP (size, weight, and power consumption). Currently, the best ones must be cooled with liquid helium. A seed project by assistant professors Yu He of Yale (applied physics) and Pavel Volkov of UConn (engineering) is pursuing sensors cooled with liquid nitrogen—a much more user-friendly substance.  

“The results will form one pillar for the eventual theory-experiment-industry collaboration,” Volkov says. 

Highly accurate sensors are vulnerable to minuscule errors and noise in the data. A team led by Yale engineering professor Hong Tang is building ultra-thin silicon nitride microwheels to create a tough, low-SWaP sensor whose round shape is designed to reduce error.  

Meanwhile, Yale associate professor of physics and applied physics Peter Rakich is developing a technique to attach microscopic mirrors to the end of silica fibers, creating a tiny, high-finesse device called a resonator. This resonator should allow precise control of quantum particles, like the ability to couple light particles with ions. That could advance not only sensors, but also quantum computers and networking. 

Computing revolutions 

With quantum technologies poised to revolutionize computing, many industries stand to benefit. 

Quantum entanglement, the eerie phenomenon by which two particles are linked as if they were one, is central to quantum computing and a major reason why the technology is expected to deliver vast improvements. In fact, entanglement can be considered a key resource in quantum computing, and as with any resource, there are better and worse ways of distributing it. Leandros Tassiulas and Shan Zuo, electrical engineering faculty at Yale and UConn, respectively, are studying how quantum computing systems can generate entanglement across multiple users in an equitable way.  
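The notion of entanglement as a shared, finite resource can be made concrete with a toy allocator. The sketch below is purely illustrative and not the team's actual scheduling policy: it assumes a network that can create a limited number of entangled pairs per time slot and hands them out round-robin among requesting user pairs, one simple notion of an equitable allocation.

```python
from collections import Counter
from itertools import cycle

# Toy illustration (not the researchers' method): a network produces a
# limited number of entangled pairs per time slot; round-robin allocation
# among requesting user pairs is one simple notion of "equitable."

def allocate(requests, pairs_available):
    """Hand out entangled pairs one at a time, cycling over requesters."""
    grants = Counter()
    for user_pair in cycle(requests):
        if pairs_available == 0:
            break
        grants[user_pair] += 1
        pairs_available -= 1
    return grants

# Three user pairs share 7 entangled pairs:
# A-B gets 3; A-C and B-C get 2 each.
print(allocate(["A-B", "A-C", "B-C"], 7))
```

Real policies must also weigh factors this sketch ignores, such as entanglement decay over time and differing link qualities between users.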

Air traffic controllers, delivery route planners, and factory managers are among the many workers who face optimization problems: how to make actions most efficient. But optimization problems can be fiendishly difficult to solve, especially when there are hard constraints, like the need for airplanes to avoid no-fly zones, trucks to refuel, or machines to complete tasks in a certain order. Like classical computers, quantum computers can use heuristics to tackle such problems, but hard constraints make them especially challenging.  
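One standard way to handle a hard constraint in quantum-friendly formulations (such as QUBO problems fed to samplers or QAOA-style algorithms) is to fold it into the objective as a penalty term large enough to outweigh any gain from violating it. The seed project's actual algorithms are not described in this article; the toy sketch below, with a brute-force search standing in for the quantum solver, just illustrates the penalty idea.

```python
from itertools import product

# Toy problem (illustrative only): choose which of 3 delivery jobs to
# take. Maximize profit, subject to a HARD constraint: jobs 0 and 1
# conflict and cannot both be chosen. Quantum optimizers typically
# absorb such constraints into the objective as penalty terms.

profits = [5, 4, 3]   # profit for taking each job
PENALTY = 100         # must outweigh any possible gain from violating

def cost(bits):
    """Negative profit, plus a penalty whenever the constraint is violated."""
    profit = sum(p * b for p, b in zip(profits, bits))
    violation = bits[0] * bits[1]   # 1 only if jobs 0 AND 1 are both chosen
    return -profit + PENALTY * violation

# Brute force stands in for the quantum sampler on this tiny instance.
best = min(product([0, 1], repeat=3), key=cost)
print(best, -cost(best))  # (1, 0, 1) 8 -- jobs 0 and 2, profit 8
```

The catch, and part of what makes the research hard, is that penalty weights must be tuned carefully: too small and the solver cheats the constraint, too large and the optimization landscape becomes difficult to search.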

A joint Yale-UConn team led by Yale physics professor Steven Girvin is exploring whether new algorithms could help quantum computers handle hard constraints on optimization problems. The research should also have relevance to problems like portfolio optimization and risk assessment that frequently arise in domains like finance and insurance, supply-chain logistics, and flight route planning, according to Amit Surana, an RTRC researcher working with the team.  

“The value proposition is that even slight improvements to logistics, even by half a percent, can mean huge savings,” Surana says.  

Progress in life sciences 

Computing is also a focus of two teams led by Yale chemistry faculty members Victor Batista and Tianyu Zhu, which are exploring quantum solutions to problems in drug development.  

One complex challenge researchers face is efficiently identifying drugs that will bind tightly to the intended receptor. Zhu and Batista, with partners at UConn, including physics professor Lea Ferreira Dos Santos and representatives of Mirion Technologies and Boehringer Ingelheim, are developing algorithms that run on quantum computers to tackle this task. 

Drug safety, too, might be improved by quantum computing. New drugs must be rigorously checked for possible toxic side effects on the heart, liver, and immune system. As part of a de-emphasis on animal testing, the industry has been studying the use of classical computing tools like machine learning and artificial intelligence to evaluate possible side effects. But quantum computing techniques remain relatively unexplored.  

So, another group working with Zhu and Batista is developing algorithms that use toxicology data to predict the safety of drug candidates. They are studying a hybrid approach in which a classical computer does a first check for toxicity, then drug candidates that pass that test undergo a further check by a quantum algorithm. Such a hybrid quantum-classical approach is a new and potentially highly effective way to do machine learning. Project partners include UConn professor Bodhisattva Chaudhuri and researchers with Novartis and Pfizer. 
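The two-stage idea can be sketched in a few lines. Everything below is a hypothetical stand-in, with placeholder scoring functions and made-up thresholds, since the project's actual models are not described in the article; the point is only the pipeline shape, where a cheap classical filter runs first and survivors get the costlier quantum check.

```python
# Sketch of a hybrid two-stage toxicity screen. The scoring functions
# and thresholds here are hypothetical placeholders, not the project's.

def classical_toxicity_score(candidate):
    # Stand-in for a trained classical ML model; higher means riskier.
    return candidate["clogp"] / 10.0

def quantum_toxicity_score(candidate):
    # Stand-in for a quantum (or quantum-simulated) algorithm's output.
    return candidate["qscore"]

def hybrid_screen(candidates, classical_cutoff=0.5, quantum_cutoff=0.5):
    """Cheap classical filter first; survivors get the quantum check."""
    survivors = [c for c in candidates
                 if classical_toxicity_score(c) < classical_cutoff]
    return [c for c in survivors
            if quantum_toxicity_score(c) < quantum_cutoff]

drugs = [
    {"name": "A", "clogp": 2.0, "qscore": 0.2},   # passes both stages
    {"name": "B", "clogp": 8.0, "qscore": 0.1},   # fails classical stage
    {"name": "C", "clogp": 3.0, "qscore": 0.9},   # fails quantum stage
]
print([d["name"] for d in hybrid_screen(drugs)])  # ['A']
```

This staged structure is also what makes the swap-in experiments described below natural: either stage can be replaced independently to measure where quantum components help.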

With this method, says Anthony Smaldone, a graduate student in the Batista lab, “we can remove drugs that are highly likely to fail in testing. Then we don’t have to rely on animal testing so heavily.” 

The hybrid method allows for tinkering that should help researchers determine where quantum computers offer efficiency gains, Smaldone explains. 

“We can slowly change our hybrid models, taking out classical components and putting quantum components in, and see what works and what doesn’t,” he says. “Hopefully, as we’re putting in these quantum components, we can start to see quantum advantages in doing so.”  

Currently, Smaldone says, the team is working with simulations only. Real-world success will have to wait for certain types of hardware and algorithms to catch up. “But this shows the first theoretical framework to do this efficiently,” he says.