Researchers have developed an algorithm that modifies classical machine learning techniques for use on quantum computers. Their approach enables training on quantum data rather than conventional data encoded as sequences of 0s and 1s. The team tested their method on simplified tasks and found that it performed as expected, paving the way for significant advancements in quantum-enhanced machine learning.
“Quantum machine learning is a field that exploits the power of quantum computing to enhance machine learning techniques and improve computational efficiency,” Yudai Suzuki, one of the authors of the study, said in an email. “By leveraging quantum-mechanical properties such as entanglement, quantum machine learning has shown a potential to outperform conventional machine learning models.”
Quantum computing meets machine learning
Machine learning is a powerful tool that allows computers to analyze input data, recognize patterns, and make predictions without explicit programming. It has found applications across many fields, from facial recognition and natural language processing to drug discovery and material science.
Traditional machine learning algorithms work by identifying relevant features in data, progressively improving accuracy through repeated training cycles. While these techniques have revolutionized computing, they have so far been confined to classical computers, which process information sequentially using bits that represent either 0 or 1.
Quantum computers, by contrast, operate on fundamentally different principles. They use quantum bits — qubits — which, unlike classical bits, can exist in superpositions of 0 and 1 simultaneously. This allows them to perform multiple calculations in parallel, theoretically offering an enormous speed advantage for certain types of computations. Additionally, qubits can be entangled, meaning the state of one qubit is directly correlated with that of another, regardless of the distance between them. These unique properties open the door to a new kind of machine learning that could surpass classical techniques.
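The states described above can be sketched concretely with ordinary linear algebra: a qubit is a 2-dimensional complex vector, superpositions are weighted sums of basis states, and entanglement shows up as correlations in the joint state of two qubits. The snippet below is an illustrative sketch using numpy, not code from the study.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as 2-dimensional vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# A superposition: the state (|0> + |1>) / sqrt(2).
# Measurement probabilities are the squared amplitudes: 50% for 0, 50% for 1.
plus = (ket0 + ket1) / np.sqrt(2)
plus_probs = np.abs(plus) ** 2  # [0.5, 0.5]

# Two entangled qubits: the Bell state (|00> + |11>) / sqrt(2).
# The four amplitudes correspond to the outcomes 00, 01, 10, 11.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
bell_probs = np.abs(bell) ** 2  # [0.5, 0.0, 0.0, 0.5]
```

The Bell state's probabilities illustrate the correlation described above: the outcomes 01 and 10 never occur, so measuring one qubit immediately determines the result for the other.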
Moreover, quantum computers can work directly with quantum states — highly abstract mathematical representations that contain complete information about a quantum system. This capability is particularly useful for tasks such as simulating quantum phenomena, where encoding data as quantum states provides a more natural and efficient approach.
Since quantum computers process information differently, classical algorithms must be adapted to function efficiently in a quantum environment. The new study, published in Advanced Quantum Technologies, addresses this challenge.
More specifically, the researchers' approach targets feature selection, a key step in machine learning where the algorithm determines which parts of the input data are most relevant to making accurate predictions. In the context of quantum computing, this means identifying meaningful information within a quantum state.
“To ensure effective performance in machine learning tasks in general, identifying meaningful and informative features is crucial,” explained Suzuki. “This principle also applies to quantum machine learning, and several proposals have explored feature selection in this context. However, existing methods are limited to classical inputs, whereas quantum machine learning can also process quantum data. To bridge this gap, we propose a new feature selection scheme that applies to both quantum and classical data.”
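For readers unfamiliar with the classical version of the idea, feature selection can be as simple as scoring each input feature by how strongly it relates to the target and keeping the best-scoring ones. The sketch below uses absolute correlation as the score; it is a deliberately simple classical stand-in for illustration only, not the light-cone scheme proposed in the paper, and the function and data names are hypothetical.

```python
import numpy as np

def select_features(X, y, k):
    """Rank features by absolute correlation with the target and keep the top k.

    A minimal classical illustration of feature selection; the paper's
    quantum-data scheme is substantially more involved.
    """
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]

# Toy data: feature 0 drives the target, features 1-3 are pure noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)
selected = select_features(X, y, k=1)
```

On this toy data the scheme correctly singles out feature 0. The gap the authors address is that such scores are defined for classical inputs; their contribution is a scheme that plays the analogous role when the input is itself a quantum state.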
By integrating quantum mechanics with principles from traditional computer science, the researchers developed a strategy that maximizes efficiency at each stage of machine learning, ensuring that only the most relevant features are selected.
“One of the most significant findings is that our scheme can find relevant and important features even for quantum data tasks,” Suzuki said. “We numerically validated its effectiveness on some tasks. To the best of our knowledge, this is the first work to propose a feature selection scheme applicable to quantum data. These results indicate the potential that such quantum machine learning-oriented feature selection could be practically useful.”
Overcoming challenges and looking ahead
While the new algorithm marks a significant step forward, implementing quantum machine learning in real-world applications still faces practical challenges. One limitation is the computational cost associated with post-processing quantum data using classical resources, which could grow rapidly as system size increases. Another concern is the impact of noise in quantum hardware, which can introduce errors in calculations.
So far, the team has tested their method on simplified problems that mimic aspects of real-world quantum systems but are significantly less complex. The next step will be to apply the algorithm to more realistic scenarios, including analyzing experimental quantum data and testing it on larger quantum devices.
“Addressing the challenge mentioned above is an important direction for future work,” Suzuki concluded. “Additionally, applying our scheme to physics-related applications, particularly those involving quantum data — such as quantum state classification — would be an interesting avenue to explore.”
Reference: Yudai Suzuki, Rei Sakuma, Hideaki Kawaguchi, Light-Cone Feature Selection for Quantum Machine Learning, Advanced Quantum Technologies (2025). DOI: 10.1002/qute.202400647
Feature image credit: geralt on Pixabay