Ewin Tang is studying quantum algorithms as a grad student at the University of Washington. (Photo courtesy Ewin Tang)

A lot of great discoveries were made while looking for something else. For University of Washington computer science grad student Ewin Tang, research into quantum computing showed that our regular old computers might be capable of much more than we once thought.

Tang’s discovery of a powerful new machine-learning algorithm for classical computers upended assumptions about computing challenges that were thought to require quantum computers. That discovery, made while Tang was studying machine-learning algorithms and quantum computing as an undergraduate at the University of Texas, has enormous implications for both of those fields.

Now enrolled in the UW’s Paul G. Allen School of Computer Science & Engineering as a graduate student at the age of just 18, Tang is continuing to research how quantum computing will impact machine learning. Just last week, two more papers were released showing that her breakthrough result extends to other types of machine-learning problems.

“We ended up getting this result in quantum machine learning, and as a nice side effect a classical algorithm popped out,” Tang said in an interview with GeekWire.

Quantum computing is one of the biggest Next Big Things on the tech horizon. It proposes to replace the binary system that powers computers old and new, in which information is represented by combinations of on and off switches, with a system built on quantum bits, which can occupy combinations of both states at once and therefore represent information in more than two ways.

That could lead to the development of extremely powerful computers that process information in ways we don’t yet fully understand, but quantum computing is hard. The earliest systems are extremely expensive, as are the specialists required to build and maintain them.

That means a lot of quantum computing research is focused on determining whether quantum computing algorithms will deliver the necessary “speedup,” as Tang puts it, over classical computing algorithms. She is talking about an exponential improvement in computing speed, an advantage so large it would be impossible to ignore and would become table stakes for the biggest computing companies of our time, such as the two that sit on opposite sides of Lake Washington in the Seattle region, Amazon and Microsoft.

D-Wave’s 2000Q quantum computer has to be refrigerated to near absolute zero (-459.67 degrees Fahrenheit) in order to work. (D-Wave Photo)

Tang proved that classical machine-learning algorithms working on recommendation problems — widely used across media and retail companies — were capable of far more than conventional wisdom held.

As explained by Quanta Magazine, an independent science publication, Tang demonstrated that sampling techniques used in a well-known quantum recommendation algorithm could be replicated in classical computers: “Tang’s algorithm ran in polylogarithmic time — meaning the computational time scaled with the logarithm of characteristics like the number of users and products in the data set — and was exponentially faster than any previously known classical algorithm.”
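To get a feel for why polylogarithmic scaling matters, here is a minimal illustrative sketch (not Tang’s actual algorithm; the constants and the squared-log cost model are made-up assumptions for illustration) comparing how a hypothetical polylogarithmic running time grows against a linear one as the data set gets larger:

```python
import math

# Illustrative only: a hypothetical polylogarithmic cost model
# (~ (log2 n)^2 steps) versus a linear cost (~ n steps), where n
# stands in for the number of users/products in the data set.
# The exponent 2 is an assumption chosen for the sketch.
for n in [10**3, 10**6, 10**9]:
    polylog_steps = math.log2(n) ** 2
    linear_steps = n
    print(f"n={n:>13,}  polylog ~ {polylog_steps:>6.0f} steps  "
          f"linear ~ {linear_steps:>13,} steps")
```

Even at a billion items, the polylogarithmic cost stays under a thousand “steps” in this toy model, while the linear cost grows with the data set itself. That gap is the kind of exponential separation the quote above describes.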

Quantum computing and machine-learning scholars immediately recognized the impact of the discovery.

Recommendation algorithms were once thought to be one of the easiest-to-understand applications for a quantum computer, and research had demonstrated that a quantum algorithm did indeed produce significantly faster results than the best classical computing algorithms. No one, however, had determined whether there was a way to use classical computers to get similar results, until Tang did.

Tang modestly describes her work as pulling on a bunch of different threads before reaching her conclusion, but such is the nature of important discoveries. “These weren’t pieced together before I noticed it,” she said.

Her discovery suggests that machine learning won’t be the killer app for early quantum computers, and that traditional methods of providing the computing power needed to back those algorithms will have a much longer shelf life than anticipated. Quantum computers will still enable huge computing breakthroughs in a variety of areas, from cryptography to geographic modeling, but will likely be too expensive to justify using in the field of machine learning in their early days.

Tang’s quantum computing research is very much theoretical, and she made a point to note that quantum computing research assumes a certain level of computing power that isn’t necessarily practical in the near future. Companies like D-Wave, Rigetti Computing, and IBM have released rudimentary quantum computers, but we’re very far away from a day in which quantum computers replace the regular old servers in data centers around the world.

But cloud companies are betting heavily on artificial intelligence research, and have already shown that they will spend billions on the technology that will best power that research. Understanding when that bet makes the most sense will be extremely important to companies like Amazon Web Services, Microsoft, and Google.

Tang, a theoretical researcher, isn’t comfortable predicting the timing of that future: “I wouldn’t say that one of my main research goals is pushing quantum computers into the mainstream,” she said.

Still, establishing the areas in which quantum computing will make a demonstrable difference in outcomes will be an extremely important field over the next decade, and Tang cited working with UW professor James Lee on these issues as a big part of the reason why she relocated from Texas to Seattle.
