
Will quantum computing help AI?

QuSoft is excited to welcome a new member on board: Amira Abbas, a talented young researcher who did her PhD at the University of KwaZulu-Natal in South Africa and collaborated with IBM Research and Google Quantum AI. She recently joined QuSoft as a postdoctoral fellow. Amira brings to QuSoft her expertise in a field of research that combines two of the most groundbreaking technology topics: quantum computing and artificial intelligence.

With this interview we wish to welcome her to Amsterdam and to QuSoft.

 

Amira, you will receive your PhD degree in September 2023. Could you give us an idea of what you have been working on during this research period?

My PhD research focused on the intersection between artificial intelligence and quantum computing, trying to answer the question of whether quantum computing can help speed up machine learning or create more interesting models for artificial intelligence. The research started by trying to understand the power of quantum variational models (QVMs) compared to neural networks (which are employed, for example, by OpenAI in ChatGPT). Initially my research pointed to the affirmative, proving that QVMs are somewhat more powerful than neural networks under some assumptions.

Thereafter, I completed some work at Google, looking at what happens when we make the same models bigger. Will we be able to optimize them as well as we can optimize classical neural networks? It looks like we can’t in general. This is because of a somewhat natural barrier. With neural networks, one can store and reuse intermediate information in the function to compute things more efficiently, and hence train very large models. With quantum computing, we can’t store and read out information in the same way because tracking the value of the function is not very easy to do without destroying quantum information. This is due to the intrinsic way quantum mechanics works. Measurement destroys information and thus, we cannot simply reuse quantum resources after measuring.
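To make the contrast concrete, here is a minimal editorial sketch (an illustration assumed for this article, not taken from Amira's own work): training a tiny classical neural network caches intermediate values during the forward pass and reuses them during backpropagation, whereas reading out the analogous intermediate state of a quantum circuit would require a measurement that disturbs the state.

```python
# Illustrative sketch: a classical network stores and reuses intermediate values.
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
x, target = np.array([0.5, -1.0]), np.array([1.0])

# Forward pass: cache the intermediate quantities h_pre and h.
h_pre = W1 @ x
h = np.tanh(h_pre)
y = W2 @ h
loss = 0.5 * np.sum((y - target) ** 2)

# Backward pass: reuse the cached values instead of recomputing them.
grad_y = y - target
grad_W2 = np.outer(grad_y, h)                               # reuses h
grad_h = W2.T @ grad_y
grad_W1 = np.outer(grad_h * (1 - np.tanh(h_pre) ** 2), x)   # reuses h_pre

print(loss, grad_W1.shape, grad_W2.shape)
```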

 

This sounds very interesting, and we are curious how you would like to continue your research efforts based on the knowledge you have gained.

Just because we haven’t found a way of training these models, it doesn’t mean that this is the end. Maybe there is a special class of circuits that could still be efficient to train and perform well on machine learning tasks. A possible future direction is to narrow the scope and look for a specific class of QVMs that works. Another direction would be to use the framework of kernel methods to find such an interesting class of quantum models.

 

Could you explain in a nutshell what kernel methods are and why they are interesting as an alternative to variational methods?

In machine learning, people care about data, which typically comes in the form of classical vectors. It turns out that encoding these classical vectors into a quantum state through some basic operations that depend on the classical data corresponds to something called a feature map. A feature map literally means one transforms their data, mapping it from one space to another. In the context of machine learning, feature maps are super useful because sometimes data can be really difficult to cluster, group or separate, and applying a clever feature map can sometimes demystify the data. Since encoding data on a quantum computer is also a feature map, finding a clever encoding for data would be wonderful. These feature maps serve as the basis for kernel methods and kernel approaches. But naturally, finding a “good” quantum feature map is not so easy and there is a lot of active research in this direction.
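As a purely classical illustration (a sketch assumed for this article, not part of the interview), the snippet below shows how a feature map can turn hard-to-separate data into easily separable data, and how the inner product of mapped points defines a kernel. In the quantum setting, the feature map would instead be the data-dependent encoding into a quantum state.

```python
# Illustrative sketch: a feature map and the kernel it induces.
import numpy as np

def feature_map(x):
    """Map a 2D point (x1, x2) to (x1, x2, x1^2 + x2^2).
    Two concentric circles are not linearly separable in 2D, but the
    extra squared-radius coordinate separates them with a flat plane."""
    return np.array([x[0], x[1], x[0] ** 2 + x[1] ** 2])

def kernel(x, y):
    """Kernel value = inner product of the mapped points."""
    return feature_map(x) @ feature_map(y)

inner_point = np.array([np.cos(0.3), np.sin(0.3)])        # radius 1
outer_point = 3.0 * np.array([np.cos(1.2), np.sin(1.2)])  # radius 3
print(kernel(inner_point, inner_point), kernel(inner_point, outer_point))
```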

 

I am impressed to hear about the many conferences you are invited to. Any highlight?

Recently, I was invited to give a keynote talk at the Quantum Artificial Intelligence conference organized by the University of Naples. In November I will speak at the Quantum Techniques for Machine Learning meeting held at CERN. Moreover, I am looking forward to presenting my results at the regular QuSoft seminars.

 

Amira, as you know, artificial intelligence is a field of science that has had cycles of successes and failures. Because of the possible huge impact of the technology, the successful phases have raised huge hopes that often haven’t corresponded to real outcomes. Quantum computing in some sense carries similar dangers. What is your intuition about a field that combines them both?

I find this an interesting question, and I am in a state of transition myself in how I want to approach my research. After spending five years working in the field of quantum machine learning (QML), what I am learning about myself is that I would like to produce research based on rigorous statements. From a theoretical point of view, in the history of machine learning it was really hard to generate rigorous theory without insights gained from the success of large numerical studies. On the quantum side, we don’t have large computers to run experiments at scale. Therefore, it is extremely hard to make any rigorous statements in quantum machine learning. Importantly, these considerations constitute one of the main reasons why I wanted to join QuSoft. Here people’s expertise lies in complexity theory and learning theory, which is where I would like to go, since these frameworks offer the possibility of working on rigorous mathematical models.

 

We are very proud to have a new woman at QuSoft; as you know, the field lacks gender balance. What has it been like for you to be a woman in science so far?

My PhD co-supervisor is a woman and a great scientist who works on quantum machine learning. She helped me a lot and helped me overcome any insecurities I had being a sort of minority in the field. I think it is very important to have such role models, and I hope I can someday do the same for a future student.

 

 

More information

Amira Abbas is a postdoctoral research fellow at QuSoft (UvA). Her postdoctoral fellowship is funded by IBM, as part of a collaboration in the context of the Quantum Application Lab (QAL).
