Hi there,
I am a quantum algorithm researcher at one of the large startups in the field, and I have a couple of comments: one to back up the conclusion on ML for DFT, and another to push back a bit on the quantum computing end.
For the ML-for-DFT point: a year and a half ago we tried (code here) to replicate and extend the DM21 work, and despite some hard work we failed to get good accuracy when training ML functionals. Now, this could be because I was dumb, or lacked sufficient data or compute, but we mostly concluded that it was unclear how to make ML-based functionals work.
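For context on what "training ML functionals" means here: DM21-style work replaces the hand-designed exchange-correlation functional in DFT with a neural network evaluated on local density features, trained against high-accuracy reference data. A minimal sketch of the idea (the features, architecture, and training setup below are my own illustrative simplifications, much cruder than DM21's):

```python
# Minimal sketch of a learned exchange-correlation (XC) functional.
# Features, architecture, and data here are illustrative simplifications.
import torch
import torch.nn as nn

class LocalXCFunctional(nn.Module):
    """MLP mapping local density features to an XC energy density per electron."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, rho, grad_rho_norm):
        feats = torch.stack([rho, grad_rho_norm], dim=-1)
        return self.net(feats).squeeze(-1)  # eps_xc at each grid point

def xc_energy(model, rho, grad_rho_norm, weights):
    # E_xc = sum_g w_g * rho_g * eps_xc(g) on a numerical integration grid
    return (weights * rho * model(rho, grad_rho_norm)).sum()

# One toy training step against a reference energy (stand-in for high-accuracy data).
model = LocalXCFunctional()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
rho = torch.rand(1000)         # toy densities on 1000 grid points
grad = torch.rand(1000)        # toy density-gradient norms
w = torch.full((1000,), 1e-3)  # toy quadrature weights
e_ref = torch.tensor(-1.0)     # placeholder reference XC energy

opt.zero_grad()
loss = (xc_energy(model, rho, grad, w) - e_ref) ** 2
loss.backward()
opt.step()
```

A sketch like this hides the hard parts; per the experience above, actually getting good accuracy out of the training was the sticking point.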
On the other hand, I feel this paragraph is a bit less evidence-based:
Quantum computing was basically laughed off as overhyped and useless. M said it would have no effect on the field “in my lifetime” (he’s around 50). N said it was “far away”, and G said it was “hype” and that it is “absolutely not true” that QC materials science will happen soon. They were very disdainful, and this matches what I’ve heard through the grapevine from the wider physics community: there is a widespread expectation that quantum computing is in a bubble that will soon burst.
I think there are genuine reasons to believe QC can become a pretty useful tool once we figure out how to build large-scale fault-tolerant quantum computers. In contrast to logistics, finance, optimization, etc., which are poor target areas for quantum computing, materials science is where (fault-tolerant) quantum computing could shine brightest. The key reason is that we could numerically integrate the Schrödinger equation for large systems with polynomial scaling in the system size and polylogarithmic cost in the (guaranteed) precision, without the vast majority of the approximations needed in classical methods (I sketch what this scaling means right after the list below). I would argue that the following papers represent roughly the state of the art on the quantum algorithms we may be able to run:

Quantum simulation of realistic materials in first quantization using non-local pseudopotentials.

Faster quantum chemistry simulations on a quantum computer with improved tensor factorization and active volume compilation.

Quantum simulation of exact electron dynamics can be more efficient than classical mean-field methods.
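To unpack the scaling claim: for simulating time evolution under a Hamiltonian given as a linear combination of unitaries with coefficient 1-norm λ, qubitization-style algorithms achieve a query complexity that is only polylogarithmic in the target precision ε. (This is a generic statement about that algorithmic framework, not a result specific to the papers above.)

```latex
% Query complexity of time evolution via qubitization (Low & Chuang):
% additive in \lambda t and polylogarithmic in the precision \epsilon.
N_{\text{queries}} = O\!\left( \lambda t \;+\; \frac{\log(1/\epsilon)}{\log\log(1/\epsilon)} \right)
% Each query is a block-encoding of H, whose gate cost is polynomial
% in the system size (e.g. the number of orbitals or plane waves).
```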
The takeaway of these papers is that with a few thousand logical qubits and logical gates that run at MHz rates (something people in the field consider reasonable), it may be possible to simulate relatively large correlated systems with high accuracy in times on the order of days. Now, there are of course very important limitations. First and foremost, you need some rough approximation to the ground state that we can prepare (here, here) and then project with quantum computing methods. This ties the size of the systems we can model to what classical methods can supply as a starting point, but from there quantum methods efficiently extend the range of accurate simulation.
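To show where "times on the order of days" can come from, here is a back-of-the-envelope sketch. Every number below is an illustrative assumption of mine, not a figure taken from the papers:

```python
# Back-of-the-envelope wall-clock estimate for one fault-tolerant
# ground-state energy estimation. All numbers are illustrative assumptions.

toffoli_count = 1e10    # assumed total Toffoli/T count of one phase-estimation run
logical_rate_hz = 1e6   # assumed logical gate rate (~MHz), mostly serial execution
overlap = 0.1           # assumed overlap |<phi|psi_0>| between the classically
                        # prepared trial state and the true ground state

single_run_s = toffoli_count / logical_rate_hz  # seconds for one run
repetitions = 1 / overlap**2                    # naive repeat-until-success count:
                                                # each run projects onto the ground
                                                # state with probability overlap**2
total_days = single_run_s * repetitions / 86_400

print(f"one run: {single_run_s / 3600:.1f} h, "
      f"~{repetitions:.0f} repetitions, total ~{total_days:.0f} days")
# -> one run: 2.8 h, ~100 repetitions, total ~12 days
```

(Amplitude amplification can improve the 1/overlap**2 repetition count to roughly 1/overlap, so this naive estimate is on the pessimistic side.)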
Second, as noted in the post, classical methods are pretty good at modeling ground states. Thus, it makes sense to focus most quantum computing efforts on modeling strongly correlated systems, excited states, or dynamic processes involving light-matter interaction and the like. I would argue we still have not found good ways to go beyond the Born-Oppenheimer approximation, though, unless you are willing to model everything (nuclei and electrons) in plane waves and first quantization, which is feasible but may make the simulation one or two orders of magnitude more costly.
This is all assuming fault-tolerant quantum computing. I can’t say much about timelines, because I am an algorithms researcher and do not have a very good understanding of the hardware challenges, but I would not find it surprising to see companies building fault-tolerant quantum computers with hundreds of logical qubits 5 to 15 years from now. For example, people have been making good progress, and Google recently showed the first experiment where scaling up quantum error correction reliably reduced the logical error rate. The next step for them is to build a logical qubit that can be kept error-corrected for arbitrarily long times.
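For intuition on why that experiment matters: the standard surface-code picture is that, below the error threshold, logical errors are suppressed exponentially in the code distance. A toy calculation (the functional form is the textbook heuristic; none of these constants are Google's measured values):

```python
# Textbook surface-code heuristic: below threshold, the logical error rate
# per round scales as p_L ~ A * (p / p_th) ** ((d + 1) // 2).
# All constants below are assumptions for illustration only.

A = 0.1       # assumed prefactor
p = 1e-3      # assumed physical error rate per operation
p_th = 1e-2   # assumed threshold error rate of the code

def logical_error_rate(d: int) -> float:
    """Heuristic logical error rate of a distance-d surface code."""
    return A * (p / p_th) ** ((d + 1) // 2)

for d in (3, 5, 7, 9):
    print(f"d={d}: p_L ~ {logical_error_rate(d):.0e}")
# d=3: 1e-03, d=5: 1e-04, d=7: 1e-05, d=9: 1e-06
```

Each step from d to d+2 buys a factor of roughly p_th/p (10x here) in logical error suppression, which is exactly the kind of scaling these error-correction experiments aim to demonstrate.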
Overall, I think the field of fault-tolerant quantum computing is putting forward solid science, and it would be overly dismissive to say it is just hype, or a bubble.
Hey, thanks for weighing in; those seem like interesting papers, and I’ll give them a read.
To be clear, I have very little experience in quantum computing and haven’t looked into it much, so I don’t feel qualified to comment on it myself (hence why this was just an aside there). All I am doing is relaying the views of prominent professors in my field, who feel very strongly that it is overhyped and were willing to say so on the panel, although I do not recall them giving much detail on why they felt that way. This matches the general views I’ve encountered in casual conversations with other physicists. If I had to guess the source of these views, I’d say it is skepticism about the ability to actually build such large-scale fault-tolerant systems.
Obviously this is not strong evidence and should not be taken as such.
I agree there is certainly quite a lot of hype, though when people want to hype quantum they usually target AI or something. My comment was echoing that quantum computing for materials science (and also chemistry) might be the one application where good-quality science is being done. There are also significantly less useful papers, for example those related to “NISQ” (non-error-corrected) devices, but I would argue the QC community is doing a good job of focusing on the important problems rather than just generating hype.