
Comments (6)

pldallairedemers commented on July 20, 2024

The overarching theory of impurity methods in condensed matter is self-energy functional theory (SFT):
https://arxiv.org/abs/1108.2183
In all cases, at some point in the variational loop we have to integrate two-point correlation functions over the frequency domain, either as an integral over real frequencies or as a Matsubara sum. In both cases we need to evaluate the correlation functions at thousands of time points over a relatively long time evolution to get reasonable accuracy, which may be prohibitively expensive for near-term devices.
There is a classical trick that significantly reduces the number of imaginary frequencies one needs to sum to get an accurate result (hundreds of points instead of tens of thousands):
https://doi.org/10.1103/PhysRevB.75.035123
... but it is not clear how it maps to sampling a smaller set of real time points for the correlation functions.
Unless this problem has been solved in a different context, it would be a critical theory bottleneck for a practical implementation of impurity methods in OpenFermion.
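To make the summation issue concrete, here is a self-contained toy calculation (plain NumPy, not OpenFermion code; the single-level model and all names are illustrative). The occupation of one fermionic level, computed as a truncated Matsubara sum, converges slowly; subtracting the first two high-frequency tail moments, whose sums are known analytically, recovers the same accuracy with far fewer frequencies. This is the flavor of the classical trick cited above.

```python
import numpy as np

beta, eps = 10.0, 0.3                       # inverse temperature, level energy
exact = 1.0 / (np.exp(beta * eps) + 1.0)    # exact occupation: Fermi function

def occupation(n_max, subtract_tail):
    """Matsubara sum n = T * sum_n G(iw_n) for G(iw_n) = 1/(iw_n - eps)."""
    n = np.arange(-n_max, n_max)
    iwn = 1j * (2 * n + 1) * np.pi / beta   # fermionic Matsubara frequencies
    g = 1.0 / (iwn - eps)
    if subtract_tail:
        # Subtract the 1/(iw) and 1/(iw)^2 tail moments, then add back their
        # analytically known Matsubara sums (1/2 and -beta*eps/4 respectively).
        g = g - 1.0 / iwn - eps / iwn**2
        return np.sum(g).real / beta + 0.5 - beta * eps / 4.0
    # A symmetrically truncated bare sum still needs the 1/2 that the
    # convergence factor assigns to the 1/(iw) tail.
    return np.sum(g).real / beta + 0.5

for n_max in (100, 1000, 10000):
    print(n_max,
          abs(occupation(n_max, False) - exact),   # error decays ~ 1/n_max
          abs(occupation(n_max, True) - exact))    # error decays ~ 1/n_max**3
```

The open question raised above is whether an analogous acceleration exists for the set of real time points sampled on the quantum device.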

As a side note, there is also a cluster method, not obviously derived from SFT, which works in a discretized momentum space: the dynamical cluster approximation (DCA):
https://arxiv.org/abs/cond-mat/9903273
It would likely suffer from the same problem of integrating correlation functions over the frequency domain.


pldallairedemers commented on July 20, 2024

I didn't know much about DMET, thanks for the link Ryan!

There are two notions of time dependence that have to be distinguished. There is the non-equilibrium situation, where the Hamiltonian itself is time-dependent and macroscopic observables therefore also vary in time. There is also the linear-response regime, where a system is assumed to be at equilibrium under a constant Hamiltonian and a small perturbation at some frequency is introduced. For materials, the latter corresponds to the spectrum of electronic excitations around the Fermi surface, where a lot of materials engineering happens.

Bare SFT doesn't give information about the time-dependent quantities that arise in non-equilibrium situations, but it does yield the response of a system at equilibrium when slightly perturbed at different frequencies. If one could directly compute (or sample?) the k-th time derivative (d/dt)^k of the correlation function around t=0 with some kind of phase estimation scheme, then it would indeed be possible to recover the frequency-dependent correlation functions through a moment expansion. However, the convergence of the moment expansion, and how much accuracy is required for each derivative, have to be analyzed carefully, since we are inferring information about low-energy physics from short time evolutions. If the correlation functions can be extracted from a quantum subroutine, they can then be used as inputs to (all?) impurity methods.
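As a sanity check of the moment-expansion idea, here is a toy NumPy sketch (not a quantum algorithm; H, O, and the model are random placeholders): the k-th time derivative of a correlation function at t=0 reproduces the frequency moments of the corresponding spectral function.

```python
import numpy as np

rng = np.random.default_rng(42)
dim = 6
H = rng.normal(size=(dim, dim)); H = (H + H.T) / 2   # toy Hamiltonian
O = rng.normal(size=(dim, dim)); O = (O + O.T) / 2   # toy Hermitian operator
evals, evecs = np.linalg.eigh(H)
psi0, e0 = evecs[:, 0], evals[0]                     # ground state

def corr(t):
    """C(t) = <0|O(t) O(0)|0> = sum_m |<m|O|0>|^2 exp(-i (E_m - E_0) t)."""
    amp = evecs.T @ (O @ psi0)                       # overlaps <m|O|0>
    return np.sum(np.abs(amp) ** 2 * np.exp(-1j * (evals - e0) * t))

def moment(k):
    """k-th frequency moment of the spectral function: <0|O (H - E_0)^k O|0>."""
    v = O @ psi0
    for _ in range(k):
        v = H @ v - e0 * v
    return psi0 @ (O @ v)

# m_k = i^k d^k C/dt^k at t = 0; check k = 1, 2 with finite differences.
h = 1e-4
d1 = (corr(h) - corr(-h)) / (2 * h)
d2 = (corr(h) - 2 * corr(0.0) + corr(-h)) / h**2
print(moment(1), (1j * d1).real)      # these agree
print(moment(2), (1j**2 * d2).real)   # and these
```

The open question flagged above is how these finite-difference accuracy requirements translate once the derivatives come from sampled measurements rather than exact state vectors.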

From the reference you posted, it seems that DMET can also be used for linear-response theory, but it's not clear whether it is as direct as with impurity methods.


jarrodmcc commented on July 20, 2024

I agree with most of the discussion here: while there are some clear theoretical bottlenecks, the eventual need for these methods means that the community at large would probably benefit from having the building blocks available within OpenFermion. It is such a large undertaking, however, that spending some time carving out independent, reusable pieces of such a project would be worthwhile, so that there is some intermediate benefit to implementing all of this. @pldallairedemers would probably be the expert to consult on this, but a few such pieces might be, for example:

  1. A standalone classical Green's function / self-energy toolkit, or at least an interface to major codes (do any open-source codes have this?). This is on the border of things we might want to have within OpenFermion, but its usefulness might outweigh design considerations. (A minimal data-container sketch follows this list.)
  2. Better frameworks for dealing with translational symmetry in our general routines, since these methods will often be used in such cases.
  3. Some state representations / simulator hookups for mixed-state simulation and testing, since in many of these cases we are implicitly interested in Gibbs / thermal states or general density matrices rather than pure states.
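Purely as a sketch of what piece 1 might look like (everything here is hypothetical: MatsubaraGreensFunction, its fields, and its methods are made-up names, not an existing OpenFermion or third-party API), a minimal container could pin down the data layout an interface would have to agree on:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class MatsubaraGreensFunction:
    """Hypothetical container for G(iw_n) on a fermionic Matsubara grid."""
    beta: float          # inverse temperature
    values: np.ndarray   # shape (n_frequencies, n_orbitals, n_orbitals)

    @property
    def frequencies(self) -> np.ndarray:
        """Fermionic Matsubara frequencies iw_n = i (2n + 1) pi / beta."""
        n = np.arange(self.values.shape[0])
        return 1j * (2 * n + 1) * np.pi / self.beta

    def self_energy(self, g0_values: np.ndarray) -> np.ndarray:
        """Dyson equation Sigma = G0^{-1} - G^{-1}, frequency by frequency."""
        return np.linalg.inv(g0_values) - np.linalg.inv(self.values)
```

Agreeing on a layout like this, plus the high-frequency tail coefficients, is probably most of what an interface to the major classical codes would require.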

The closest functions that exist within OpenFermion right now are active-space truncation functions that average out core electrons, but these are a shadow of what one would need for decent embedding methods that allow fractional occupation of the embedded space. I also want to make sure we mention that @ncrubin recently wrote up a paper on the use of DMET on quantum computers that is relevant to the discussion here:

https://arxiv.org/abs/1610.06910

It's perhaps also worth considering whether we want to lump other partitioning schemes under this umbrella, such as active space + perturbation theory. I imagine there are some shared pieces, but Green's-function and other approaches seem different enough that they might need entirely separate modules.


pldallairedemers commented on July 20, 2024

I'll share some more thoughts and personal observations to build on Jarrod's comments.

On the big-picture side of things, I view impurity solvers somewhat as higher-level methods that have correlation functions both as inputs and outputs (the self-energy can be determined from the correlation functions). From my understanding, translation invariance of infinite systems is mostly dealt with at this level (although the full story has many subtleties). I am not aware of a general-purpose open-source code for impurity solvers, but I would be surprised if there were no specialized implementations (e.g. DMFT) out there. There certainly exist large-scale codes in academia, but one would have to look at the licensing details on a case-by-case basis. Figuring out a good standard to precisely specify an interface to the large-scale codes shouldn't be too hard; I believe most of this work could be done by simply looking at the formalism of SFT, which recovers many impurity methods as limiting cases.

I tend to view state preparation and correlation-function measurement as two separate modules. Preparing Gibbs states is useful when we want to look at temperature-dependent effects like phase transitions, but it's not an absolute requirement; we can measure correlation functions on pretty much any state (obviously the ground state is often very interesting).

Just for reference, here are two good papers dealing with the preparation of Gibbs states:
A Quantum-Quantum Metropolis Algorithm
Thermalization in Nature and on a Quantum Computer
(An unexplored idea would be to prepare thermal Gaussian states from a larger purified Gaussian state. I expect there is a deterministic purification procedure for this that can be implemented in linear depth with matchgates; a toy check of the idea is sketched below.)
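To illustrate the purification idea in the simplest setting (a toy NumPy check for a single fermionic mode; this is not the matchgate construction itself): a two-mode pure state whose Schmidt coefficients are set by the Fermi factor reduces, after tracing out the ancilla mode, to exactly the thermal state of the system mode.

```python
import numpy as np

beta, eps = 2.0, 0.7
n = 1.0 / (np.exp(beta * eps) + 1.0)     # thermal occupation of the mode

# Two-mode purification |Psi> = sqrt(1-n)|0>_sys|0>_anc + sqrt(n)|1>_sys|1>_anc.
# psi[s, a] is the amplitude for system occupation s, ancilla occupation a.
psi = np.zeros((2, 2))
psi[0, 0] = np.sqrt(1.0 - n)
psi[1, 1] = np.sqrt(n)

rho_sys = psi @ psi.T                    # partial trace over the ancilla index
print(np.diag(rho_sys), [1.0 - n, n])   # Gibbs populations, as expected
```

For many modes this factorizes mode by mode in the energy basis, which is why a linear-depth matchgate circuit seems plausible.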

Now, on the problem of establishing specifications for a correlation-functions module, I'll start from specific issues and end with more speculative thoughts on what should be considered in a general framework. There is some theoretical work to do in order to get a clear picture of the complete approach.

  1. For basic two-point correlation functions, the most obvious approach seems to be an interferometric scheme like the one I presented in
    A method to efficiently simulate the thermodynamical properties of the Fermi-Hubbard model on a quantum computer.
    However, this circuit is not optimal; the most compact one I have seen is in
    Non-linear quantum-classical scheme to simulate non-equilibrium strongly correlated fermionic many-body dynamics,
    which also deals with non-equilibrium systems in the form of a time-dependent state preparation. (A minimal simulation of this interferometric primitive is sketched after this list.)
  2. In both cases we have to call a time-evolution module. Classically, these functions are usually computed by a Matsubara summation (or more refined pole-summation methods). I am not aware of any quantum method that extracts these quantities directly, but that doesn't mean it is impossible. On a quantum simulator we usually have access to observables that depend on a real time parameter, which we can convert back to the desired spectral quantity with a Fourier transform. However, we have to be careful about the convergence of the reconstruction method. For example, reconstructing the spectral function from its moment expansion by measuring time derivatives at t=0 would lift the requirement of simulating a long time evolution to extract low-energy physics, but it could impose strong requirements on measurement accuracy. I would expect this analysis has already been done somewhere in the signal-processing literature. The trade-off between sampling time points and computing the correlation functions classically, versus directly calculating the derivatives with a phase estimation scheme, still has to be analyzed.
  3. n-point correlation functions can also be useful quantities, depending on the problem being studied. There is a detailed procedure in
    Efficient Quantum Algorithm for Computing n-time Correlation Functions.
  4. There is a more measurement-based approach in which one extracts the same correlation functions from the derivatives of a generating functional:
    Quantum sensors for the generating functional of interacting quantum field theories.
    It is meant for studying quantum field theories, but it should work in all contexts. Which approach is more useful may depend on the specific implementation of the quantum processor.
  5. Up to now I have only considered time-ordered correlation functions, but more generally other quantities may be worth including in the framework. Notably, out-of-time-ordered correlation functions can be used to study fast scrambling in the context of quantum chaos. These quantities can be extracted from an interferometric scheme very similar to the two-point one, as detailed in:
    Measuring the scrambling of quantum information
    Digital Quantum Simulation of Minimal AdS/CFT
  6. A wider class of interferometric protocols could potentially be used to define the measurement of topological invariants:
    Observation of topological Uhlmann phases with superconducting qubits
  7. In a more general (and speculative) context, there may exist a large class of correlators that can be used to study various aspects of entanglement in correlated systems. I would start from these references, but I don't know how deep this rabbit hole goes:
    Wave Function and Strange Correlator of Short Range Entangled states
    Mapping topological to conformal field theories through strange correlators
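To make item 1 concrete, here is a minimal state-vector simulation of the basic interferometric (Hadamard-test) primitive. It is plain NumPy/SciPy with a random toy Hamiltonian; the Paulis P and Q stand in for Jordan-Wigner-mapped Majorana operators, which are Pauli strings and hence unitary. The ancilla's expectation of Z after the sequence H, controlled-U, H equals Re<psi|U|psi> with U = P(t) Q(0).

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n_qubits, t = 2, 0.8
dim = 2 ** n_qubits

Hm = rng.normal(size=(dim, dim)); Hm = (Hm + Hm.T) / 2       # toy Hamiltonian
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
P = np.kron(X, np.eye(2))                                    # stand-ins for JW-mapped
Q = np.kron(np.eye(2), Z)                                    # Majorana operators (unitary)

Ut = expm(-1j * Hm * t)
U = Ut.conj().T @ P @ Ut @ Q                                 # U = P(t) Q(0)

psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)                                   # stand-in prepared state

# Hadamard test: ancilla |0>, H, controlled-U, H; then <Z> = Re <psi|U|psi>.
Had = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
cU = np.block([[np.eye(dim), np.zeros((dim, dim))],
               [np.zeros((dim, dim)), U]])                   # controlled-U
state = np.kron(np.array([1.0, 0.0]), psi)                   # ancilla is the first factor
state = np.kron(Had, np.eye(dim)) @ state
state = cU @ state
state = np.kron(Had, np.eye(dim)) @ state
z_expectation = np.sum(np.abs(state[:dim]) ** 2) - np.sum(np.abs(state[dim:]) ** 2)

print(z_expectation, (psi.conj() @ (U @ psi)).real)          # these agree
```

Inserting an S-dagger on the ancilla between the controlled-U and the second Hadamard yields the imaginary part, so both quadratures of the two-point function are accessible from single-ancilla measurements.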

The main advantage I see in working toward a general framework is that it would become possible to quickly transfer new concepts and methods between different branches of physics. For example, recent ideas from quantum gravity could be used to study and understand new effects in quantum chemistry, and potentially to discover new classes of phenomena. The main difficulty comes from the fact that quantum simulators will likely transform the workflow traditionally used in classical computations, and we may need a few wrappers to interface with older codes.

As a final disclaimer, I'll say that I'm happy to advise on the theory framework, but I don't expect to have time to do the actual code implementation myself. I'll try to make myself more familiar with the DMET approach.


babbush commented on July 20, 2024

Thanks for this helpful overview, Pierre-Luc! You outline some nice research directions and give good background. Regarding the impracticality of these methods in the near term: I've had several conversations with Bela (the author of https://arxiv.org/abs/1510.03859), who seems to believe that evaluating the correlation functions at just tens of time points would already be enough to enable truly interesting simulations in some cases. Even if it would take many more than that, OpenFermion is not exclusively a platform for near-term implementations. Many of us (myself for one) are interested in thinking about what to do with an error-corrected device, so in that context I don't see any reason why we wouldn't welcome contributions on this topic to OpenFermion today.

Also, not all embedding schemes necessitate time evolution. For instance, DMET (https://arxiv.org/abs/1603.08443) can be used in cases where one is only interested in time-independent properties of systems in their thermodynamic limit. I have slightly changed the title of this issue because I am not sure whether DMET technically falls into the category of an "impurity model", although it is obviously related.


babbush commented on July 20, 2024

Thanks to both Jarrod and Pierre-Luc for the great discussion. I don't think Jarrod or I will have time to dive into this project at this point, but if another contributor (ideally one with a condensed-matter background) wants to work on it in the future, I think Pierre-Luc would be an awesome source of advice and feedback on the implementation.

