pasqal-io / qadence-libs
A collection of libraries to enhance Qadence functionalities.
Home Page: https://pasqal-io.github.io/qadence-libs/latest/
License: Apache License 2.0
In qadence.blocks.manipulate there is some functionality to convert qadence blocks to and from openfermion operators. It is not widely used, but as far as I know Igor has used it in the past. I was going to deprecate it for removal, but maybe it can move into qadence_libs instead.
[ ] Move the relevant files
[ ] Move tests in test_manipulate.py
[ ] Probably remove the ones from test_operators.py
[ ] Move the openfermion dependency from qadence to qadence-libs
The link in the README to the code of conduct is broken due to a missing file.
Currently, the rydberg_feature_map does not pass any value for omega to AnalogRot, which then defaults to 0. The Hamiltonian for the feature map thus turns out to be something like:

H/h = ∑ᵢ(−δnᵢ) + Hᵢₙₜ.

With the interaction term being an NN term, this Hamiltonian has no driving term at all. If the state this block receives is the zero state, nothing drives it to other states, and we are left in the zero state.

Since this is the feature map, we are very likely to start from the zero state, so it would fail to encode the feature at all. An appropriate driving term needs to be provided to the feature map.
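The zero-drive problem can be illustrated with a minimal sketch (plain NumPy, not Qadence code; the detuning and interaction strengths below are made-up values): a Hamiltonian built only from number operators is diagonal in the computational basis, so time evolution leaves the zero state untouched up to a phase.

```python
import numpy as np

# Two-qubit toy model of H/h = sum_i(-delta * n_i) + H_int with omega = 0.
# Every term is built from number operators, so H is diagonal and has no
# off-diagonal driving term that could move population out of |00>.
n = np.diag([0.0, 1.0])           # number operator on one qubit
I = np.eye(2)
delta, C = 1.0, 0.5               # hypothetical detuning / NN interaction strengths

n0 = np.kron(n, I)
n1 = np.kron(I, n)
H = -delta * (n0 + n1) + C * (n0 @ n1)   # diagonal 4x4 matrix

psi0 = np.zeros(4)
psi0[0] = 1.0                             # start in |00>, as a feature map typically would
psi_t = np.exp(-1j * np.diagonal(H) * 2.0) * psi0   # exact evolution for diagonal H, t = 2

# All population stays in |00>: the feature is never encoded.
print(abs(psi_t[0]) ** 2)                 # → 1.0
```

With a nonzero omega, AnalogRot would contribute an off-diagonal X/Y-type drive, and the evolution would transfer population out of the zero state.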
So far, only two approximations of the Quantum Natural Gradient optimizer are implemented: the exact version and the SPSA approximation. Another commonly used approximation is the block-diagonal approximation (proposed in the seminal paper), where the QFI is approximated by a block-diagonal matrix that only takes into account correlations of variational parameters within each circuit block.

The block-diagonal approximation can be computed efficiently by measuring average values of the circuit layers' generators. A further approximation that could be implemented is to keep only the diagonal terms.
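As a hedged illustration (not the qadence-libs API; the function name and block layout are hypothetical), the block-diagonal approximation amounts to keeping only the intra-block sub-matrices of the full QFI matrix:

```python
import numpy as np

def block_diagonal_qfi(qfi: np.ndarray, blocks: list[list[int]]) -> np.ndarray:
    """Keep intra-block correlations of a QFI matrix, zero out cross-block terms.

    `blocks` lists, for each circuit block, the indices of its variational
    parameters. (Hypothetical helper, for illustration only.)
    """
    approx = np.zeros_like(qfi)
    for idx in blocks:
        ix = np.ix_(idx, idx)
        approx[ix] = qfi[ix]      # copy the intra-block sub-matrix
    return approx

# Example: 4 parameters split into two circuit blocks of two parameters each.
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
qfi = A @ A.T                      # symmetric PSD stand-in for a real QFI matrix
approx = block_diagonal_qfi(qfi, [[0, 1], [2, 3]])
print(approx[0, 2])                # cross-block entry → 0.0
```

The further diagonal approximation mentioned above is then simply `np.diag(np.diag(qfi))`.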
QNG has recently been added to qinfo_tools. While it works smoothly with simple training loops, as shown here, it is not compatible with train(), the function defined in qadence to automate the definition of the training process with gradient-based optimizers.
In particular, the error seems to be related to Python's pickle module having trouble pickling a local object:
Traceback (most recent call last):
File ".../qadence/ml_tools/saveload.py", line 76, in write_checkpoint
torch.save(
File ".../torch/serialization.py", line 628, in save
_save(obj, opened_zipfile, pickle_module, pickle_protocol, _disable_byteorder_record)
File ".../torch/serialization.py", line 840, in _save
pickler.dump(obj)
AttributeError: Can't pickle local object 'Parametric.__init__.<locals>.parse_values'
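The failure mode is reproducible outside Qadence with a few lines of plain Python (the names below are hypothetical, mirroring Parametric.__init__.&lt;locals&gt;.parse_values): pickle serializes functions by qualified name, so a function defined inside another function cannot be saved.

```python
import pickle

def make_parser():
    # Local object, analogous to Parametric.__init__.<locals>.parse_values.
    def parse_values(x):
        return x
    return parse_values

try:
    pickle.dumps(make_parser())
except AttributeError as err:
    # AttributeError: Can't pickle local object 'make_parser.<locals>.parse_values'
    print(err)

# The usual fixes: define such helpers at module level (or as methods on a
# picklable class) so pickle can locate them by qualified name, or exclude
# the offending object from what the checkpoint serializes.
```

Alternatively, torch.save accepts a pickle_module argument, so a drop-in replacement such as dill (which can serialize local functions) might work around the issue without restructuring the code.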
Some tests in Qadence require constructors. They should either: 1) be moved to libs, or 2) be simplified in Qadence to avoid circular imports.
@dominikandreasseitz @eduardo to coordinate for the issues on Qadence.
Points to consider:
This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.
This repository currently has no open or pending branches.
.github/workflows/build_docs.yml
actions/checkout v4
actions/setup-python v5
.github/workflows/lint.yml
actions/checkout v4
actions/setup-python v5
.github/workflows/test_fast.yml
actions/checkout v4
actions/setup-python v5
actions/upload-artifact v4
actions/checkout v4
actions/setup-python v5
actions/checkout v4
actions/setup-python v5
pyproject.toml
docs/requirements.txt
Fully move to qadence_libs.ml_tools, but put the TransformedModule in qadence_libs.models.
Relates #6
Move to qadence_libs.models. Maybe also move the current Overlap? It inherits from QuantumModel.
./module
for ml_tools
libs.py
Currently these metrics are used but copied into the protocols tests. It would make sense to make them accessible to all packages that require them to run tests.
From qadence.constructors, the following sub-modules can be moved to qadence-libs.constructors:
ansatze.py
feature_maps.py
iia.py
qft.py
rydberg_feature_maps.py
rydberg_hea.py
The following sub-modules remain, which I feel belong in qadence. We can discuss the best place to put them.
daqc
hamiltonian_factory.py
Start a qadence_libs.statistics module for this type of utility.