A proof-of-concept adapter to use Google's TensorFlow in place of ROOT's TMVA.
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Entry for Qiskit Camp Asia 2019
A Python library for quantum information and many-body calculations, including tensor networks.
Scipy library main repository
started mmlbook/mmlbook.github.io
started time in 22 days
issue opened google/TensorNetwork
Have FiniteMPS.{left, right}_envs use BaseMPS.apply_transfer_operator
I might be missing something, but I believe the contraction in FiniteMPS.left_envs is identical to BaseMPS.left_transfer_operator. I suggest altering the former to make use of the latter's JITting.
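The observation above can be illustrated with a minimal pure-NumPy sketch, not TensorNetwork's actual implementation: the tensor shapes and the helper `left_transfer` here are hypothetical, but they show how the left environments are exactly repeated applications of the left transfer operator.

```python
import numpy as np

# Hypothetical MPS: N site tensors of shape (physical, left bond, right bond).
D, d, N = 3, 2, 5
rng = np.random.default_rng(1)
mps = [rng.normal(size=(d, D, D)) for _ in range(N)]

def left_transfer(x, A):
    # Left transfer operator: x -> sum_p A[p]^dagger @ x @ A[p].
    return np.einsum('pli,lm,pmj->ij', A.conj(), x, A)

# Building left environments site by site is just iterating this map,
# which is the same contraction left_envs performs internally.
left_envs = {}
x = np.eye(D)
for n, A in enumerate(mps):
    x = left_transfer(x, A)
    left_envs[n] = x
```

If `left_transfer` is the only contraction primitive, JIT-compiling it once benefits every environment computation.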
created time in a month
issue comment google/TensorNetwork
allow center_position=None in FiniteMPS
Would allow for computing left_envs and right_envs unconditionally for non-canonical FiniteMPSs.
comment created time in a month
pull request comment jcmgray/quimb
Hi Johnnie, I'm happy to rename it. I've been slightly busy with other commitments so progress on this PR might be a bit slow, sorry.
comment created time in a month
pull request comment jcmgray/quimb
Yes, I have planned to have the LPS only store one of the two rows. That way, the shape of the LPS resembles an MPO, but only one set of the open indices are physical indices (to which gates can be applied); the other set are Kraus indices, which may be of different sizes (c.f. MPOs, where both sets are physical indices of matching sizes, possibly permuted). So in a sense, the LPS is closer to an MPS, but with a Kraus index on each tensor, which is why I decided to extend TensorNetwork1DVector.
Doing this should allow me to reuse gate_TN_1D without any major modifications. When a gate is applied to a single-sided LPS, the whole density matrix is transformed correctly when contracting over the Kraus indices with its conjugate.
So when dealing with expectation values, I also thought that something like expec_TN_1D(lps, mpo1, mpo2, lps.H) would just contract the Kraus indices between lps and lps.H.
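The single-site picture described above can be sketched in plain NumPy. This is not quimb's API; the shapes and index ordering are assumptions chosen for illustration, showing how contracting the Kraus index of an LPS tensor with its conjugate yields an MPO-like density-matrix tensor.

```python
import numpy as np

# Hypothetical LPS site tensor with shape
# (left bond, right bond, physical index, Kraus index).
D, d, k = 3, 2, 4
rng = np.random.default_rng(0)
A = rng.normal(size=(D, D, d, k))

# Contracting the Kraus index with the conjugate tensor produces a
# density-matrix site tensor with doubled bond dimension and two
# physical indices (ket and bra), i.e. MPO-like structure.
rho_site = np.einsum('lrpk,LRqk->lLrRpq', A, A.conj())
rho_site = rho_site.reshape(D * D, D * D, d, d)
```

Note how the Kraus dimension `k` disappears after the contraction, which is why it is free to differ from the physical dimension, unlike an MPO's second physical index.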
comment created time in 2 months
PR opened jcmgray/quimb
Following our discussion on Gitter, I've begun initial work on LPSs (also known as MPDOs). I still need to implement the methods where NotImplementedError is raised, and I don't yet know how subsystem log-negativity works for LPSs.
I believe the top-level module functions and the functions in TensorNetwork1DVector work without modifications, but obviously tests are still required for basically everything.
pr created time in 2 months
push eventAidanGG/quimb
commit sha 25c25a4a35fd02fe15a5404729d7fcd2b4f5863c
Initial LPS work
push time in 2 months
push eventAidanGG/quimb
commit sha 74f946f7c5aba1055f9b4d45b8b9cde82dbff73d
Use 1D TN contraction for imaginary TEBD test
push time in 2 months
PR opened jcmgray/quimb
For now I just extended the __init__ function with an imag param. I added a basic test showing it with ZZ interactions.
pr created time in 2 months
push eventAidanGG/quimb
commit sha ecb6a746b0942984649e79a528f5ea5bf2cff34f
Initial implementation of imaginary time TEBD
push time in 2 months
started cpptaskflow/cpptaskflow
started time in 2 months
fork AidanGG/quimb
A Python library for quantum information and many-body calculations, including tensor networks.
fork in 2 months
issue openedjcmgray/quimb
Two suggestions for TEBD:

1. Imaginary time evolution. I propose adding another kwarg to TEBD.__init__ which controls imaginary time evolution and removes the 1.0j in get_gate. split_opts will then have to require renorm: True to maintain normalisation.
2. MPO density matrix inputs. By extending to mixed states, unitary evolution can still be performed, but with imaginary time evolution we can e.g. generate Hamiltonian thermal states represented as MPOs.
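The core of the first suggestion can be sketched with plain NumPy/SciPy, independent of quimb's TEBD internals: dropping the 1.0j turns the unitary Trotter gate into a non-unitary imaginary-time gate, which is why renormalisation becomes mandatory. The ZZ Hamiltonian here mirrors the interaction used in the test.

```python
import numpy as np
from scipy.linalg import expm

# Two-site ZZ interaction.
Z = np.diag([1.0, -1.0])
H_zz = np.kron(Z, Z)
dt = 0.1

gate_real = expm(-1.0j * H_zz * dt)  # real time: unitary, norm-preserving
gate_imag = expm(-H_zz * dt)         # imaginary time: non-unitary

psi = np.ones(4) / 2.0               # normalised two-qubit state
psi = gate_imag @ psi
psi /= np.linalg.norm(psi)           # the renorm: True step
```

Repeated application of the imaginary-time gate with renormalisation projects onto the ground state of H_zz, which is the usual use case for imaginary-time TEBD.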
I'm going to start working on some PRs to address these.
created time in 2 months