Quantum error correction and entanglement spectrum in tensor networks

A class of planar tensor networks with tensor constraints is investigated as a model for holography. We study the greedy algorithm generated by tensor constraints and propose the notion of critical protection (CP) against the action of the greedy algorithm. For given tensor constraints, a CP tensor chain can be defined. We further find that the ability of quantum error correction (QEC), the non-flatness of the entanglement spectrum (ES) and the correlation function can be quantitatively evaluated by the geometric structure of the CP tensor chain. Four classes of tensor networks with different properties of entanglement are discussed. Thanks to tensor constraints and CP, the correlation function is reduced to a bracket of a Matrix Product State, and the result agrees with the one in conformal field theory.

Introduction.- Quantum entanglement plays a key role in understanding the structure of spacetime from the emergent point of view [1,2]. The Ryu-Takayanagi (RT) formula links the entanglement entropy of a subsystem on the boundary to the area of the minimal homological surface in the bulk [3]. Such an approach has recently been generalized to construct the gravitational dual of Rényi entropy [4], which provides a correspondence of the entanglement spectrum (ES) between the bulk and the boundary. In particular, for the vacuum in the AdS3/CFT2 correspondence, the Rényi entropy satisfies the Cardy-Calabrese formula and a non-flat ES is inherent [5,6]. Another remarkable feature of AdS space is the subsystem duality, which states that a local operator in the bulk can be reconstructed in a subsystem A on the boundary if it is located within the entanglement wedge of A [7-13]. It can be viewed as the accomplishment of Quantum Error Correction (QEC) in quantum information [8,13-15]. Moreover, it has been found that the RT formula can be derived from QEC [16].
It has been revealed that tensor networks provide a geometric picture for entanglement renormalization, such that holographic spaces may emerge from the entanglement of a many-body system [17-19], spurring the exploration of the deep relation between tensor networks and the structure of spacetime [20,21]. One typical kind of tensor network is the multiscale entanglement renormalization ansatz (MERA), which respects the RT formula, exhibiting a logarithmic law of entanglement entropy and a non-flat entanglement spectrum, as in the AdS vacuum [17-19, 22, 23]. However, MERA breaks the isometry group SL(2, R) and has a preferred direction, implying that QEC cannot be realized along all directions. On the other hand, perfect tensors, also called holographic codes, have the advantage of implementing QEC over an H2 space [15,24,25]. Unfortunately, it has been found that this kind of tensor network has a flat ES and trivial connected correlation functions, which evidently is not a reflection of the holographic property of AdS spacetime [26,27]. In random tensor networks and spin networks, all orders of Rényi entropy for the ground state share the same RT formula, leading to a flat ES as well [28-31].
The attempt to recover the result of the Cardy-Calabrese formula for Rényi entropy can be found in [32], where the bulk dynamics is taken into account.
Recently a new class of tensor networks, named hyperinvariant tensor networks, has been constructed in [27], retaining the advantages of both MERA and perfect tensor networks. The key ingredient of hyperinvariant tensor networks is to impose multi-tensor constraints, which demand that certain products of multiple tensors form isometric mappings. Remarkably, this sort of tensor network can not only accomplish QEC as perfect tensors do, but also generate a non-flat ES as MERA does, thus qualitatively capturing both holographic features of AdS spacetime.
Nevertheless, some key issues remain unanswered in this approach. First of all, the holographic property of tensor networks depends on the specific structure of the tensor constraints. What kind of multi-tensor constraints could endow a given tensor network with the desirable features of AdS spacetime? More importantly, in accomplishing the holographic features of tensor networks one always faces a dilemma: once the QEC ability of a tensor network becomes stronger, its ES more easily becomes flat, and vice versa. Is there any criterion to justify the ability of QEC and the non-flatness of the ES for a tensor network with given constraints? In this letter we provide affirmative answers to the above issues.
For this purpose, we will construct tensor networks by tiling H2 space with identical polygons, and then impose tensor constraints with the notion of the tensor chain, which leads to a generalized description of the greedy algorithm. We will investigate QEC and the ES by manipulating tensor networks. Moreover, we will propose the notion of critical protection (CP) to describe the behavior of tensor networks under the greedy algorithm. Finally, we will introduce the CP reduced interior angle κ_c to quantify critical protection and classify tensor networks according to their ability of QEC as well as the non-flatness of the ES.
Tensor chains in a tensor network.- We discretize H2 space uniformly by gluing identical polygons composed of b edges, with a edges sharing the same node. We call such a discretization the {b, a} tiling of H2 space. Since the sum of the interior angles of a triangle in a space with negative curvature must be less than 2π, a {b, a} tiling of H2 space can be realized only if 1/a + 1/b < 1/2. A tensor network can be constructed based on each {b, a} tiling, as illustrated in Fig. 1. Associated with each node, we define a tensor T with a indexes, each of which is assigned to one of the edges joined at the node. Associated with each edge, we define a tensor E with 2 indexes. Because of the rotational invariance of H2 space, we demand that the indexes of tensors T and E have cyclic symmetry. Consider a tensor network Ψ, and let all the indexes of tensors T contract with those of tensors E such that all uncontracted indexes belong to tensors E only. Corresponding to such a network, we define a state |Ψ⟩ in the Hilbert space on those uncontracted edges. By dissecting a tensor network, as in Fig. 1, we define a key object called the tensor chain M, whose general form is shown in Fig. 2. Pictorially, the uncontracted edges in M are split into an upper part A and a lower part B, so we denote its elements as M^A_B. The number of edges at each node satisfies m_i + n_i = a − 2 + δ_{i1} + δ_{ik}, where i is the sequence number labelling the nodes of the tensor chain and k is their total number. Specifically, the values of m_i and n_i for the tensor chain in Fig. 1 can be read off directly. Vice versa, a tensor chain can be mapped into the tiling of H2 space, and its skeleton forms a directed polyline in the network, where the sequence number i increases along the direction of the polyline and the upper (lower) edges are placed on the left (right) hand side of the polyline. To describe the curvature of the corresponding polyline, we define the average reduced interior angle of a tensor chain as κ = (1/k) Σ_{i=1}^{k} (n_i + 1), where "reduced" means that we have taken 2π/a as the unit of interior angles.
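As a quick sanity check of the tiling condition, the inequality 1/a + 1/b < 1/2 can be tested with exact rational arithmetic (the function name is ours, for illustration only):

```python
from fractions import Fraction

def is_hyperbolic_tiling(b, a):
    """True if identical b-gons, a per vertex, can tile H2.

    The {b, a} tiling of H2 exists iff 1/a + 1/b < 1/2
    (negative curvature); equality corresponds to the flat plane.
    """
    return Fraction(1, a) + Fraction(1, b) < Fraction(1, 2)

# The {5, 4} tiling used in this letter is hyperbolic,
# while {4, 4} is the flat square lattice.
valid = [(b, a) for b in range(3, 8) for a in range(3, 8)
         if is_hyperbolic_tiling(b, a)]
```

Exact `Fraction` comparisons avoid any floating-point ambiguity at the flat boundary cases such as {4, 4} and {3, 6}.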
Tensor constraints and critical protection (CP).- Without loss of generality, we will focus on the tensor network with {5, 4} tiling as a typical example to disclose the structure of the tensor chain which is critically protected under the action of the greedy algorithm. Our analysis and results for general tensor networks with {b, a} tiling will be given later.
We define tensor constraints as follows. Besides the cyclic symmetry, we further impose constraints on the rank-4 tensor T (orange square) and the rank-2 tensor E (blue circle), such that the tensor chains
(3) are proportional to isometries from the Hilbert space on the upper edges to the Hilbert space on the lower edges. For the simple case illustrated in (3), each tensor chain involves only a single tensor T. One can derive other tensor chains proportional to isometries from the tensor constraints as well. In this letter, we further require that any tensor chain which is proportional to an isometry can be derived from the tensor constraints, which restricts the structure of tensors T and E. See Supplemental Materials (SM) A and B for details. The greedy algorithm is generated by the tensor constraints. For a tensor chain M, the greedy algorithm is the process of simplifying Σ_B M^A_B (M^C_B)* according to the tensor constraints, which is equivalent to the description in [15] for a single interval (see SM C for details). We say a tensor chain M is unprotected if it can be simplified under the action of the greedy algorithm; otherwise, it is protected.
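What "proportional to an isometry" means for a tensor chain can be checked numerically: viewing the chain as a matrix M from the upper-edge space to the lower-edge space, the condition is that M†M be a positive multiple of the identity. A minimal numpy sketch (the function name is ours):

```python
import numpy as np

def is_proportional_to_isometry(M, atol=1e-8):
    """Check M† M ∝ I on the upper-edge (column) space."""
    gram = M.conj().T @ M
    scale = np.trace(gram).real / gram.shape[0]
    return scale > atol and np.allclose(gram, scale * np.eye(gram.shape[0]), atol=atol)

# An exact isometry from the QR decomposition of a random matrix,
# then rescaled: still proportional to an isometry.
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3)) + 1j * rng.normal(size=(8, 3))
V, _ = np.linalg.qr(A)   # V has orthonormal columns: V† V = I_3
M = 2.7 * V
```

A generic matrix such as A above fails the test, since its Gram matrix A†A is not proportional to the identity.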
Generally speaking, when the tiling and tensor constraints are given, the larger κ is, the more easily a tensor chain becomes unprotected. The protected and endless M with the largest κ is called the critically protected (CP) tensor chain M_c. Equivalently, one can check that M_c would become unprotected once the list of its n_i is rearranged or increased. Its κ is called the CP reduced interior angle κ_c.
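As a minimal illustration, the average reduced interior angle can be computed from the list of lower-edge numbers n_i over one period of the chain. This sketch assumes the reading κ = (1/k) Σ_i (n_i + 1), with the reduced angle at each node taken as n_i + 1 in units of 2π/a; the function name is ours:

```python
def average_reduced_interior_angle(n):
    """kappa for a tensor chain with lower-edge numbers n = [n_1, ..., n_k].

    Assumes the reduced interior angle at node i is n_i + 1
    (in units of 2*pi/a), averaged over the k nodes.
    """
    return sum(ni + 1 for ni in n) / len(n)

# CP chain of constraints (3): one lower edge at every node,
# so per period kappa_c = 2/1 = 2.
kappa_c_eq3 = average_reduced_interior_angle([1])
```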
Under the greedy algorithm generated by (3), if there exists i such that n_i > 1, then M is unprotected. So the CP tensor chain has the form plotted on the right-hand side of (4). Here we have also presented a scheme to figure out the CP tensor chain by a manipulation of the second constraint in (3). The skeleton of such a CP tensor chain forms a polyline in the tensor network, as shown in Fig. 3. For (4), κ_c = 2/1 = 2. Alternatively, we may impose other tensor constraints, for instance, by requiring the following tensor chains to be proportional to isometries.
Similarly, we require that any tensor chain which is proportional to an isometry can be derived from (5) (see SM B).
We deduce the CP tensor chain M_c as follows. From the first constraint in (5), we know that the number of lower edges at any node of the CP tensor chain must be smaller than three; from the second constraint, we know that two nodes each with two lower edges cannot be neighbors (otherwise they would be swallowed by the constraint). Therefore, M_c has the form plotted on the right-hand side of (6). Similarly, one can construct M_c based on the second constraint in (5), as demonstrated in (6). The corresponding polyline in the network is marked in Fig. 4. The corresponding CP reduced interior angle κ_c lies between κ_h and κ_1 introduced below. QEC and ES.- Throughout this letter we will only consider QEC by inserting an operator into the internal bonds, for instance, as follows.
By virtue of the tensor constraints, one can push an operator O 'through' the tensor chains in (3), turning it into an operator O', where the conjugation of tensors is marked in dark colors. Then one can realize the algorithm of QEC, as shown in Fig. 3. Actually, pushing an operator to an interval Ā on the boundary is the inverse of the greedy algorithm beginning at Ā. So any operator inserted outside the CP tensor chain can be pushed to the boundary.
Next we consider the ES of the reduced density matrix ρ_A = Tr_Ā |Ψ⟩⟨Ψ| = ΨΨ†, where Ā is contracted in the matrix product of the tensor networks Ψ and Ψ†. ρ_A has a flat ES if all its non-zero eigenvalues are identical. From the diagonalization of ρ_A, we know that the flatness of the ES is equivalent to ρ_A² ∝ ρ_A, (10) i.e., ρ_A is proportional to a projector, which is also equivalent to the statement that all orders of Rényi entropy are equal. If no tensor survives under the greedy algorithm starting from A and Ā respectively, then the relation (10) holds and leads to a flat ES. Otherwise the ES is generally non-flat [37]. We show the result of the greedy algorithm acting on the tensor network with constraints (3) in Fig. 3 [38]. It indicates that the ES is flat, which coincides with the results in [26].
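The flatness test can be made concrete with a small numerical sketch: given a pure state on A ∪ Ā, reduce over Ā and check whether all non-zero eigenvalues of ρ_A coincide. The function names and the toy states are ours, not from the letter:

```python
import numpy as np

def entanglement_spectrum(psi, dim_A, dim_Abar, atol=1e-12):
    """Non-zero eigenvalues of rho_A = Tr_Abar |psi><psi|."""
    M = np.asarray(psi).reshape(dim_A, dim_Abar)  # bipartition A | Abar
    rho_A = M @ M.conj().T
    evals = np.linalg.eigvalsh(rho_A)
    return evals[evals > atol]

def has_flat_ES(psi, dim_A, dim_Abar):
    """Flat ES: all non-zero eigenvalues of rho_A coincide,
    equivalently all orders of Renyi entropy are equal."""
    spec = entanglement_spectrum(psi, dim_A, dim_Abar)
    return bool(np.allclose(spec, spec[0]))

# A maximally entangled pair has a flat ES,
# while a generic entangled state does not.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
skew = np.array([2.0, 0.0, 0.0, 1.0]) / np.sqrt(5)
```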
For the tensor network with constraints (5), the ES is non-flat, as shown in Fig. 4. At the same time, we point out that the ability of QEC in this network is weakened in comparison with that in the network with (3), because an operator inserted into the region enclosed by CP tensor chains will approach the endpoints of A during the pushing process. Such a phenomenon may be related to approximate QEC [8,33]. More cases of tensor constraints.- Our strategy is applicable to other tensor constraints constructed from tensor chains.
• The tensor network with a single constraint is plotted in Fig. 5. Irrespective of the interval A picked on the boundary, all tensors survive under the greedy algorithm. The CP tensor chain is closed, and we always obtain a non-flat ES. On the other hand, wherever an operator is inserted in the bulk, it cannot be pushed to the boundary with the use of the isometry. So such a tensor network does not enjoy QEC.
• The tensor network with the constraint composed of three T tensors is plotted in Fig. 6. Given an interval A on the boundary, an operator inserted in the wedge of A can be pushed to A. So such a tensor network enjoys QEC. However, it is subtle to judge whether the ES is flat or not. We find that both flat and non-flat ES can be obtained, depending on the specific choice of the interval A, as shown in Fig. 6. So we say this tensor network has a mixed ES.
The classification of tensor networks.- Now we generalize the above analysis to tensor networks with general {b, a} tiling and general tensor constraints, imposed by requiring that some tensor chains (11) be proportional to isometries, where the dots refer to one or multiple tensor chains, each of which satisfies 1 ≤ κ ≤ a/2. In [34], we prove that the CP tensor chain and κ_c can still be defined uniquely. Given a {b, a} tiling, we find that the larger κ_c is, the stronger the ability of QEC, but the more easily the ES becomes flat. Furthermore, sufficiently large tensor networks with general construction of tensors T and E can be classified into four types, as shown in Fig. 7, where the three bounds are given by (12). In particular, for the {5, 4} tiling one has κ_h ≈ 1.42, κ_1 = 5/3, κ_0 = 2. The detailed proof will be given in [34].
Here we just present our conclusions.
1 ≤ κ_c ≤ κ_h: The network is not able to implement QEC but has a non-flat ES, e.g. the tensor network in Fig. 5.
κ_h < κ_c < κ_1: The network can implement QEC and has a non-flat ES, e.g. the tensor network in Fig. 4.
κ_1 ≤ κ_c < κ_0: The ability of QEC becomes stronger but the ES becomes "mixed", in the sense that both flat and non-flat ES may appear for different choices of the interval on the boundary, e.g. the tensor network in Fig. 6.
κ_c = κ_0: The quality of QEC becomes better but the ES has to be flat, which is exactly the property of most networks composed of perfect tensors, e.g. the tensor network in Fig. 3.
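The four classes can be expressed as a small lookup over κ_c. This is a hedged sketch: the boundary assignments at κ_1 and κ_0 follow our reading of the four ranges delimited by κ_h < κ_1 < κ_0, and the function name and labels are ours:

```python
def classify_network(kappa_c, kappa_h, kappa_1, kappa_0):
    """Map the CP reduced interior angle to one of the four classes.

    For the {5, 4} tiling: kappa_h ~ 1.42, kappa_1 = 5/3, kappa_0 = 2.
    Boundary cases at kappa_1 and kappa_0 follow our reading of Fig. 7.
    """
    if kappa_c <= kappa_h:
        return "non-flat ES, no QEC"        # e.g. Fig. 5
    if kappa_c < kappa_1:
        return "non-flat ES, QEC"           # e.g. Fig. 4
    if kappa_c < kappa_0:
        return "mixed ES, stronger QEC"     # e.g. Fig. 6
    return "flat ES, strongest QEC"         # e.g. Fig. 3 (perfect tensors)
```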

Conclusion and Outlook.- In this letter the notion of critical protection based on the tensor chain has been proposed to describe the behavior of tensor networks under the action of the greedy algorithm. In particular, a criterion has been developed with the help of the average reduced interior angle of the CP chain, such that for a given tensor network the ability of QEC and the flatness of the ES can be justified in a quantitative manner. Currently it is still challenging to construct tensor networks which could capture all the holographic features of AdS spacetime. What we have found in this letter sheds light on this issue. Firstly, we have learned that the notion of critical protection provides a description of the limit of information transmission with full fidelity. So the CP tensor chain is the maximal boundary which can holographically store the interior information [35,36]. Thus, a tensor network which captures the QEC feature of AdS space must not contain closed CP curves, which appear when κ_c ≤ κ_h. Furthermore, to construct a single tensor network which exhibits both QEC and a non-flat ES, we conclude that the tensor networks with κ_c ∈ (κ_h, κ_1) are the most likely to mimic AdS holography.
The geometric description of the CP tensor chain is appealing. In light of its periodic structure, we find that the analogue of a CP tensor chain is a curve of constant curvature in H2 space, such that κ_c is related to the geodesic curvature of the curve [34]. Specifically, an open CP tensor chain corresponds to a hypercircle, which has a constant distance from its axis (a geodesic), as illustrated in Fig. 4. Such a distance measures the deviation from the RT formula when evaluating the Rényi entropy, which may be linked to the tension of the cosmic brane in [4].
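For reference, the standard classification of constant-geodesic-curvature curves in H2 (unit curvature radius) is as follows; this is textbook hyperbolic geometry, not a result of this letter. The open/closed dichotomy matches the open CP chain (hypercircle) of Fig. 4 versus a closed CP curve:

```latex
% Curves of constant geodesic curvature k_g in H^2 (curvature -1):
\[
  k_g \;=\;
  \begin{cases}
    \tanh d, & \text{hypercircle at distance } d \text{ from its axial geodesic } (k_g < 1),\\
    1,       & \text{horocycle},\\
    \coth r, & \text{circle of radius } r \ (k_g > 1),
  \end{cases}
\]
% with k_g = 0 for a geodesic itself.
```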
Because of the chain structure of the tensor constraints, in our present framework we have investigated QEC and the ES only for a single interval on the boundary. It is an open question how to go beyond this restriction.

SUPPLEMENTAL MATERIAL
A. Specific construction of tensors subject to tensor constraints

In Figs. 8, 9 and 10, we define tensor U, tensor Q and tensor R as the building blocks for T and E. The elements of tensor U are U_{μν}; they satisfy the following relations. The elements of tensor Q are Q_{μνρσ}, where the two indexes μν (ρσ) are grouped together; they satisfy the following relations. The elements of tensor R are R_{μνρσ}; they satisfy the following relations. Specifically, we construct the tensors T and E for the tensor network with {5, 4} tiling for different tensor constraints, as shown in Figs. 11, 12, 13 and 14. Specific elements of some tensors Q and R are given in [27].

B. Tensor chains which are proportional to isometries
Employing the tensor constraints, one can derive other tensor chains which are proportional to isometries. For instance, from (3), the following tensor chains are proportional to isometries.
Similarly, from (5), the following tensor chains are proportional to isometries.
The detailed analysis is given in [34]. Here we just note that all of these tensor chains form a set S_M with an infinite number of elements and satisfy κ ≤ κ_c. We stress that one should take all of these tensor chains into account when judging whether the contraction of a tensor product can be simplified under the action of the greedy algorithm.
On the other hand, it is important to require that those tensor chains which do not belong to the set S_M should not be proportional to isometries, which prevents tensors T and E from having a trivial structure, for instance, the outer product of identity matrices. We point out that many tensor chains do not belong to S_M, such as the following tensor chains for constraints (3), and the following tensor chains for constraints (5).
C. Greedy algorithm generated by tensor constraints

We first review the greedy algorithm on a tensor network following the description in [15], which provides an intuitive way to figure out the region in which the corresponding sub-tensor-network must be an isometry. Beginning with an interval A on the boundary of a tensor network Ψ, we consider a sequence of cuts {C_n}, each of which is bounded by ∂A and obtained from the previous one by a local move on the lattice. The corresponding sub-tensor-networks also form a sequence {Φ_n}, where Φ_n consists of those tensors between A and C_n. Let C_1 = A and Φ_1 be an identity. For perfect tensors, at each step one finds a tensor M_n which has at least half of its legs contracted with Φ_n and constructs Φ_{n+1} by adding M_n to Φ_n, such that Φ_{n+1} must be an isometry as well. The procedure stops when one fails to add such tensors to the sequence.
We can generalize the above description by replacing the single tensor M_n by a tensor chain M_n which is proportional to an isometry, with its lower edges contracted with Φ_n. According to SM B, the tensor chains which are proportional to isometries form a set S_M derived from the tensor constraints. Furthermore, in our scenario the target is a tensor chain M rather than a tensor network Ψ. Given a tensor chain M, we simplify the contraction Σ_B M^A_B (M^C_B)* subject to the tensor constraints. For example, according to (3), we can simplify the contraction as in (20), where the conjugation of tensors is marked in dark colors. Similarly, according to (5), we can simplify the contraction as in (21). Actually, the above description of the greedy algorithm is equivalent to the description in [15] for a single interval. At each step from Φ_n to Φ_{n+1}, an M_n ∈ S_M is used to simplify a tensor chain M. For example, the procedure of simplifying (20) corresponds to the step of extending the shaded region as illustrated in Fig. 15, where the corresponding tensors are enclosed by a dashed line in red. Similarly, the process of simplifying (21) corresponds to the steps in Fig. 16.
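The expansion step for perfect tensors reviewed above (absorb a tensor once at least half of its legs are contracted with the current region Φ_n) can be sketched as a fixed-point iteration over a toy network. The data structure and names are ours; real tensor chains would replace the single-tensor move by a member of S_M:

```python
def greedy_reached(start_edges, tensor_legs):
    """Greedy algorithm for perfect tensors, following the review above.

    tensor_legs maps each tensor name to the set of edges it touches.
    A tensor is absorbed once at least half of its legs lie on already
    reached edges; its remaining legs then become reached as well.
    """
    reached = set(start_edges)
    absorbed = set()
    progress = True
    while progress:
        progress = False
        for name, legs in tensor_legs.items():
            if name not in absorbed and 2 * len(legs & reached) >= len(legs):
                absorbed.add(name)
                reached |= legs
                progress = True
    return absorbed, reached

# Toy network: two 4-leg tensors sharing the internal edge 'e'.
net = {"T1": {"a", "b", "c", "e"}, "T2": {"e", "d", "f", "g"}}
absorbed, _ = greedy_reached({"a", "b"}, net)  # only T1 is absorbed
```

Starting from {"a", "b"}, T1 has half of its legs reached and is absorbed; T2 then touches only the single reached edge 'e' and survives, mimicking how the algorithm halts at a protected region.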

FIG. 1: A tensor network with {5, 4} tiling. Enclosed by the dashed line is an example of a tensor chain. Its skeleton forms a directed polyline, which is marked in red.
FIG. 2: A general form of a tensor chain.

FIG. 3: The tensor network with {5, 4} tiling and tensor constraints (3). The tensors within the shaded regions with purple (red) stripes do not survive under the greedy algorithm beginning at A (Ā). The CP tensor chain is marked by a solid line in red. An operator O in the bulk is pushed to a subinterval of Ā.
FIG. 4: The tensor network with {5, 4} tiling and tensor constraints (5). The tensors in the blank region survive. Two CP tensor chains correspond to the thick polylines in purple and red, respectively. An operator O enclosed by the CP tensor chain is pushed to a region within Ā on the boundary, where blue rod-shaped edges indicate the employment of the second constraint in (5).

FIG. 7: With κ_c, we classify tensor networks according to their properties of QEC and ES, where FES means flat ES.

[34] Y. Ling, Y. Liu, Z. Y. Xian and Y. Xiao, "Tensor chain and tensor constraint in tensor network," to appear.
[35] S. T. Flammia, J. Haah, M. J. Kastoryano and I. H. Kim, "Limits on the storage of quantum information in a volume of space," Quantum 1, 4 (2017) [arXiv:1610.06169 [quant-ph]].
[36] T. Jacobson, "Entanglement Equilibrium and the Einstein Equation," Phys. Rev. Lett. 116, no. 20, 201101 (2016) [arXiv:1505.04753 [gr-qc]].
[37] When there are tensors surviving under the greedy algorithm, although we cannot exclude the small possibility that (10) happens to be valid for some fine-tuned constructions of tensors T and E, we still say that the ES is non-flat for a general construction of tensors T and E.
[38] One may notice that the CP tensor chain M_c itself falls into the shaded region, implying that it does not survive under the greedy algorithm. This results from the boundary effect in a network with finite layers, where, besides the lower edges of M_c, the edges at the end of M_c need to be contracted as well. The boundary effect of the greedy algorithm is investigated in detail in [34]. Here we just remark that this effect is very limited, swallowing only finitely many layers (usually only one layer) of tensors enclosed by M_c.

FIG. 9: (a) Tensor Q, where the two indexes on each side are grouped together. (b) Tensor Q is proportional to an isometry between the two grouped indexes.