About the entropy growth in random unitary circuit

Yucheng He
Posts: 4
Joined: 20 May 2022, 00:40

About the entropy growth in random unitary circuit

Post by Yucheng He »

I am trying to simulate the entropy growth in a random unitary circuit. I started from an L=8 chain and obtained the plot below.
At late times, the entropy on each bond plateaus slightly below the maximal values ln(2)≈0.69, 2 ln(2)≈1.39, 3 ln(2)≈2.08, and 4 ln(2)≈2.77.
(I get roughly 0.625, 1.30, 1.87, and 2.24.)
What causes this phenomenon? Is it because an ideal maximally mixed reduced state (corresponding to a maximally entangled state) is hard to create by completely random dynamics?
[Attachment: output.png — entanglement entropy vs. time for L=8]

Code: Select all

import numpy as np
import matplotlib.pyplot as plt
from tenpy.networks.site import SpinHalfSite
from tenpy.networks.mps import MPS
from tenpy.algorithms.tebd import RandomUnitaryEvolution

L = 8
spin_half = SpinHalfSite(conserve=None)
psi = MPS.from_product_state([spin_half] * L, ["up", "down"] * (L // 2), bc='finite')
print(psi.chi)  # initial product state: all bond dimensions are 1
TEBD_params = dict(N_steps=1, trunc_params={'chi_max': 100})
evo = RandomUnitaryEvolution(psi, TEBD_params)
evo.run()
print(psi.chi)  # bond dimensions have grown after one layer of random two-site unitaries

def measurement(evo, data):
    """Measure observables and collect them in `data` (created on first call)."""
    keys = ['t', 'entropy', 'Sx', 'Sz', 'corr_XX', 'corr_ZZ', 'trunc_err']
    if data is None:
        data = dict([(k, []) for k in keys])
    data['t'].append(evo.evolved_time)
    data['entropy'].append(evo.psi.entanglement_entropy())
    data['Sx'].append(evo.psi.expectation_value('Sigmax'))
    data['Sz'].append(evo.psi.expectation_value('Sigmaz'))
    data['corr_XX'].append(evo.psi.correlation_function('Sigmax', 'Sigmax'))
    data['corr_ZZ'].append(evo.psi.correlation_function('Sigmaz', 'Sigmaz'))
    data['trunc_err'].append(evo.trunc_err.eps)
    return data

data = measurement(evo, None)  # first call creates the data dict
while evo.evolved_time < 30.:
    evo.run()
    measurement(evo, data)  # data is updated in place

plt.plot(data['t'], np.array(data['entropy']))
plt.xlabel('time $t$')
plt.ylabel('entropy $S$')
Johannes
Site Admin
Posts: 337
Joined: 21 Jul 2018, 12:52
Location: TU Munich

Re: About the entropy growth in random unitary circuit

Post by Johannes »

With the random unitary evolution, you get an ensemble of Haar-distributed states, i.e., each normalized state appears with equal probability. Since there *are* states with less-than-maximal entropy, but no states with more-than-maximal entropy, it is clear that the average entropy over the Haar distribution is less than the maximum; the only question is by how much.
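You can check this directly without TeNPy. Here is a minimal NumPy sketch (illustrative, not from the original post) that samples Haar-random pure states of 4 qubits and averages the half-chain entanglement entropy; the average comes out clearly below the maximum ln(4):

```python
import numpy as np

rng = np.random.default_rng(0)
m = n = 4  # Hilbert space dims of the two halves (2 qubits each)

entropies = []
for _ in range(2000):
    # A Haar-random pure state: complex Gaussian vector, normalized,
    # reshaped into an (m, n) matrix for the bipartition.
    psi = rng.normal(size=(m, n)) + 1j * rng.normal(size=(m, n))
    psi /= np.linalg.norm(psi)
    p = np.linalg.svd(psi, compute_uv=False) ** 2  # Schmidt probabilities
    entropies.append(-np.sum(p * np.log(p)))

avg = np.mean(entropies)
print(f"average half-chain entropy: {avg:.3f}")
print(f"maximal entropy ln(4):      {np.log(4):.3f}")
```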

In 1993, Page calculated (10.1103/PhysRevLett.71.1291) that the average entropy is nevertheless very close to maximal, namely \(S_{Haar}(m,n) = \log(m) - \frac{m}{2n}\) for Hilbert space dimensions \(m \leq n\). For the half-chain entropy, this gives \(S_{Haar}(2^{L/2}, 2^{L/2}) = \frac{L}{2}\log(2) - \frac{1}{2}\).
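For reference, plugging L=8 into Page's formula bond by bond gives values that should land close to the plateaus in your plot (a small illustrative script, using the asymptotic formula above, which is only approximate for such small dimensions):

```python
import numpy as np

L = 8

def page_entropy(m, n):
    """Page's average entanglement entropy for subsystem dims m <= n."""
    return np.log(m) - m / (2 * n)

for bond in range(1, L):
    m = 2 ** min(bond, L - bond)  # smaller subsystem dimension
    n = 2 ** max(bond, L - bond)  # larger subsystem dimension
    print(f"bond {bond}: S_Page = {page_entropy(m, n):.4f}")
```

The half-chain value is \(4\log(2) - 1/2 \approx 2.27\), consistent with the observed 2.24 up to fluctuations of a single circuit realization.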
Yucheng He
Posts: 4
Joined: 20 May 2022, 00:40

Re: About the entropy growth in random unitary circuit

Post by Yucheng He »

Thank you Johannes, that helps a lot! : )